March 23, 2014

Google’s algorithms are a closely guarded secret

One of the most interesting announcements in last week’s Budget – well, for me at least, as someone who has no savings and doesn’t play bingo or drink much – was the new Alan Turing Institute: £220 million of government support will be invested into “big data and algorithm” research.

Today, especially with the processing power of modern computing and enormous “big data” analysis, algorithms increasingly influence the media we read, the products we buy, and with whom we socialise. But most of us have no idea what they are. At its core, an algorithm is just a step-by-step procedure that must be followed to solve a problem. (The word “algorithm” itself is derived from the name of the ninth-century Persian mathematician Al-Khwārizmī, but the concept goes all the way back to the Greeks.)

Algorithms are vital to the internet because they help to order and arrange vast volumes of data at a scale and speed impossible for a human. Google’s famous PageRank algorithm counts the number of links to a page and assesses their quality to determine how important a website is. The quality and quantity of websites’ links to each other are compared and ordered; the more important websites are displayed first on the Google search page when a search query is entered.
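The idea can be sketched in a few lines of code. This is only a toy illustration of the principle the column describes – real PageRank runs over billions of pages, and the four-page “web” and damping factor below are invented for the example:

```python
# Toy PageRank: each page shares its score out along its outgoing links,
# so pages that attract links from important pages become important themselves.
# The damping factor models a reader occasionally jumping to a random page.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # a page passes an equal share of its score to each page it links to
                new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# A made-up four-page web: A links to B and C, and so on.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
ranks = pagerank(links)
# Page C, which everyone links to, comes out on top.
print(sorted(ranks, key=ranks.get, reverse=True))
```

Page C is linked to by three of the four pages, so after a few dozen iterations it ends up with the highest score – exactly the “more links from good places means more important” intuition.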

Facebook’s feed prioritisation algorithm calculates which posts a user sees on his or her news feed based on how close the user and the creator of a post are, how valuable the content is judged to be (with photos determined to be the most worthwhile, and plain text the least so), and how long a post has been up. Nothing is left to chance: by weighing up these three factors, the algorithm decides what you see – and Facebook updates and improves the formula regularly. And you’ll all be familiar with Amazon’s algorithms that constantly propose annoyingly accurate recommendations of books you might also like to buy, based on what others have bought.
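The three factors above can be sketched as a simple weighted score. To be clear, this is not Facebook’s actual formula – the weights, decay rate and field names are all invented to show the shape of the idea:

```python
# A hedged sketch of weighted feed ranking: closeness to the author and
# richer content push a post up; age pushes it down. All numbers are
# illustrative assumptions, not Facebook's real parameters.
import math

CONTENT_WEIGHT = {"photo": 1.0, "link": 0.7, "text": 0.4}  # photos valued most

def feed_score(affinity, content_type, hours_old):
    """Combine the three factors into a single ranking score."""
    weight = CONTENT_WEIGHT.get(content_type, 0.4)
    decay = math.exp(-0.1 * hours_old)  # older posts fade from the feed
    return affinity * weight * decay

posts = [
    {"id": 1, "affinity": 0.9, "type": "text", "age": 2},   # close friend, plain text
    {"id": 2, "affinity": 0.5, "type": "photo", "age": 1},  # acquaintance, fresh photo
    {"id": 3, "affinity": 0.8, "type": "photo", "age": 30}, # good friend, day-old photo
]
ranked = sorted(
    posts,
    key=lambda p: feed_score(p["affinity"], p["type"], p["age"]),
    reverse=True,
)
print([p["id"] for p in ranked])
```

Even with these made-up weights, the trade-offs are visible: a fresh photo from an acquaintance can outrank plain text from a close friend, and a day-old post sinks regardless of who wrote it.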

All are based on proprietary algorithms. Without its clever algorithm, Google is nothing. Better algorithms are at the core of Amazon’s future plans too. On Christmas Eve 2013, they patented something called a “method and system for anticipatory package shipping”: an algorithm-based system that could potentially ship products before consumers place an order for them. Algorithms that make the best sense of data can earn companies billions. That’s why they are as closely guarded as the recipe for Coca-Cola.

There are remarkable social and economic benefits from being able to process data far more efficiently than humans can. Some commentators believe that algorithm-led data analysis of NHS records could result in huge improvements in treatment outcomes, as we’d get a much better understanding of what works and what doesn’t. According to the CEBR, the use of big data analytics across the healthcare sector could deliver additional revenues of £14bn from 2012 to 2017. Others have estimated it could cut the chronic care bill by £80 billion. Already, at the University of California San Francisco’s new medical centre, an algorithmically operated robot runs a fully automated hospital pharmacy. The dispensing room is sterile and secure, and the algorithm, having prepared hundreds of thousands of prescriptions, is yet to make a mistake.

However, life by algorithm brings worries too. Evgeny Morozov thinks all of this algorithmic analysis of personal information on a massive scale amounts to an incremental erosion of privacy by private companies and the state, and could even end up in Minority Report-style preventive policing. Others, like Filter Bubble author Eli Pariser, believe that by trying to perfectly predict our preferences with its search function, Google limits our opportunities for serendipity, discovery, exploration – we end up reading and watching the same things, having our horizons slowly narrowed. Around half of all trading on the world’s markets is today performed by computer algorithms, which are capable of incredible feats of financial mathematics at speeds measured in milliseconds: but errors are also magnified. In May 2010, American stock markets trended down and then crashed in the second largest swing in the history of the Dow Jones Index. Subsequent analysis showed this “Flash Crash” – from which the markets quickly recovered – was caused by the aggressive activity of algorithmic traders.

Algorithms rule the world, and we still don’t really know much about them. Some of this is extremely promising. But there are also serious political, cultural, economic and social consequences when these clever calculations hold so much sway over what information we encounter and how. As Turing put it in his 1950 masterpiece Computing Machinery and Intelligence: “We can only see a short distance ahead, but we can see plenty there that needs to be done.” A bit more research into the human side wouldn’t go amiss. Fortunately, I know just the place.


