October 25, 2011

In the run-up to the 20th century, the industrial revolution saw the migration of jobs from the fields to the factories. Technology created new industries, opened up new possibilities for entrepreneurs and innovators, introduced new conveniences into the American household and raised the quality of life for vast numbers of working- and middle-class families. But the transition was not without strife: The share of Americans working on farms was cut in half from 1800 to 1900, and in the early, rockier days of the industrial revolution, industrial pioneers and financiers kept a tight grip on their profits, accumulating more wealth in fewer hands than at any other time in American history. The wealth disparity led to labor riots, violence against immigrants and widespread populist anger that engulfed cities throughout the country.

Now, in the infancy of the 21st century, a new revolution is reshaping the American economy, one we might call the “A.I. revolution.” New technologies fueled by advances in artificial intelligence are, once again, forging new industries, giving rise to a new brand of entrepreneur and introducing a new digital age that is transforming the way we learn, think, work and interact with one another. And, just like the last industrial revolution, this one has coincided with a period of considerable economic turmoil, a widening wealth gap and a scarcity of opportunity for working- and middle-class Americans. Eventually, of course, the first industrial revolution opened up new sources of wealth and opportunity that were accessible to Americans of all socio-economic classes. But this revolution, the A.I. revolution, might not be as kind.

A new e-book published Monday by two researchers at the Massachusetts Institute of Technology argues that the rising tide of digital innovation may not lift all boats. The researchers, Erik Brynjolfsson and Andrew McAfee, argue in their book, “Race Against the Machine,” that the artificial intelligence boom has created machines that will replace humans in service industries that have traditionally been considered cornerstones of our economy. Equipped with new capabilities, such as the capacity for natural language, these machines will begin to displace human beings in core economic sectors, such as sales. And it’s unclear what will happen to those displaced workers once they’ve lost their jobs to machines that can do the work of several humans at a much lower cost.

“It may seem paradoxical that faster progress can hurt wages and jobs for millions of people, but we argue that’s what’s been happening,” Brynjolfsson and McAfee write. “Computers are now doing many things that used to be the domain of people only. The pace and scale of this encroachment into human skills is relatively recent and has profound economic implications. Perhaps the most important of these is that while digital progress grows the overall economic pie, it can do so while leaving some people, or even a lot of them, worse off.”

The authors conclude that we are not in the middle of a Great Recession, as many observers have claimed, but a Great Restructuring, in which white-collar jobs considered for decades to be the backbone of our economy — such as marketing, retail and sales — are being redistributed to machines that can do them just as well. What’s especially threatening about this new restructuring is that, unlike the industrial revolution of the early 20th century, it’s less clear what the next step is for the American worker. Although there were decades of labor strife and unequal wealth distribution when factories began replacing farms as the engines of job creation, the industrial revolution ultimately added jobs to the economy. This current restructuring, however, has so far mostly subtracted jobs and left wages stagnant, all while allowing the corporations that control the new technologies to continue maximizing profits. (The Great Restructuring has added mountains of new wealth to the top tiers of American earners while stalling income growth for the working and middle classes. The 400 richest Americans, for example, have as much wealth as the bottom 50 percent of Americans combined.)

The theory of the Great Restructuring is based in large part on the notion that advances in technology are not only growing but that their growth is accelerating (the futurist Ray Kurzweil famously popularized the term “the singularity” to describe the tipping point at which this acceleration would become unpredictable to humans). Watson, the Jeopardy-playing IBM computer, and Siri, Apple’s new personal assistant software, are the most recent examples of this accelerating growth. Just a few years ago, one of the most fundamental and perplexing challenges in computing circles was designing a machine that could understand natural language. The capacity to process language as uttered by humans was considered a potentially impenetrable barrier for computer engineers. In the span of just four years, however, IBM designed a computer that could process natural-language questions and, almost instantaneously, provide answers. That technology is now being tested in even the most advanced sectors of the economy, such as health care, where Watson is being used to suggest medical diagnoses based on the symptoms that doctors report.
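The distinction between growth and accelerating growth is easier to see with numbers than with words. Here is a minimal sketch in Python — with purely illustrative starting values and rates, not figures from the book or from Kurzweil — contrasting a capability that improves by a fixed amount each period with one that doubles each period:

```python
# Illustrative only: compare linear growth with compounding (doubling) growth.
# The starting value, step size and number of periods are hypothetical, chosen
# to show the shape of the curves, not to model any real measure of machine
# capability.

def linear_growth(start, step, periods):
    """Capability that improves by a fixed amount each period."""
    return [start + step * t for t in range(periods + 1)]

def doubling_growth(start, periods):
    """Capability that doubles each period (exponential growth)."""
    return [start * 2 ** t for t in range(periods + 1)]

periods = 10
linear = linear_growth(start=1.0, step=1.0, periods=periods)
doubling = doubling_growth(start=1.0, periods=periods)

for t in range(periods + 1):
    print(f"period {t:2d}: linear = {linear[t]:6.1f}, doubling = {doubling[t]:8.1f}")

# After 10 periods the linear curve has reached 11, while the doubling curve
# has reached 1,024 -- and the gap between them widens every period, which is
# what "accelerating" means in this argument.
```

On this view, each period of progress buys more than the one before it, which is why a barrier that looks impenetrable one year (natural language, say) can fall just a few years later.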

And where Watson focuses on one aspect of artificial intelligence, natural language processing, Siri employs another: voice recognition. The new technology has the potential to fundamentally transform everyday business transactions. For one thing, corporate back-office operations might easily be replaced by software that is nearly indistinguishable from human beings in how it understands and responds to requests. Rick Bookstaber, a financial author and Treasury Department official, wrote recently that Siri might even pass what’s known in computing circles as the “Turing Test,” in which a machine tries to exhibit behavior that a human judge cannot reliably distinguish from a person’s. “If a Turing Test is fashioned to distinguish a computer from a person in the day-to-day tasks of working with a personal assistant,” Bookstaber wrote, “it is only a matter of time before the iPhone becomes indistinguishable from the human.”
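The Turing Test is, at bottom, a simple protocol, and a rough sketch makes Bookstaber’s point concrete. In the Python sketch below, everything is hypothetical scaffolding: `canned_bot_reply` is a stand-in for an assistant like Siri, the “human” is simulated for brevity, and none of it reflects how Siri actually works.

```python
# A minimal, purely illustrative sketch of the Turing Test protocol: a judge
# questions an unseen respondent and must decide whether it is a machine.
# Both respondents here are hypothetical stand-ins.

import random

def canned_bot_reply(question: str) -> str:
    """Stand-in for a software assistant's answer (hypothetical)."""
    replies = {
        "what's the weather like?": "Looks like rain later today.",
        "schedule a meeting for 3pm": "Done -- meeting set for 3:00 pm.",
    }
    return replies.get(question.lower(), "Sorry, could you rephrase that?")

def simulated_human_reply(question: str) -> str:
    """Stand-in for a human personal assistant's answer (hypothetical)."""
    return "Sure, give me a moment and I'll take care of that."

def run_trial(questions):
    """One round of the imitation game: the respondent is secretly either
    the machine or the human, and the judge sees only the transcript."""
    respondent_is_machine = random.choice([True, False])
    answer = canned_bot_reply if respondent_is_machine else simulated_human_reply
    transcript = [(q, answer(q)) for q in questions]
    return respondent_is_machine, transcript

is_machine, transcript = run_trial(["What's the weather like?",
                                    "Schedule a meeting for 3pm"])
for question, reply in transcript:
    print(f"judge: {question}\nrespondent: {reply}\n")

# The machine "passes" when, over many trials, judges guess wrong about as
# often as a coin flip.
print("respondent was a machine:", is_machine)
```

Bookstaber’s wager is that for the narrow transcript of personal-assistant chores — scheduling, reminders, simple questions — the judge’s guesses will soon be no better than chance.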

While most experts agree that rapid technological advances are fundamentally reshaping our culture, not everyone fears the “singularity,” or the point of no return — the moment, according to futurists, at which artificial intelligence will surpass human intelligence, making the consequences of future technological advances unpredictable. Paul Allen, the co-founder of Microsoft, wrote in an article for Technology Review over the weekend that the intricacy and complexity of the human mind, while theoretically quantifiable, only reveal more mysteries as scientists probe deeper, and that mimicking this complexity in robots will be nearly impossible for the foreseeable future. “The closer we look at the brain, the greater the degree of neural variation we find,” Allen and co-author Mark Greaves wrote. “Understanding the neural structure of the human brain is getting harder as we learn more. Put another way, the more we learn, the more we realize there is to know, and the more we have to go back and revise our earlier understandings.”

So, what does this mean for our jobs? Simply put, it means there are still things about the human brain that are unknowable even to us, and certainly to machines. This may not be an obvious economic advantage, but it reveals something about the divide between human and artificial intelligence. Computers will not be able to mimic humans any time soon, which means we’re unlikely to be wholly replaced any time soon.

Nonetheless, machines employing natural language processors, voice recognition software and other tools of artificial intelligence are proliferating, just as textile mills and, later, assembly lines proliferated and fundamentally altered the American economy in the 19th and early 20th centuries. Then, American workers won the race against machines by using advances in technology to usher in a new era of consumerism and mass production. This time, Brynjolfsson and McAfee argue, we must learn to co-exist with machines, rather than race against them. “We can’t win that race, especially as computers continue to become more powerful and capable,” they write. “But we can learn to better race with machines, using them as allies rather than adversaries.”

