Sunday 6 November 2011

Difference Engine: Luddite legacy



AN APOCRYPHAL tale is told about Henry Ford II showing Walter Reuther, the veteran leader of the United Automobile Workers, around a newly automated car plant. “Walter, how are you going to get those robots to pay your union dues?” gibed the boss of Ford Motor Company. Without skipping a beat, Reuther replied, “Henry, how are you going to get them to buy your cars?”



Whether the exchange ever took place is irrelevant. The point was that any increase in productivity required a corresponding increase in the number of consumers capable of buying the product. The original Henry Ford, committed to raising productivity and lowering prices remorselessly, appreciated this profoundly—and insisted on paying his workers twice the going rate, so they could afford to buy his cars.

For the company, there was an added bonus. By offering an unprecedented $5 a day in 1914, he caused the best tool-makers and machinists in America to flock to Ford. The know-how they brought boosted production efficiency still further and made Ford cars ever more affordable. With its ingenious Model T, Ford became the first car company in the world to bring motoring to the masses.

Economists see this as a classic example of how advancing technology, in the form of automation and innovation, increases productivity. This, in turn, causes prices to fall, demand to rise, more workers to be hired, and the economy to grow. Such thinking has been one of the tenets of economics since the early 1800s, when hosiery and lace-makers in Nottingham—inspired by Ned Ludd, a legendary hero of the English proletariat—smashed the mechanical knitting looms being introduced at the time for fear of losing their jobs.

Some did lose their jobs, of course. But if the Luddite Fallacy (as it has become known in development economics) were true, we would all be out of work by now—as a result of the compounding effects of productivity. While technological progress may cause workers with out-dated skills to become redundant, the past two centuries have shown that the idea that increasing productivity leads axiomatically to widespread unemployment is nonsense.

But here is the question: if the pace of technological progress is accelerating faster than ever, as all the evidence indicates it is, why has unemployment remained so stubbornly high—despite the rebound in business profits to record levels? Two-and-a-half years after the Great Recession officially ended, unemployment has remained above 9% in America. That is only one percentage point better than the country’s joblessness three years ago at the depths of the recession.

The modest 80,000 jobs added to the economy in October were not enough to keep up with population growth, let alone re-employ any of the 12.3m Americans made redundant between 2007 and 2009. Even if job creation were miraculously to nearly triple to the monthly average of 208,000 that it was in 2005, it would still take a dozen years to close the yawning employment gap caused by the recent recession, says Laura D’Andrea Tyson, an economist at the University of California, Berkeley, who was chairman of the Council of Economic Advisers during the Clinton administration.
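A rough, back-of-the-envelope version of Dr Tyson’s arithmetic is sketched below. The 12.3m and 208,000 figures come from the paragraph above; the 125,000 jobs a month assumed to be needed merely to absorb new entrants to the labour force is an illustrative guess, not a figure from her analysis.

    # A back-of-the-envelope sketch (Python) of the "dozen years" claim.
    # Assumption: roughly 125,000 new jobs a month are needed just to keep
    # pace with labour-force growth; only the remainder closes the gap.
    jobs_gap = 12_300_000            # Americans made redundant, 2007-09
    monthly_job_creation = 208_000   # the 2005 monthly average cited above
    needed_for_growth = 125_000      # assumed, for illustration only

    net_per_month = monthly_job_creation - needed_for_growth
    months = jobs_gap / net_per_month
    print(f"About {months:.0f} months, or roughly {months / 12:.0f} years")
    # prints: About 148 months, or roughly 12 years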

The conventional explanation for America's current plight is that, at an annualised 2.5% for the most recent quarter (compared with an historical average of 3.3%), the economy is simply not expanding fast enough to put all the people who lost their jobs back to work. Consumer demand, say economists like Dr Tyson, is evidently not there for companies to start hiring again. Clearly, too many chastened Americans are continuing to pay off their debts and save for rainy days, rather than splurging on things they may fancy but can easily manage without.

There is a good deal of truth in that. But it misses a crucial change that economists are loth to accept, though technologists have been concerned about it for several years. This is the disturbing thought that, sluggish business cycles aside, America's current employment woes stem from a precipitous and permanent change caused not by too little technological progress, but by too much. The evidence is irrefutable that computerised automation, networks and artificial intelligence (AI)—including machine-learning, language-translation, and speech- and pattern-recognition software—are beginning to render many jobs simply obsolete.

This is unlike the job destruction and creation that has taken place continuously since the beginning of the Industrial Revolution, as machines gradually replaced the muscle-power of human labourers and horses. Today, automation is having an impact not just on routine work, but on cognitive and even creative tasks as well. A tipping point seems to have been reached, at which AI-based automation threatens to supplant the brain-power of large swathes of middle-income employees.

That makes a huge, disruptive difference. Not only is AI software much cheaper than mechanical automation to install and operate, but there is also a far greater incentive to adopt it—given the significantly higher cost of knowledge workers compared with their blue-collar brothers and sisters in the workshop, on the production line, at the check-out and in the field.

In many ways, the white-collar employees who man the cubicles of business today share the plight of agricultural workers a century ago. In 1900, nearly half of the adult population worked on the land. Thanks to tractors, combine harvesters, crop-picking machines and other forms of mechanisation, agriculture now accounts for little more than 2% of the working population.

Displaced agricultural workers then, though, could migrate from fields to factories and earn higher wages in the process. What is in store for the Dilberts of today? Media theorist Douglas Rushkoff (“Program or Be Programmed” and “Life Inc”) would argue “nothing in particular”. Put bluntly, few new white-collar jobs, as people know them, are going to be created to replace those now being lost—despite the hopes many place in technology, innovation and better education.

The argument against the Luddite Fallacy rests on two assumptions: one is that machines are tools used by workers to increase their productivity; the other is that the majority of workers are capable of becoming machine operators. What happens when these assumptions cease to apply—when machines are smart enough to become workers? In other words, when capital becomes labour. At that point, the Luddite Fallacy looks rather less fallacious.

This is what Jeremy Rifkin, a social critic, was driving at in his book, “The End of Work”, published in 1995. Though not the first to do so, Mr Rifkin argued prophetically that society was entering a new phase—one in which fewer and fewer workers would be needed to produce all the goods and services consumed. “In the years ahead,” he wrote, “more sophisticated software technologies are going to bring civilisation ever closer to a near-workerless world.”

The process has clearly begun. And it is not just white-collar knowledge workers and middle managers who are being automated out of existence. As data-analytics, business-intelligence and decision-making software do a better and cheaper job, even professionals are not immune to the job-destruction trend now underway. Pattern-recognition technologies are making numerous highly paid skills redundant.

Radiologists, who can earn over $300,000 a year in America, after 13 years of college education and internship, are among the first to feel the heat. It is not just that the task of scanning tumour slides and X-ray pictures is being outsourced to Indian laboratories, where the job is done for a tenth of the cost. The real threat is that the latest automated pattern-recognition software can do much of the work for less than a hundredth of it.

Lawyers are in a similar boat now that smart algorithms can search case law, evaluate the issues at hand and summarise the results. Machines have already shown they can perform legal discovery for a fraction of the cost of human professionals—and do so with far greater thoroughness than lawyers and paralegals usually manage.

In 2009, Martin Ford, a software entrepreneur from Silicon Valley, noted in “The Lights in the Tunnel” that new occupations created by technology—web coders, mobile-phone salesmen, wind-turbine technicians and so on—represent a tiny fraction of employment. And while it is true that technology creates jobs, history shows that it can vaporise them pretty quickly, too. “The IT jobs that are now being off-shored and automated are brand new jobs that were largely created in the tech boom of the 1990s,” says Mr Ford.

In his analysis, Mr Ford noted how technology and innovation improve productivity exponentially, while human consumption increases in a more linear fashion. In his view, Luddism was, indeed, a fallacy when productivity improvements were still on the relatively flat, or slowly rising, part of the exponential curve. But after two centuries of technological improvements, productivity has “turned the corner” and is now moving rapidly up the more vertical part of the exponential curve. One implication is that productivity gains are now outstripping consumption by a large margin.
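To see the shape of that exponential-versus-linear argument, a toy numerical sketch is given below; the growth rates are arbitrary assumptions chosen only to illustrate the divergence, not figures taken from Mr Ford’s book.

    # Toy illustration (Python): output per worker compounding at a fixed rate
    # against consumption rising by a fixed amount each year. Both growth
    # figures are assumptions made purely for the sketch.
    productivity = 100.0
    consumption = 100.0
    for year in range(1, 51):
        productivity *= 1.03    # compound ("exponential") growth of 3% a year
        consumption += 2.0      # a fixed ("linear") increment each year
    print(f"Year 50: productivity {productivity:.0f}, consumption {consumption:.0f}")
    # Early on the two series track each other; by year 50 the compounding
    # one has pulled far ahead (about 438 against 200 here).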

Another implication is that technology is no longer creating new jobs at a rate that replaces old ones made obsolete elsewhere in the economy. All told, Mr Ford has identified over 50m jobs in America—nearly 40% of all employment—which, to a greater or lesser extent, could be performed by a piece of software running on a computer. Within a decade, many of them are likely to vanish. “The bar which technology needs to hurdle in order to displace many of us in the workplace,” the author notes, “is much lower than we really imagine.”

In their recent book, “Race Against the Machine”, Erik Brynjolfsson and Andrew McAfee from the Massachusetts Institute of Technology agree with Mr Ford's analysis—namely, that the jobs lost since the Great Recession are unlikely to return. They agree, too, that the brunt of the shake-out will be borne by middle-income knowledge workers, including those in the retail, legal and information industries. But the authors' perspective is from an ivory tower rather than from the hands-on world of creating start-ups in Silicon Valley. Their proposals for reform, while spot on in principle, expect rather a lot from the political system and other vested interests.

Dr Brynjolfsson and Dr McAfee are more sanguine than Mr Ford about the impact smart technology is having on the job market. As they see it, those threatened the most by technology should learn to work with machines, rather than against them. Do that, they suggest, and the shake-out among knowledge workers becomes less of a threat and more of an opportunity.

As an example, they point to the way Amazon and eBay have spurred over 600,000 people to earn their livings by dreaming up products for a world-wide customer base. Likewise, Apple’s App Store and Google’s Android Marketplace have made it easy for those with ideas for doing things with phones to distribute their products globally. Such activities may not create a new wave of billion-dollar businesses, but they can put food on the table for many a family and pay the rent, and perhaps even the college fees.

In the end, the Luddites may still be wrong. But the nature of what constitutes work today—the notion of a full-time job—will have to change dramatically. The things that make people human—the ability to imagine, feel, learn, create, adapt, improvise, have intuition, act spontaneously—are the comparative advantages they have over machines. They are also the skills that machines, no matter how smart, have had the greatest difficulty replicating.

Marina Gorbis of the Institute for the Future, an independent think-tank in Palo Alto, California, believes that, while machines will replace people in any number of tasks, “they will amplify us, enabling us to do things we never dreamed of doing before.” If that new “human-machine partnership” gives people the dignity of work, as well as some means for financial reward, all the better. But for sure, the world is going to be a different place.
