The introduction of modern technology seems, on the face of it, to have done nothing to improve productivity – but we may be speaking too soon. Ben Hammersley finds out what’s really going on
There’s a weird thing in economics, and it’s not showing what people think it’s showing. It’s called the productivity paradox, and it goes like this: despite all these new technologies – personal computers, digital communications, the internet and so on – productivity growth in western economies hasn’t increased. Computers, some are saying, haven’t really proved themselves to be that great an idea, and the productivity figures show it. This is wrong, I think, but for an interesting reason. It’s worth considering why.
Productivity is hard to measure once you move away from activities that form a core part of the money-based economy. That’s already a weird phrase to use – but bear with me. If you’re digging stuff out of the ground, heating it up and then hitting it with hammers, productivity is relatively easy to measure. Taking a thing and turning it into another thing through effort makes it easy to understand how well you’re doing. The more things you have at the end of the day, the more productive you’re being.
Today’s post-industrial economies, though, aren’t about making stuff. At least not directly. The UK, for example, prides itself on being a service economy – something once described as ‘being really good at paperwork’ – and while accountants and lawyers and bankers and the like do add value, that value is much harder to measure in terms of productivity. Attempts to do this with software programmers have even been counter-productive: companies that rewarded their coders for the number of lines of code written per day just ended up with lots of buggy code. Indeed, some of the most productive days I’ve personally spent coding have actually produced fewer lines of code than I started with. This is hard to measure, and harder to explain to bosses who value such things above all others.
So, while modern technologies allow us to measure, quantify and compare large parts of our lives (yesterday I slept seven hours and 32 minutes, and walked precisely 18,231 steps, apparently), the ways we work with technology, and the business models it produces, are increasingly difficult to quantify in ways that compare directly to more traditional jobs.
Rise of the machines
This is only going to get worse, and it has some deep implications. Right now, the biggest story in business technology is the arrival of artificial intelligence and automation technologies. There’s a growing realisation that a good deal of the service industries – those really good at paperwork, remember – can at the very least be augmented, and perhaps mostly replaced, by software. Computers are really good at paperwork. In these scenarios, which we’re already seeing come to pass in banking and the law, the jobs that remain are ones that combine paperwork with a degree of emotional intelligence, empathy, psycho-motor skills and the like that computers just don’t have. We’ll need far fewer book-keepers, but nurses and primary school teachers and other such complex roles are going to be ever more valuable.
This, though, clearly speaks to the productivity paradox. How do you measure the productivity of a primary school teacher in a meaningful way that compares to a coal-miner or widget-hammerer? You can’t. At least not clearly. And so national policy is made from a perceived worry that we’re somehow becoming less effective at work; that somehow modern technologies are reducing our abilities to get stuff done. And that this is hurting us in some way.
But our lived experience is that the opposite is true: modern technologies allow you to get a whole lot more done. We can see this through the jobs that have already been lost. The executive of 40 years ago had a team of typists, secretaries, diary-wranglers and assistants to help them handle their work (pictured below).
Today, those roles are greatly diminished, replaced by applications on the executive’s laptop and phone. This ‘hyperemployment’, as it has been called, is the norm today. People generally do more, even if what they are doing is mostly coordination and paperwork, rather than directly making a thing.
The numbers seem to show that the technological revolution has not resulted in any increase in productivity, and that is, at first glance, somewhat worrying. But it becomes all the more interesting – and problematic – when you realise that the jobs and tasks remaining after so much work has been handed over to AIs are traditionally coded female in western societies.
Those traditionally female roles, of carer, teacher, co-ordinator, negotiator and so on, are the roles that are left when the more directly widget-cranking jobs are taken by the machines. But here’s the thing: to the political orthodoxy, and to the culture at large, hitting stuff with a hammer counts as productivity, but looking after the elderly does not. If I spend the day sending emails, it’s productive. If I spend it at the playground with my daughter, our culture says no.
What the productivity paradox is showing us is not that computers are rubbish, or that technology’s promise is vastly overstated, but that what we measure, and the way we measure it, has real-world implications for how we govern our nations and for what we as a society deem valuable. This should change today. But over the next few decades, as the type of work available to humans changes, we will in any case need to find a new way of discussing and denoting the value of the work we do. Perhaps productivity’s real paradox is that we still consider productivity so valuable at all.
Ben Hammersley is a British internet technologist, journalist, author and broadcaster, based in the US