Will AI Cause Wage Stagnation or Growth?
Summary: Average wages are typically flat for the first 50 years after a major economic revolution. AI is such a revolution, but the outcome is less certain this time, because of the ways it differs from past cases.
Most fretting about the AI revolution laments mass unemployment. True, roughly half a billion jobs will be eradicated over the next decade. Still, new vocations will emerge to fill the void — predominantly in uncharted sectors we can't yet fathom. AI won't create mass unemployment, but it does necessitate intensive investment in retraining programs to repurpose the coming workforce surplus for roles that require novel skills. Thankfully, AI can immensely accelerate this retraining if we start creating better educational AI pronto.
Permalink for this article: https://www.uxtigers.com/post/wage-stagnation-growth
The looming problem is not unemployment; it’s wages. Historical patterns show that average wages remain static for around 50 years following a revolutionary economic shift. Of course, there's massive economic growth during a revolution — that's the definition. But historically, elites snatch most of the gains while average folks see little change for half a century.
This is demonstrated by the following data about both the Industrial Revolution (starting in Great Britain in 1770) and the Computer Revolution (beginning in the United States in 1970). Both curves are plotted with the starting year for their respective revolution as year 1 and with average wages defined as the index value of 100 in that year. Thus, the two economic revolutions can be easily compared.
After a paradigm shift in the way the economy works, average wages are adjusted for purchasing power and indexed to a value of 100 for the first year of each revolution. The Industrial Revolution is plotted with wages in Great Britain starting in 1770 (source: Robert Allen, 2007), and the Computer Revolution is plotted with wages in the United States starting in 1970 (source: Federal Reserve Bank of St. Louis, 2023).
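The indexing behind the chart is simple arithmetic. Here is a minimal sketch, using made-up wage numbers (not the actual Allen or FRED series), purely to show how rebasing a real-wage series to 100 works:

```python
# Sketch of the indexing used in the chart: real (inflation-adjusted)
# wages are rebased so that year 1 of each revolution equals 100.
# The numbers below are illustrative only, not the actual historical data.

def index_to_base(real_wages):
    """Rebase a series of real wages so the first value becomes 100."""
    base = real_wages[0]
    return [100 * w / base for w in real_wages]

# Hypothetical real wages (constant purchasing power) at years 1, 25, 50, 75, 100
hypothetical = [12.0, 12.1, 12.3, 15.5, 19.2]
indexed = index_to_base(hypothetical)

# A flat first half-century shows up as index values hovering near 100,
# followed by growth: 19.2 / 12.0 = 1.6, i.e., 60% above the base year.
```

Because both series start at 100, the two revolutions can be overlaid on one chart despite being two centuries and an ocean apart.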
During the initial five decades following each revolution, wages remained flat due to the attractiveness of substituting capital for labor through automation to exploit the new opportunities created by the paradigm shift. This new paradigm allowed capital owners to capture an ever-larger share of income while labor's share diminished. Skilled mechanics and engineers earned princely sums since their contribution was indispensable in harnessing the potential of new technologies, but common factory hands saw little change in their modest pay.
Whether by steam or silicon, each economic upheaval first rewarded the elite who stood ready at its helm.
Not until 50 years later do the spoils spread enough for average workers to benefit: by then, they can strongly contribute, usually through better education that readies them for the new normal. (This is why I'm gung-ho on upgrading AI for practical education — so this shift can happen in 10 years this time, not 50.)
A Tale of Two Economic Revolutions: Industrial and Computer
To recap history, the Industrial Revolution marked a seismic shift from agrarian, manual, labor-intensive economies to machine-driven manufacturing. It was driven by technological innovation spurred by the inventiveness of the era, such as James Watt's improvements to the steam engine and Richard Arkwright's water frame. Simultaneously, significant advancements in metallurgy, such as the development of coke smelting, which enabled the large-scale production of iron, were instrumental in accelerating the transition. Great Britain's geopolitical advantages, including extensive coal and iron deposits, further propelled the revolution.
While some of these inventions happened before 1770, I have set that year as the starting year for the true transformation of the British economy.
The Industrial Revolution began with a lot of steam power, a lot of coal, and a lot of pollution. (“1820 Industrial Revolution” by Midjourney.)
Similarly, the Computer Revolution originated in the United States around the mid-20th century. This movement was characterized by the transition from manual and analog processes to digital computation and information processing, ushering in the Information Age. The roots of this revolution lie in innovations such as ENIAC, the electronic digital programmable computer developed during World War II, and the transistor, a Bell Labs invention that drastically reduced the size, cost, and power consumption of computing devices. Concurrent advancements in programming languages and data storage techniques complemented these hardware evolutions.
I defined the start of the computer-driven economic revolution as 1970, when mainframe computers began to be common in big businesses in the United States. The Computer Revolution spread more quickly to other advanced countries than had been the case for the Industrial Revolution, but I am restricting my analysis to the United States to draw a parallel with Great Britain two hundred years earlier.
The Computer Revolution started with huge mainframes in dedicated data centers, managed by a priesthood of systems operators. (“1980 computer center” by Leonardo.AI — I cheated a bit by asking for both a male and female sysadmin, because the profession was more than 90% male in 1970.)
The chart above clearly shows that average wages were essentially flat for 50 years after each revolution, after adjusting for inflation. During the second half-century after the Industrial Revolution, average workers finally started to see wage growth, ending up earning about 60% more in 1870 than their forebears had in 1770. (Remember that these are real earnings, after adjusting for inflation. The curves say nothing about the nominal number of pounds or dollars earned, but this number is also irrelevant unless you want to brag about the size of your bank account. What matters is purchasing power, and that’s where average workers were 60% better off a century after the Industrial Revolution began.)
Will History Repeat for AI?
If you have read some of my previous articles, you will know that I believe that artificial intelligence is a genuinely big deal that comes close to doubling the productivity of knowledge workers, even with the primitive tools we have in 2023. (I like the framing that current AI is the worst we’ll ever experience in the rest of our lives: next-generation products will be much more powerful and thus advance the economy even more.)
Since knowledge work is the only thing that truly matters for advanced economies, we seem to be facing a new economic revolution. Will history repeat itself? Will average incomes stagnate for the next 50 years?
My answer is “no.” AI will cause wage growth, not wage stagnation, for employees who embrace it. Of course, any dinosaurs who refuse to use AI tools will have dramatically reduced income, as they get summarily out-competed by twice-as-productive symbiants (AI-supported workers). But people who reject progress deserve what they get. My concern is with those who get with the program and learn AI now, so they will be twice as productive in 10 years.
It would only be fair if these people experienced wage growth in return for their increased contribution to society. But the world isn’t fair.
I have two reasons based on hardcore arguments, not fluffy appeals to fairness, to expect wage growth even during the early decades of the AI revolution.
Reason 1: AI Is the Second Phase of the Computer Revolution, Not a New Revolution
First, the AI revolution is not a distinct new revolution, which, if following historical patterns, would entail 50 years of wage stagnation. Rather, AI is the second part of the Computer Revolution, and thus represents those second 50 years where we expect to finally see wage increases based on the historical pattern.
Consider our analogy with the Industrial Revolution: it proceeded through water, coal, oil, and nuclear energy as the most potent source to power factories. But the revolution didn’t restart every time factories plugged into a more concentrated energy source. The defining parts of the Industrial Revolution were, first of all, the use of some power source (no matter which one) instead of manual human labor. And second, the ever-increasing efficiency with which this energy was applied to mass manufacturing in ever-better-designed factories.
Similarly, with the Computer Revolution, the key difference is not between mainframes, PCs, mobile, and AI. These different technologies do have different UX implications, but in terms of the economy, they are all ways of applying computation to information problems, as opposed to reliance on human cognitive processing.
Thus, AI merely extends the long trend of increasing information processing, which has now reached the level where immense productivity gains are finally to be had.
I remember a study by my mentor Dr. John D. Gould, of the potential productivity gains from early word processing. Since John worked for IBM, which sold word processors and PCs, I am sure management expected him to identify big productivity gains. However, the actual research findings were different. (To IBM’s credit, they were published: Gould 1981.)
People spent 50% more time writing letters with the word processor than writing similar letters in longhand. And yet the rated quality of the letters was the same. The excess time during word processor use was caused by incessant tweaking of the verbiage that made no real difference to the final letter.
In contrast, when using ChatGPT to write business documents, business professionals in the one controlled research study we have were 59% more productive than their colleagues who wrote their documents without AI support.
In this simple example, the 100-year-long Computer Revolution was in its infancy during John Gould’s study of early word processors, which were detrimental to productivity. We’re now a few years past the midpoint, and computers have reached the point of being major productivity boosters when using AI tools appropriately. Surely many more such gains will come over the next 40 years.
Reason 2: AI Narrows Skills Gaps
AI will increase the overall wealth available by enabling significant productivity gains. Why will average workers get a decent share? Of course, the best workers will always create the most value for society and be paid the most. But AI closes skills gaps and elevates average (or even subpar) workers’ performance. Not as high as the top performers, but close!
Consider again our Industrial Revolution precedents. Before the invention of forklifts, musclebound warehouse workers could move heavier boxes and would be paid more than weaklings:
“Strong warehouse worker” by Bing Create.
But with forklifts, any skilled operator can move much heavier pallets around the warehouse than would have been possible for even the brawniest weightlifter:
“Forklift in a warehouse” by Midjourney.
Thus, forklifts narrow the physical strength gap between people who can lift a lot and a little. Similarly, AI tools are forklifts for the mind and take over much of the heavy “lifting” involved in the cognitive processing of large amounts of information.
The measurement studies of real business performance with and without AI tools show a similar narrowing of the skills gaps between the smartest workers and duller ones. Since long-term compensation trends are driven by productivity, this will likely mean that the people who used to be the weakest knowledge workers will enjoy the biggest monetary gains from AI, at least as a percentage of their old salary.
Of course, the top performers will make the most money. That’s how the world works. But average wages will increase rather than being flat, as one might have feared. Everybody will share in economic progress this time around. (As they also eventually did toward the late 19th century, when the wealth created by the Industrial Revolution started getting shared.)
Both horses will cross the finish line this time, not just the best horse:
“Horse race” by Midjourney.
References
Federal Reserve Bank of St. Louis (2023): “Employed full time: Median usual weekly real earnings,” https://fred.stlouisfed.org/series/LEU0252881600A
John D. Gould (1981): “Composing Letters with Computer-Based Text Editors.” Human Factors vol 23, no 5. http://dx.doi.org/10.1177/001872088102300509
Robert C. Allen (2007): “Pessimism Preserved: Real Wages in the British Industrial Revolution.” Oxford University Department of Economics Working Paper 314. https://ora.ox.ac.uk/objects/uuid:1660bc3a-01c9-4560-9054-f2e57d119e50
More on AI UX
This article is part of a more extensive series I’m writing about the user experience of modern AI tools. Suggested reading order:
AI Vastly Improves Productivity for Business Users and Reduces Skill Gaps
The Articulation Barrier: Prompt-Driven AI UX Hurts Usability
ChatGPT Does Almost as Well as Human UX Researchers in a Case Study of Thematic Analysis
“Prompt Engineering” Showcases Poor Usability of Current Generative AI
UX Experts Misjudge Cost-Benefit from Broad AI Deployment Across the Economy
About the Author
Jakob Nielsen, Ph.D., is a usability pioneer with 40 years of experience in UX. He founded the discount usability movement for fast and cheap iterative design, including heuristic evaluation and the 10 usability heuristics. He formulated the eponymous Jakob’s Law of the Internet User Experience. Named “the king of usability” by Internet Magazine, “the guru of Web page usability” by The New York Times, and “the next best thing to a true time machine” by USA Today. Before starting NN/g, Dr. Nielsen was a Sun Microsystems Distinguished Engineer and a Member of Research Staff at Bell Communications Research, the branch of Bell Labs owned by the Regional Bell Operating Companies. He is the author of 8 books, including Designing Web Usability: The Practice of Simplicity, Usability Engineering, and Multimedia and Hypertext: The Internet and Beyond. Dr. Nielsen holds 79 United States patents, mainly on making the Internet easier to use. He received the Lifetime Achievement Award for Human–Computer Interaction Practice from ACM SIGCHI.
Subscribe to Jakob’s newsletter to get the full text of new articles emailed to you as soon as they are published.