In 1982, Time magazine skipped its annual tradition of naming a “Man of the Year” and instead crowned the personal computer “Machine of the Year.” The Apple II had been released only half a decade earlier, and the subsequent introduction of the VisiCalc spreadsheet software in 1979 had, seemingly overnight, convinced the managerial class of the commercial potential of computers. Soon IBM released its own PC, which became both widely copied and extremely popular. The journalist who wrote the Time cover story noted in his article that he had typed his contribution on a typewriter. The following year, the magazine’s newsroom switched to word processing. The productivity revolution in the workplace had begun.
At least, that’s the simple version of the story. A closer look at what happened next – and in the decades that followed – complicates the narrative. We’re used to the idea that new office technology makes us strictly more productive, but the history of workplace tools teaches us that the quest to make common activities more efficient can produce unexpected side effects. This was the case with early PCs, and it probably explains the difficult relationship we have with a more recent office innovation: email.
Soon after the arrival of the PC, experts began to question the miraculous nature of this suddenly ubiquitous device. In 1991, an article in The New York Times quoted an economist who noted that although companies continued to spend heavily on technology, “white collar productivity has stagnated.” He concluded: “CEOs are no longer convinced that throwing computers at their office staff will result in greater efficiency.”
The data supported these concerns. A study covering 1987 to 1993, conducted by the economists Daniel Sichel and Stephen Oliner, estimated that computer technology contributed at most 0.2 percentage points per year to firms’ output growth, after adjusting for inflation – during a period when overall output grew by 1.9 percent per year. A contemporary article summed up these findings bluntly: “The impact of computers on recent productivity growth has been vastly overestimated.”
In his 1997 book Why Things Bite Back, Edward Tenner tackles the “productivity paradox” that surrounded the initial introduction of the PC into the office. He points to several explanations, but perhaps the most interesting concerns the disconnect between what is easy and what is effective. The computer made some common activities more efficient, but it also created more overall work to be done. Instead of tasking an accountant with updating a paper ledger, business owners could now do it themselves using a digital spreadsheet. Taken in isolation, the spreadsheet is easier than the general ledger, but in practice, business owners now had less time for other, potentially more valuable activities. “If computers really allowed a smaller number of people to do the same amount of work,” Tenner notes, “there would be little outcry about the longer hours for middle managers and professionals.” But of course, the opposite is what happened.
Tenner supports the claim that PCs can increase workload by citing the fascinating research of the Georgia Tech economist Peter G. Sassone. In a 1992 article, Sassone reported what he found while studying the impact of new technologies on twenty departments across five large companies. As he documented, many of these departments laid off support staff whom the arrival of time-saving computer software had seemingly rendered unnecessary. (There is no need to maintain a typing pool once you have word processors.) The obvious problem is that the work once done by these staff shifted to the workers they had supported. While these reductions in support staff saved labor costs in the short run, in the long run they required hiring more higher-level employees – at higher wages – to maintain similar levels of output. After analyzing the numbers, Sassone concluded that the introduction of a technology that was supposed to boost productivity ended up costing these companies an extra 15 percent in overall salary expenditure.