The earliest general-purpose electronic computer, ENIAC, was built to automate the calculation of artillery firing tables. It could perform these calculations three orders of magnitude faster than the electro-mechanical machines it replaced. If we had to attach a price to rapid, flawless calculation, it would help to know that this machine cost $487,000 United States dollars (USD) in 1946 (or $6,900,000 USD, adjusted for inflation to 2023).
As the technology improved, the costs of memory and storage fell, and the computer became a viable tool not just for calculation, but for information storage and retrieval. The MEMEX idea set forth in Vannevar Bush's 1945 article "As We May Think" was gradually realized--from word processors to internet search engines--over the following decades. If we had to attach a price to the realization of Bush's MEMEX, one need only look at the stock market, and the trillions of dollars of market capitalization shared among Adobe, Apple, Autodesk, Google, IBM, Microsoft, SAP, et al.
Purpose #3: augmenting human intellect
It would be silly to leave these two abilities in isolation, and on this particular count, industry has not been silly. We have word processors that correct spelling and grammar; image editors that correct colors and lighting; computer-aided design (CAD) systems that automatically calculate forces and check them against material constraints; audio workstations that remove noise and echoes; and so on.
Yet for all the money this industry has harvested from the people, and for all the money it will continue to harvest from them, it is still selling itself short, for having the relevant knowledge in advance of a big decision is priceless.
Niklaus Wirth attributed the saying to Martin Reiser, who wrote in the preface to his book on the Oberon System: "The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness." Others had noted the same trend for some time; indeed, it was becoming obvious as early as 1987.
Wirth identifies two factors behind the acceptance of ever-growing software: "rapidly growing hardware performance" and "customers' ignorance of features that are essential versus nice-to-have". Enhanced user convenience and functionality supposedly justify the increased size of software, but Wirth argues that people increasingly mistake complexity for sophistication, and that "these details are cute but not essential, and they have a hidden cost". He therefore called for the creation of "leaner" software, and pioneered the development of Oberon, a software system built between 1986 and 1989 from the bare hardware up. Its primary goal was to show that software can be developed with a fraction of the memory capacity and processor power usually required, without sacrificing flexibility, functionality, or user convenience.
Moreover, by the year 2000, the three purposes had been competently fulfilled; by 2010, they had been fulfilled several times over, and to a high degree of refinement. Our e-mail transfer protocol (SMTP) has had only minor amendments since its creation in 1981. Microsoft Office file formats in 2025 have not appreciably changed since 2007. Many of the best programmers use a text editor whose core functionality has been with it since its release in 1991. And the Google search engine of 2025 is staggeringly worse at providing relevant results than the Google of 2015.
My family purchased its first personal computer (an IBM PS/1) in 1992. That machine had a 25MHz processor and only two megabytes (MB) of random-access memory (RAM). It could edit and view rich-text documents, and do so with plenty of RAM and processing power left over. On my current computer (2,500MHz processor; 32,768MB of RAM), each tab in my web browser uses somewhere between 25MB and 75MB of RAM to display a single rich-text document.
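To put those figures side by side, here is a back-of-the-envelope sketch (the per-tab range is my own estimate from above, not a controlled benchmark):

```typescript
// Back-of-the-envelope arithmetic using the figures quoted above.
// The per-tab range is an observed estimate, not a measured benchmark.
const ps1TotalRamMB = 2;                 // IBM PS/1 (1992): all the RAM it had
const tabRamMB = { low: 25, high: 75 };  // one modern browser tab, rich text

// How many entire 1992 machines fit inside one browser tab?
console.log(tabRamMB.low / ps1TotalRamMB);   // 12.5
console.log(tabRamMB.high / ps1TotalRamMB);  // 37.5
```

In other words, a single tab consumes the total memory of somewhere between twelve and thirty-eight of those 1992 machines, in order to do roughly what one of them did with room to spare.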
In 2025, despite rising anxieties, real and imagined, most of mankind is living in charmed times. There is no guarantee as to how long such times will continue. War, plague, famine, natural disaster--all may be lurking around the corner. Or it could be more gradual: empires fall. Sometimes centuries pass before lost knowledge is re-acquired.
The modern computer--the one that "modern" software requires--owes its existence to a very tall and very broad pyramid (the "supply chain") of complexities. If some small piece of this pyramid is lost--be it due to disruption of trade, or the disappearance of knowledge (gradual or abrupt)--the world will be reduced to producing a more primitive computer (a "collapse computer"). Say that this collapse computer is as capable as a state-of-the-art computer from the year 2000. How much of our software will still be able to run on it? How much of our knowledge will still be accessible from it?
The software industry has degraded itself. The honorable work of providing useful tools has been superseded by a variety of assaults (dark patterns, information harvesting, "bling") deployed in the name of maintaining revenue. Every time our software "phones home" is a potential crash or freeze when the internet is no longer there. Every button it flashes, every window it animates, every image it hashes, uses resources that our "collapse computer" may not have.
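As a sketch of that failure mode (the endpoint and function are hypothetical, not any real product's code):

```typescript
// A sketch of careless "phoning home". The endpoint is hypothetical.
// fetch() has no default timeout: on a stalled connection this await can
// hang until the operating system gives up, and if the rejection is never
// caught, it surfaces as an unhandled error -- a crash.
async function phoneHome(event: string): Promise<void> {
  await fetch("https://telemetry.example.com/collect", {  // hypothetical URL
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ event, ts: Date.now() }),
  });
  // No AbortController, no catch, no offline check: any code that awaits
  // this call inherits the hang.
}
```

A defensive version would bound the wait and treat the network as strictly optional; a surprising amount of shipped software does neither.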
Worse still, everything has been jammed into the web browser. As I write this, Google Chrome holds around 70% of the web browser market. This is a piece of software that typically uses hundreds of megabytes of RAM for every open tab, and requires tremendous computing resources to build. The collapse computer would be incapable of building it, and would struggle to open a single tab. This is not a competence issue; the Chrome developers are clever people. It comes back to Wirth's warning: "cute but not essential." Six hundred and ninety-three distinct CSS property names and counting! I am not even going to start on its JavaScript API, for fear that there may be no end to it.
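One rough way to gauge the scale for yourself, from any browser's developer console (a sketch; the count varies by engine and covers only the supported longhand properties, so it will not match the spec's 693 exactly):

```typescript
// Paste into a browser's developer console. A computed CSSStyleDeclaration
// enumerates every longhand CSS property the engine supports.
const style = getComputedStyle(document.documentElement);
console.log(style.length);        // several hundred in current engines

// Print the first few property names for a taste.
for (let i = 0; i < 5; i++) {
  console.log(style.item(i));     // e.g. "accent-color", "align-content", ...
}
```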
What a pity it would be if the next dark age were extended by a century so that, for one brief shining moment, we could have rounded corners, and Microsoft could know every time I opened my calculator.