The rapid development of computing forces users into buying faster and newer computers and throwing away their old but still perfectly functional machines. The history of computing is very short, and older computers do not (yet) have any historical value. That is why, in a period of only 50 years, thousands upon thousands of computers have ended up in landfills – computers that are today, because of the rapid technological development, looked upon as old and useless. But are they really useless? They may not be able to run the latest computer games, access the internet or render 3D graphics, but even today they can still perform the tasks for which they were designed.
We can divide the history of computing roughly into the era of historical, mechanical and electromechanical machines, the era of the first electronic computing machines, and the time after 1950, when the invention of the transistor and, later, the chip caused a revolution in the computing industry.
Historical calculating “machines”
It may sound interesting that people counted even before they “invented” numbers – adding and subtracting were done with fingers, stones and sticks of wood. The evolution of mathematics brought with it machines and tools that made complex mathematical operations easier to perform. Who has not heard of the abacus, a calculation tool made of a wooden frame with beads on wires? There is no accurate data about the first abacus-like devices, but educated guesses suggest they must have emerged around 2500 BC. You can read more about the abacus here or on Wikipedia.
Mechanical calculating machines
Mechanical calculating machines were built from cogs, gears, wheels and other clockwork, and were driven by an external power source, e.g. a hand crank or a steam engine. The most famous inventor of such machines is Charles Babbage (1791–1871). His calculating machines were way ahead of their time, since there was no real need for such machines in Babbage’s day. In 1991, scientists at the Science Museum in London constructed Difference Engine No. 2 from Babbage’s original plans. It is still up and running today and works completely without error. There is more on Babbage and his work at www.charlesbabbage.net/.
Electromechanical machines
The advent of electromechanics at the beginning of the 20th century enabled electric motors to drive mechanical machines. Punched cards emerged (Babbage’s notes already mention cards that would play roughly the same role as today’s programs – they would determine the time and type of operation to be applied to specific data) and became very popular because of the rising need to process large quantities of information, such as the US census. In 1935, IBM installed punched-card equipment to support the administration of the U.S. Social Security Act, which required the creation and maintenance of employment records on 26 million Americans. This landmark government contract was called “the biggest accounting operation of all time”; it brought IBM further contracts with the government and substantially increased the size of the company.
The next big step was the electromechanical machine Z3, built by the German engineer Konrad Zuse (1910–1996). The Z3, finished in 1941, was the first program-controlled computer. You can read more about the Z3 here.
The age of electronic computers – the beginnings
Mechanical relays, which took a certain amount of time to flip, were replaced by electronic valves. The first computers based on electronic valves were built around 1940. In Great Britain, the Colossus machines were built to read encrypted German messages during World War II. Alan Turing, famous for his theoretical computing model – the Turing machine – worked on codebreaking at Bletchley Park, where Colossus was used, though the machine itself was chiefly designed by the engineer Tommy Flowers. The Colossus project was kept secret by the British government for over 30 years. You can find more about the Colossus project here.
It was, however, the University of Pennsylvania in the USA that was the most successful in this field of computing. In 1943, they began constructing the ENIAC, and later the EDVAC, which was already designed to hold its program in memory.
The first practical computer with a stored program was the British EDSAC. As much as the idea of a stored program goes without saying today, when John von Neumann described it in 1945 it was pretty much unheard of. A computer that would control its own actions, whose complete behaviour humans would not be able to follow? The idea – despite being frowned upon at first – stuck, and today all computers with stored programs are called von Neumann computers, which makes practically all computers in operation today von Neumann computers. Two computers remain to be mentioned: the UNIVAC I was the first American computer for commercial use, and the IAS machine was the computer whose design von Neumann himself directed.
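To make the stored-program idea concrete, here is a minimal toy sketch in Python. The three-instruction machine is invented purely for illustration (it does not correspond to EDSAC, the IAS machine or any real instruction set); the point is simply that the program sits in the same memory as the data it works on:

    # Toy illustration of the stored-program ("von Neumann") idea.
    # The instruction set is made up for this sketch, not historical:
    # instructions and data share one memory, and the machine fetches,
    # decodes and executes instructions one by one.
    memory = [
        ("LOAD", 7),   # address 0: load the value at address 7
        ("ADD", 8),    # address 1: add the value at address 8
        ("STORE", 9),  # address 2: write the accumulator to address 9
        ("HALT", 0),   # address 3: stop
        None, None, None,
        5,             # address 7: data
        37,            # address 8: data
        0,             # address 9: the result goes here
    ]

    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, addr = memory[pc]      # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            break

    print(memory[9])  # prints 42 – the program itself was just memory contents

Because the program is itself data in memory, a program could in principle inspect or even rewrite itself – exactly the property that seemed so unsettling in 1945.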
Development after the invention of transistors
After 1950, unreliable electronic valves started being replaced by transistors, which were smaller and more reliable. In 1971, when microprocessors hit the market, the computing industry boomed and started thinking about the so-called “home computer” or “personal computer” (PC) – a computer that would be available at a reasonable price to everyone. The first PC was the Altair 8800.
Various books, internet sites and other sources often sort computers built after 1940 into “computer generations”. The division is not very strict, and the dates vary from source to source, but the general sorting into computer generations is still worth mentioning. A more detailed overview of important events is available on Wikipedia.
The first generation (mid-1940s to late 1950s)
The first generation of computers used electronic valve circuits, cathode-ray-tube memory modules and magnetic drums for storage. They were programmed in machine language and were able to solve only one problem at a time. UNIVAC, ENIAC and IAS are typical representatives of the first generation of computers.
The second generation (1960s)
Despite the fact that Bell Labs had invented the transistor as early as 1947, transistors only started replacing electronic valves at the beginning of the 1960s. They were smaller, faster and cheaper than electronic valves, so computer sizes could be reduced. Computers previously taking up whole rooms (ENIAC, for example, measured 30 × 3 × 1 metres!) were made considerably smaller. Programmers started using assembly languages, and high-level programming languages emerged, e.g. COBOL and FORTRAN. The second generation also saw the debut of magnetic disks and ferrite-core memory. Representatives of the second generation are e.g. the IBM 7090 and the IBM 1401.
The third generation (1970s)
The emergence of integrated circuits joined several transistors on a single silicon chip. These semiconductor devices drastically improved the speed and efficiency of computers. Even programming did not survive unscathed: it had to adapt to such things as virtual memory and multiprocessing. In 1971, Intel, contracted by a Japanese calculator maker, produced a microprocessor – a central processing unit on a single chip. Low prices made such computers available to a wide audience, and in 1975 the aforementioned Altair 8800 hit the market as the first home computer. In 1976, Steve Jobs and Steve Wozniak founded Apple and started selling their first computer, bearing the highly original name Apple I. Other representatives of the third generation include the IBM 370 and the PDP-11.
The fourth generation (1980 – today)
It was 1981 when IBM introduced its first IBM PC; Apple followed with the Apple Lisa in 1983 and the Macintosh series in 1984, while computer firms bloomed. All important events of that age are described on Wikipedia.
The fifth generation (? – ?)
Some were talking about the fifth generation as early as 1982, while others claim that the fifth generation is still in development – that it means artificial intelligence and organic computers. Speculation about the future of computers has been the source of many fictional stories (D. F. Jones – Colossus; Arthur C. Clarke – 2001: A Space Odyssey, etc.) and movies (Colossus: The Forbin Project (1970), 2001: A Space Odyssey (1968), Demon Seed (1977), Westworld (1973)). These films usually depict computers as evil tyrants and are looked upon with a smirk today. We’ll see what the future brings when it does, so let’s return to the history of computers.
Did you know?
The first programmer was not a man but a woman: Ada Byron, Lady Lovelace (1815–1852), the daughter of Lord Byron. For Charles Babbage she translated an article by the Italian mathematician Luigi Menabrea and added to it a detailed description of a method for calculating Bernoulli numbers, which is even today regarded as the first program.
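For the curious, here is a minimal sketch of such a computation in Python, using the standard recurrence for Bernoulli numbers. The code and names are ours, not Lovelace’s – her Note G expressed an analogous step-by-step procedure as a table of operations for Babbage’s Analytical Engine:

    # Compute Bernoulli numbers B_0..B_n exactly, via the classical
    # recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 (for m >= 1),
    # solved for B_m at each step. A modern sketch, not a transcription
    # of Lovelace's actual table of operations.
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        B = [Fraction(0)] * (n + 1)
        B[0] = Fraction(1)
        for m in range(1, n + 1):
            acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
            B[m] = -acc / (m + 1)
        return B

    print([str(b) for b in bernoulli(8)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']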
Sir Clive Sinclair is the founder of Sinclair Research Ltd, the company that produced the most popular and successful British computer of all time – the ZX Spectrum. Despite his advanced age (65 years), he still invents – more or less successfully – various things. You can read more on this here.
The computer mouse wasn’t invented by Apple when it introduced the Apple Lisa in 1983 – it already existed in the 1960s. On 9 December 1968, Douglas C. Engelbart, together with 17 co-workers, presented a whole array of computer innovations that would only see the light of day years later. Besides the mouse, they also showed hypertext, e-mail and much more. There is a 90-minute video of this event available on the internet.
In 2001, Edwin Black published a book entitled “IBM and the Holocaust”, accusing IBM of having collaborated with the Nazi leadership of World War II Germany. Links: http://www.nizkor.org/hweb/people/b/black-edwin/ibm-and-the-holocaust.html, http://en.wikipedia.org/wiki/Edwin_Black, http://www.wsws.org/articles/2001/jun2001/ibm-j27.shtml, http://www-03.ibm.com/press/us/en/pressrelease/828.wss.
Tetris was first written in 1984 as a game intended for testing whether computers performed all right. Its creator, Alexey Pajitnov, failed for years to make any fortune out of it, despite the unbelievable popularity of the game, which is today available for almost any existing computer system and game console.
In 1982, the Commodore company introduced one of the most successful home computers of that time, the Commodore 64. The company itself evolved from a small typewriter production and repair business.
Text-to-speech programs have been available since the early 1980s on home computers such as the Commodore 64, Atari, Apple II and Apple Lisa. One of them, called S.A.M. (Software Automatic Mouth), was produced by the company SoftVoice. You can listen to a sample of it running on a Commodore 64 on this link (mp3).
The famous and widely popular game Pac-Man was, in its original version, called Puck-Man (from the Japanese ‘paku-paku’), but the Americans feared the initial ‘P’ would quickly be vandalised into an ‘F’ and renamed the game Pac-Man. The follow-up to the original Pac-Man was written by MIT students and was at first to be titled Pac-Woman; after objections to that title, it was released as Ms. Pac-Man.
The beginning of the 80s saw the BBC air educational programmes on computers, since the small but amazingly capable boxes of electronics still scared some people. The BBC even commissioned an educational computer, the BBC Micro. Most of the stock was either sold or donated to schools.
In 1984, Apple introduced the Macintosh series with the famous TV ad sporting the slogan “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984’.” ‘1984’ of course refers to the famous book by George Orwell.
A museum of computer history?
It has already been mentioned that computer history is just a small drop in the great ocean of world history. But this is precisely the problem when dealing with computer history: these 50 years have zoomed past us in an instant, and the development has been lightning fast. For the sake of comparison, let’s take some other machine of the 20th century – the washing machine, for example. Even if you bought it 10, 20 or even 40 years ago (and it still works today), you don’t really feel any need to replace it. You still have the same electricity and use basically the same washing powder as 40 years ago, so the machine still performs the tasks it was designed to do.
With computers, it’s different. If you bought it 10, 20 or 40 years ago (and it still works today), it still uses the same electricity… and that’s about it. Newer software and hardware won’t work with it. You want to play the latest first-person shooter? Buy a new computer. Want to produce professional-looking documents? Buy a laser printer and a word processor that can stomach more than one typeface, font size and color. The computer, despite the fact that it can still be used for the purpose it was intended to, becomes ‘obsolete’ very quickly.
It is interesting to note that a lot of people who never used an ‘older’ computer are convinced that these computers are of no use whatsoever and belong on the scrapheap. They may not be able to cope with the latest, cutting-edge computer equipment, but on the other hand, how many of the computers produced today will still be working in 25 years’ time? It is sad that curiosities such as the Atari 1040ST from the 80s are being forgotten, as if digitization and computer editing of video had been invented only a couple of years ago.
This is why so many interesting pieces for a museum of computer history have accumulated in such a short time. Sadly, there have been few if any similar projects in Slovenia so far. The Technical Museum of Slovenia in Bistra lacks computing machines altogether, and there are only a few such museums in Europe, such as the Deutsches Museum in Munich, the Science Museum in London and the Heinz Nixdorf museum in Paderborn.