HISTORY OF COMPUTERS WITH TIMELINES

This page is regularly updated to reflect significant developments in computer history. You can use the navigation above to jump quickly to any part of the timeline.






Computers became a truly revolutionary invention during the latter part of the 20th century. They have already helped win wars, solve seemingly insoluble problems, save lives and launch us into space. Their history spans more than 2,500 years, back to the creation of the abacus.

The difference between the abacus and a modern-day computer is massive: the abacus needs a human operator, whereas a computer doesn’t need constant input because it counts using binary code. The two do share one thing, however: both are used to make repeated calculations. This resource explores the in-depth history of computers and how far we’ve come since the first computer.

THE ABACUS


The abacus was the earliest aid for mathematical computation, originating in the Middle East around 500 BC; it remained the fastest device for performing calculations until the middle of the 17th century. A modern abacus, still sometimes used in the Far East, consists of rings sliding over rods; the original versions, however, used pebbles. The Latin word for pebble is calculus, which is where the term calculator originated.

the abacus was the first computer
A modern-day abacus being used by a child. Image source: Adobe Stock

THE PASCALINE


In 1642, an eighteen-year-old French scientist, writer and philosopher, Blaise Pascal (1623–1662), invented the very first mechanical calculator, which was named the Pascaline. It was created as an aid for Pascal’s father, who was a tax collector. Pascal built fifty of these gear-driven calculating machines but couldn’t manage to sell many: they were expensive, they could only add and subtract, and they weren’t entirely accurate.

the pascaline or pascal's computer
A Pascaline, signed by Pascal in 1652. Image source: Wikipedia


THE BINARY CODE


A number of decades later (sometime in 1685), the German philosopher and mathematician Gottfried Wilhelm Leibniz (1646–1716) designed a machine similar to Pascal’s but far more advanced. Instead of cogs it used a stepped drum, and besides adding and subtracting it could also multiply, divide and even work out square roots. Leibniz is also remembered for his contribution to computing as the inventor of binary code. He never put binary code to practical use himself, but it prompted other scientists and inventors to think about the ways in which it could be applied. Later, Leibniz’s binary code of ones and zeros would be used in modern computers, becoming the form in which they make their calculations.
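As a quick illustration of Leibniz’s idea, here is a minimal sketch (in Python, chosen purely for illustration) showing how any whole number can be written with just ones and zeros, and how arithmetic carries over to that form:

    # Convert a non-negative whole number to its base-2 (binary) form.
    def to_binary(n: int) -> str:
        if n == 0:
            return "0"
        bits = []
        while n > 0:
            bits.append(str(n % 2))   # the remainder is the next binary digit
            n //= 2
        return "".join(reversed(bits))

    a, b = 13, 27
    print(to_binary(a))        # 1101
    print(to_binary(b))        # 11011
    print(to_binary(a + b))    # 101000, i.e. 40 in decimal

The same repeated-division idea works by hand, which is why a code of ones and zeros was attractive long before electronic hardware existed.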


BOOLEAN LOGIC


In 1854, an Englishman named George Boole (1815–1864) made a revolutionary discovery: a radically simple concept now known as Boolean logic. There isn’t a piece of digital technology around today that doesn’t rely on it. Combined with binary code, Boolean logic allows computers to make simple decisions by comparing ones and zeros. It’s essentially an on/off switch: an on is a one and an off is a zero, and that is all you need. Boole was also one of the first people to propose that the way we think follows the rules of logic.
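For a concrete sense of what “comparing ones and zeros” means, here is a minimal sketch (again in Python, purely for illustration) of the three basic Boolean operations, AND, OR and NOT, over the two values 0 (“off”) and 1 (“on”):

    # Print the truth tables for AND, OR and NOT over the values 0 and 1.
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  AND={a & b}  OR={a | b}  NOT a={1 - a}")

Every decision a digital circuit makes, however complicated, can be built up from combinations of these three operations.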


THE ANALYTICAL ENGINE


The central figures of 19th-century computing were the mathematicians and computer pioneers Charles Babbage (1791–1871) and Ada Lovelace (1815–1852), daughter of the British poet Lord Byron. Charles Babbage is said to have pioneered the modern computer age with his Difference Engines, which mechanized arithmetic, and his Analytical Engine, which was designed for general automatic computation. However, his designs had limited influence on succeeding generations. Babbage, together with his close friend Ada Lovelace, produced a clear commentary on the potential of the Analytical Engine; it was an introduction to what we now call programming.

THE BEGINNING OF WHAT WE NOW KNOW AS IBM


Toward the latter half of the 19th century, other inventors envisioned bigger and better calculating engines and had more success than Babbage. One such man was the American inventor Herman Hollerith (1860–1929), who developed an electromechanical machine named the Tabulator to help compile the census.

By the 1880s, the population of the United States had grown so much that tabulating the census data by hand was taking as long as seven and a half years. It was feared that if this growth continued, one census would not be finished before the next one began. The Hollerith Tabulator was a roaring success: the machine completed the basic count in only six weeks and the full analysis in only three years, saving the United States government five million dollars.

In 1896 Hollerith founded his own company, the Tabulating Machine Company. A few years later the name changed to the Computing-Tabulating-Recording Company, and in 1924 to its present name, IBM, which stands for International Business Machines.



THE DIFFERENTIAL ANALYZER


As IBM was taking off, a young scientist and graduate student at the Massachusetts Institute of Technology, Vannevar Bush (1890–1974), already the inventor of a new surveyor’s tool, earned his doctorate in electrical engineering in a single year. It was the start of an astonishing career: Bush played a role in the rise of radio, the building of the atomic bomb and the beginning of the digital age.

While teaching at MIT, he kept up his inventing. Bush refined the S-tube (which he had not invented himself); it turned the new technology of home radio into a simple plug-in device. Later, in the 1930s, Bush developed a room-filling machine called the Differential Analyzer, a mechanical computer that Bush described as representing the ability to think straight. It was an outstanding machine, but it wasn’t to be the only key player in the history of computing.

Woman using the differential analyzer
A differential analyzer at the NACA Lewis Flight Propulsion Laboratory, 1951, as seen on Wikipedia

THE THEORY OF DIGITAL COMMUNICATION AND STORAGE

A significant milestone in the history of computers


In 1948 Claude Shannon (1916–2001) set a challenge for engineers and paved the way for the compact disc, the fax machine and MP3 files with the basic theory of digital communication and storage. Shannon called it Information Theory. Over the following fifty years, engineers met that challenge and built out the rest of digital technology. The whole idea of digitizing things, along with the fact that you can store them, download them and upload them, comes from Shannon’s landmark work. Shannon also contributed to the early development of digital circuits, computers, cryptography and genetics. Today, not many people know of Claude Shannon, but he is by far one of the most influential figures of the 20th century.
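To give a flavour of what Information Theory measures, here is a minimal sketch (in Python, for illustration only) of Shannon entropy, the average number of bits per symbol needed to encode a message; the message text is just a made-up example:

    # Shannon entropy: average bits per symbol, based on how often
    # each symbol occurs in the message.
    import math
    from collections import Counter

    def entropy_bits_per_symbol(message: str) -> float:
        counts = Counter(message)
        total = len(message)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(f"{entropy_bits_per_symbol('hello world'):.2f} bits per symbol")

The lower the entropy, the further a message can be compressed without losing information.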



THE IMAGINED TURING MACHINE MODEL


Mathematics in its purest form provides solutions to many problems. One particular mathematician, Alan Turing (1912–1954), was perhaps one of the greatest mathematical minds of the 20th century; he was also a cryptographer and a pioneer of computer science. Turing is known today for his part in breaking the German Enigma code during World War II, and by then he was already well established as a mathematician of extraordinary ability.

Turing took on a groundbreaking thesis: the concept of a hypothetical machine that would read symbols on a strip of tape, writing, rewriting or erasing them according to a finite set of rules. At the time, Turing described a person performing these operations as the “computer”.

When this machine was given a problem to compute, it would either stop and give you the answer, or carry on forever if no answer existed. Turing proved mathematically that, in general, you can never know in advance whether or when the machine will stop. He produced definitive examples of undecidable problems, showing that some areas of mathematics will always remain beyond complete certainty. The imagined Turing machine model went on to become one of the most influential mathematical abstractions in computer science.
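The idea is easier to see in miniature. Below is a minimal sketch (in Python, with a hypothetical rule table invented for illustration) of a Turing-style machine: a tape of symbols, a read/write head, and a finite table of rules. This toy machine simply flips every 0 to 1 and every 1 to 0, then halts when it reaches a blank cell:

    # A toy Turing-style machine: a tape, a head position, a current state,
    # and a rule table mapping (state, symbol) -> (next state, write, move).
    def run_turing_machine(tape, rules, state="start", halt="halt"):
        tape = list(tape)
        head = 0
        while state != halt:
            symbol = tape[head] if head < len(tape) else "_"   # "_" means blank
            state, write, move = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape)

    # Hypothetical rules: flip 0s and 1s, halt on the first blank cell.
    invert_rules = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }

    print(run_turing_machine("0110", invert_rules))   # prints 1001_

Turing’s insight was that a single machine of this kind, given the right rule table, can carry out any computation at all, and that no general procedure can predict whether it will ever halt.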

At the age of 16, Alan Turing
Alan Turing at age 16. Image source: Wikipedia


THE HARVARD MARK I

In August 1944 IBM introduced the Automatic Sequence Controlled Calculator, also known as the ASCC, the largest electromechanical calculator in the world, designed by Howard Aiken (1900–1973). It was a general-purpose computer that was used during the latter half of World War II. Aiken had presented the concept to IBM in November 1937, and Thomas Watson Sr. approved the project and its funding in February 1939.

It was officially presented to Harvard University on the 7th of August 1944, and the staff at the university soon came to know it as the Harvard Mark I. The Harvard Mark I was an enormous machine: it was 51 feet long and used 500 miles of wire, with well over 3 million connections and thousands of switches.

The computer programmers of the Harvard Mark I were Richard Milton Bloch (1921–2000), Robert Campbell and Grace Murray Hopper (1906–1992). The Harvard Mark I worked at such high speed that it was used to make mathematical computations in support of the Manhattan Project’s atomic research. After 15 years in service, the Harvard Mark I was officially retired and taken offline in 1959. Some portions of the Harvard Mark I still remain today and can be found at Harvard University.

input and output details of the Harvard Mark I
Input/output and control readers closeup. Image source: Wikipedia.
Left end of the Harvard Mark I
"The left end consisted of electromechanical computing components." Image source: Wikipedia.
Right end of the Harvard Mark I
"The right end included data and program readers, and automatic typewriters." Image source: Wikipedia.

STEVE WOZNIAK & STEVE JOBS: The Start of Apple

Today’s leading inventors, engineers and entrepreneurial designers are known all around the world for their contributions to the personal home computer. They have launched some amazing products and, with businesses worth billions of dollars, they have given us technology that the likes of Charles Babbage could only have dreamt of. Among the leaders in the world of computing are Steve Wozniak (born August 11, 1950), Steve Jobs (1955–2011) and Bill Gates (born October 28, 1955).

Steve Wozniak started out in computing from very humble beginnings. He had absolutely no money, and his friend Steve Jobs was helping him along the way. Wozniak had the idea to build a PC board, and Jobs persuaded him to start a company; that was the premise on which they co-founded the Apple Computer Corporation. Then they got lucky when a local store wanted to buy their computers, fully assembled with all the parts.

The store offered to pay them $50,000 for an order of their computers, and after that they were in business. However, this was a short-term product, and they knew they needed something better if they were to make a real impact on the world of computing. The Apple I lacked basic features that we take for granted today, such as a keyboard and monitor. The Apple II, complete with all those features, was the product they knew would change the computing world, and they didn’t want to simply give it away, so it became their next project.

Wozniak and Jobs’s success with the Apple computer shocked IBM, which absolutely dominated the industry at the time. A few years later IBM came up with a product to rival its opposition: the IBM Personal Computer, known for short as the PC and based on an Intel 8088 microprocessor.

Steve Jobs co-founder of Apple
Image source: Wikipedia
Steve Wozniak co-founder of Apple
Image source: Wikipedia

Meanwhile, a young Bill Gates had started his company Microsoft, which was to become one of the greatest computing companies in the world. Microsoft remained his primary focus for more than three decades, until he made the transition to full-time work at the Bill & Melinda Gates Foundation. When Gates first got involved with computers he was still in high school, eager and driven.

This was a special time: computers were extremely expensive to purchase, so Bill, along with his friend Paul Allen, used university computers at night. They were both fascinated by what the technology could do. They eventually realized that the computer was moving onto a chip, the microprocessor that Intel was beginning to make, which would in turn make computers much cheaper than the ones they had been using at the university.

This would make computers more powerful and available to more people on a personal level. The big moment came when Gates decided to write software for other companies, and companies ended up coming to him for it. Microsoft famously won the contract to produce IBM’s operating system, and in effect IBM licensed the software from Bill Gates’s company rather than buying it outright. This was before the graphical user interface, when there was still just text on the screen. That software, MS-DOS, was a critical piece of the story.

Bill Gates and the history of computers
Image source: Future of Humanity Institute, University of Oxford.


However, IBM didn’t see the value in the software; it thought the hardware was the key and the software merely a necessity. Microsoft, on the other hand, realized that over time the software would become far more important than the hardware. Had IBM realized that at the beginning, Bill Gates would have been given a very different deal.

Research Sources: History of Computers