A Brief History of Computers

Posted: Feb 2022

What is the origin of computers?

Through the work of pioneers in the 20th century, we have the smartphones, cloud-based platforms, and internet-enabled devices we rely on to manage our bank accounts and keep our homes at a steady temperature. With the expansion of virtual reality and other innovations, we’d like to look back at where it all started.

Where did the term ‘computer’ come from?

Prior to 1935, the term “computer” referred to a person who carried out calculations, often with the help of a mechanical calculator. Today, the term signifies a machine rather than a person. Essentially, computers were invented to perform calculations faster than any team of humans ever could, and that general idea has remained the same as we seek to manage data and processes.

Groundwork in the 19th century

The earliest groundwork came from machines that read wooden punch cards. The Jacquard loom, introduced in the early 1800s, used punched cards to weave fabric designs automatically without an operator constantly adjusting the device. Charles Babbage began designing his Difference Engine in 1821 to calculate tables of numbers, and his later Analytical Engine borrowed the punched-card idea, although neither machine was completed in his lifetime.

An English mathematician named Ada Lovelace took the next step in 1843 while translating a paper on Babbage’s Analytical Engine from French to English. In her notes to the translation, Lovelace outlined how the machine could compute Bernoulli numbers, and she is credited with producing the first computer program.

In 1890, Herman Hollerith successfully manufactured a tabulating machine that used the punch-card method; the company he founded afterward later became International Business Machines Corporation. Today, IBM is a recognized name in the hardware space.

Technology revolution in the 20th century

The story of the British scientist Alan Turing is an interesting one for those who have never read about the role computers played during the World War II era. In 1936, Turing described a universal machine that could, in principle, carry out any calculation a human could specify, an idea he would later apply to deciphering German military communications in the defense of the UK.

As such, he and other codebreakers worked during World War II to crack intercepted enemy communications with a device called the Turing-Welchman Bombe. Code-breaking remains relevant today: sophisticated encryption now secures our digital communications, work that is every bit as crucial as Turing’s efforts to break the Enigma code in the 1940s.

Meanwhile, in America, John Vincent Atanasoff submitted a grant proposal in 1937 to build a computer that operated solely on electricity rather than on electro-mechanical components that could easily break and damage the machine. From that point on, computers evolved into hulking beasts weighing tons, with iterations developed by researchers at Harvard, Iowa State University, and elsewhere; however, one thing was missing: instructions.

Thus, it was necessary to develop a computational language, and Grace Hopper stepped up to the challenge, writing the first compiler in 1952 and laying the groundwork for COBOL (Common Business-Oriented Language), one of the earliest high-level programming languages. In her posthumous Presidential Medal of Freedom citation, Hopper was named the “First Lady of Software”.

A New Age

There were a number of early computers based on similar designs. The 1101, designed by Engineering Research Associates (ERA) around 1950 and later sold by Remington Rand, was among the first commercially produced computers. However, Britain’s Ferranti Mark I, delivered in 1951, is generally credited as the first commercially available general-purpose computer. Other early models included the UNIVAC I, the LEO I built by J. Lyons & Company, the JOHNNIAC, and IBM’s 650 and 701.

In the 1960s, the US military took an interest in computer technology as a way to improve its own research capabilities, developing the ARPAnet project, short for Advanced Research Projects Agency Network. ARPAnet was the first wide-area computer network, initially restricted to military and academic research use. Its primary goal, among many others, was to transmit information from one computer to another over long distances and streamline computations even further. By this time, nearly half of all computers in use were IBM machines.

Other advancements included the Minuteman I missile, which used a transistorized computer as its guidance system. During this period the Naval Tactical Data System was introduced, using integrated computer systems to display radar, sonar, and communications data.

Consumer Electronics Revolution

Commercial computers were the culmination of other advances in electronics, especially the transistor and the microchip, which had already made possible portable, lightweight radios and smaller television sets. Video games and audio gear followed the same path, so by the time computers became practical from a commercial standpoint, the public was already primed to try new consumer electronics.

The first microprocessor, the Intel 4004, was introduced in 1971, alongside some of the first personal computers, like the Kenbak-1. The commercialization of the laser printer was realized by Xerox through the research of physicist Gary Starkweather. After the personal computer caught on, software engineers and computer scientists worked to design better products.

After successfully writing software for the Altair 8800, Paul Allen and Bill Gates formed their own company, Microsoft. Around the same time, Steve Jobs partnered with Steve Wozniak to design one of the world’s first single-board computers and founded Apple. Other big companies were not prepared for Apple’s early success. In 1981, IBM launched its own personal computer, code-named “Acorn”, based on an Intel 8088 microprocessor.

By the late 1990s, the landscape was set for rapid innovation: the Pentium microprocessor (1993) advanced graphics and music on PCs, Google began as a search project (1996), and Wi-Fi arrived (1999).

The Future Of Computers

Today we are experiencing a rise in virtual reality, integrated application development, and other technologies that manage many aspects of our day-to-day lives. The power of computers has roughly doubled every 18 months since the 1960s.
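To make that doubling rate concrete, here is a rough back-of-the-envelope sketch in Python; the 18-month period and the 1965–2022 endpoints are assumptions chosen for illustration, not figures tied to any particular benchmark.

```python
# Back-of-the-envelope illustration of an assumed 18-month doubling
# in computing power (a Moore's-law-style estimate, not measured data).

def growth_factor(years: float, doubling_months: float = 18.0) -> float:
    """Return how many times capability multiplies over `years`."""
    doublings = (years * 12.0) / doubling_months
    return 2.0 ** doublings

# Assumed endpoints for illustration only: 1965 through 2022.
years_elapsed = 2022 - 1965
print(f"Implied growth over {years_elapsed} years: {growth_factor(years_elapsed):,.0f}x")
# Prints roughly 274,877,906,944x -- about a 275-billion-fold increase.
```

Even under these rough assumptions, the implied growth runs into the hundreds of billions, which is why each decade of computing looks so different from the last.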

We already have rudimentary artificial intelligence in social media algorithms, and virtual reality headsets are bringing the concept of the “metaverse” closer. Smart mirrors seemed like science fiction just ten short years ago, and now they are a reality. Further ahead, there is anticipation of advances in biomolecular technology, quantum computing, and artificial intelligence.

The integration of these technologies will continue to shape the future of ever-evolving devices and infrastructure.

1949 CSIRAC

Originally known as CSIR Mk1, the CSIRAC (Commonwealth Scientific and Industrial Research Automatic Computer) is the oldest surviving first-generation electronic computer.

1961 MUDPAC

Made by Applied Dynamics of Ann Arbor, Michigan, the MUDPAC cost $70,000 and was delivered to Melbourne University in 1961, where technicians used it to solve engineering problems until 1986.

1977 Apple II

Introduced by Steve Jobs and Steve Wozniak at the 1977 West Coast Computer Faire, the Apple II is one of the world’s first mass-produced microcomputers.

1981 Sinclair ZX81

Produced by Sinclair Research, the ZX81 was manufactured in Dundee, Scotland, by Timex Corporation.


Connect with us today for all of your outsourced IT needs