Evolution of Computing:
The history of computers is always an interesting topic for computer science students, as it shows how computing has transformed over time.
Here is the complete history of computers, presented step by step as a series of topics.
Vacuum Tube – 1904:
A vacuum tube is just that: a glass tube surrounding a vacuum (an area from which all
gases have been removed). What makes it interesting is that when electrical contacts are
put on the ends, you can get a current to flow through that vacuum.
A British scientist named John A. Fleming made a vacuum tube known today as a diode.
At the time, the diode was known as a “valve.”
Turing Machine – 1936:
One of the most important inventions in the field of computer science, the Turing machine, was introduced by Alan Turing in 1936. Turing machines are one of the key abstractions
used in modern computability theory, the study of what computers can and cannot do. A
Turing machine is a particularly simple kind of computer, one whose operations are
limited to reading and writing symbols on a tape, or moving along the tape to the left or
right. The tape is marked off into squares, each of which can be filled with at most one
symbol. At any given point in its operation, the Turing machine can only read or write
on one of these squares, the square located directly below its “read/write” head.
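To make the idea concrete, here is a minimal sketch of a Turing machine simulator in Python. The transition-table encoding, the state names, and the example machine (which flips the bits of a binary string) are illustrative assumptions, not part of Turing's original formulation.

    # Minimal Turing machine simulator (illustrative sketch).
    # A machine is a transition table:
    #   (state, symbol read) -> (symbol to write, move, next state)
    def run_turing_machine(transitions, tape, start_state, halt_state, blank="_"):
        squares = dict(enumerate(tape))   # sparse tape: square index -> symbol
        state, head = start_state, 0
        while state != halt_state:
            symbol = squares.get(head, blank)               # read the square under the head
            new_symbol, move, state = transitions[(state, symbol)]
            squares[head] = new_symbol                      # write the square
            head += 1 if move == "R" else -1                # move one square left or right
        return "".join(squares[i] for i in sorted(squares))

    # Example machine: flip every bit of a binary string, halting at the first blank.
    flip = {
        ("scan", "0"): ("1", "R", "scan"),
        ("scan", "1"): ("0", "R", "scan"),
        ("scan", "_"): ("_", "R", "halt"),
    }
    print(run_turing_machine(flip, "1011", "scan", "halt"))  # prints 0100_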
The “Turing Test”
In 1950, Alan Turing (Turing, 1950) proposed a test to determine whether a machine has
the ability to think. This method is known as the Turing Test.
The test is conducted with two people and a machine. One person plays the role of an
interrogator and is in a separate room from the machine and the other person. The
interrogator only knows the person and machine as A and B. The interrogator does not
know which is the person and which is the machine.
Using a teletype, the interrogator can ask A and B any question he or she wishes. The aim
of the interrogator is to determine which is the person and which is the machine, while the
aim of the machine is to fool the interrogator into thinking that it is a person. If the
machine succeeds, then we can conclude that machines can think.
ABC – 1939:
The Atanasoff-Berry Computer was the world’s first electronic digital computer. It was invented by John Vincent Atanasoff and Clifford Berry at Iowa State University during 1937–
42. It incorporated several major innovations in computing, including the use of binary
arithmetic, regenerative memory, parallel processing, and the separation of memory and
computing functions.
Harvard Mark 1 – 1943:
Howard Aiken and Grace Hopper designed the MARK series of computers at Harvard
University. The MARK series of computers began with the Mark I in 1944. Imagine a
giant roomful of noisy, clicking metal parts, 55 feet long and 8 feet high. The 5-ton
device contained almost 760,000 separate pieces. Used by the US Navy for gunnery and
ballistic calculations, the Mark I was in operation until 1959.
The computer, controlled by pre-punched paper tape, could carry out addition,
subtraction, multiplication, division, and reference to previous results. It had special
subroutines for logarithms and trigonometric functions and used 23 decimal place
numbers. Data was stored and counted mechanically using 3000 decimal storage wheels,
1400 rotary dial switches, and 500 miles of wire. Its electromagnetic relays classified the
machine as a relay computer. All output was displayed on an electric typewriter. By
today’s standards, the Mark I was slow, requiring 3–5 seconds for a multiplication operation.
ENIAC – 1946:
ENIAC I (Electronic Numerical Integrator And Computer). The U.S. military sponsored
the research because it needed a device that could calculate artillery-firing tables (the
settings used for different weapons under varied conditions for target accuracy).
John Mauchly was the chief consultant and J Presper Eckert was the chief engineer.
Eckert was a graduate student studying at the Moore School when he met John Mauchly
in 1943. It took the team about one year to design the ENIAC and 18 months and
500,000 tax dollars to build it.
The ENIAC contained 17,468 vacuum tubes, along with 70,000 resistors and 10,000 capacitors.
Transistor – 1947:
The first transistor was invented at Bell Laboratories on December 16, 1947 by John
Bardeen and Walter Brattain, working in William Shockley’s group. This was perhaps the
most important electronics event of the 20th century, as
it later made possible the integrated circuit and microprocessor that are the basis of
modern electronics. Prior to the transistor (from TRANSfer reSISTOR), the only device
that could perform its current-regulation and switching functions was the vacuum tube,
which could be miniaturized only to a certain extent and wasted a lot of energy in the
form of heat. Compared to vacuum tubes, the transistor offered lower power consumption.
UNIVAC 1 – 1951:
The first commercially successful electronic computer, the UNIVAC I, was
also the first general purpose computer – designed to handle both numeric and textual
information. It was designed by J. Presper Eckert and John Mauchly. The
implementation of this machine marked the real beginning of the computer era.
Compiler – 1952:
Compilers are sometimes called a programmer’s first love. Grace Murray Hopper, an employee of Remington Rand, worked on the UNIVAC. She
took up the concept of reusable software in her 1952 paper entitled “The Education of a
Computer” and developed the first software that could translate symbols of higher-level
computer languages into machine language: the compiler.
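To illustrate what such a translation involves, here is a minimal sketch in Python. The toy input language (arithmetic with + and *) and the three-instruction stack machine (PUSH/ADD/MUL) are invented here purely for illustration; they are not Hopper's design.

    # Minimal sketch of a compiler: translate symbols of a higher-level
    # language (arithmetic expressions) into lower-level instructions
    # for an invented stack machine. Handles only + and *.
    import ast

    def compile_expr(source):
        code = []
        def emit(node):
            if isinstance(node, ast.BinOp):
                emit(node.left)                   # compile both operands first,
                emit(node.right)
                code.append("ADD" if isinstance(node.op, ast.Add) else "MUL")  # then the operator
            elif isinstance(node, ast.Constant):
                code.append(f"PUSH {node.value}")  # literals become PUSH instructions
        emit(ast.parse(source, mode="eval").body)
        return code

    print(compile_expr("1 + 2 * 3"))
    # prints ['PUSH 1', 'PUSH 2', 'PUSH 3', 'MUL', 'ADD']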
ARPANET – 1969:
The Advanced Research Projects Agency was formed with an emphasis on research, and
thus was not oriented only toward military products. The formation of this
agency was part of the U.S. reaction to the then Soviet Union’s launch of Sputnik in
1957. (ARPA draft, III-6). ARPA was assigned to research how to utilize their
investment in computers via Command and Control Research (CCR). Dr. J.C.R.
Licklider was chosen to head this effort.
Developed for the US DoD’s Advanced Research Projects Agency, the ARPANET
eventually connected some 60,000 computers for communication among research
organizations and universities.
Intel 4004 – 1971:
The 4004 was the world’s first universal microprocessor. In the late 1960s, many
scientists had discussed the possibility of a computer on a chip, but nearly everyone felt
that integrated circuit technology was not yet ready to support such a chip. Intel’s Ted
Hoff felt differently; he was the first person to recognize that the new silicon-gate MOS
technology might make a single-chip CPU (central processing unit) possible.
Hoff and the Intel team developed such an architecture with just over 2,300 transistors in
an area of only 3 by 4 millimeters. With its 4-bit CPU, command register, decoder,
decoding control, control monitoring of machine commands and interim register, the
4004 was one heck of a little invention. Today’s 64-bit microprocessors are still based on
similar designs, and the microprocessor is still the most complex mass-produced product
ever with more than 5.5 million transistors performing hundreds of millions of
calculations each second – numbers that are sure to be outdated fast.
Altair 8800 – 1975:
By 1975 the market for the personal computer was demanding a product that did not
require an electrical engineering background and thus the first mass produced and
marketed personal computer (available either as a kit or assembled) was welcomed with
open arms. Developers Edward Roberts, William Yates and Jim Bybee spent 1973-1974
to develop the MITS (Micro Instrumentation and Telemetry Systems) Altair 8800. It was priced at
$375 and contained 256 bytes of memory (not 256K), but had no keyboard, no display, and
no auxiliary storage device. Later, Bill Gates and Paul Allen wrote their first product for
the Altair — a BASIC interpreter. (The Altair itself was named after a planet in a Star Trek episode.)
Cray 1 – 1976:
It looked like no other computer before, or for that matter, since. The Cray 1 was the
world’s first “supercomputer,” a machine that leapfrogged existing technology when it
was introduced in 1976.
And back then, you couldn’t just order up fast processors from Intel. “There weren’t any
microprocessors,” says Gwen Bell of The Computer Museum History Center. “These
individual integrated circuits that are on the board performed different functions.”
Each Cray 1, like the one preserved at The Computer Museum History Center, took months to
build. The hundreds of boards and thousands of wires had to fit just right. “It was really
a hand-crafted machine,” adds Bell. “You think of all these wires as a kind of mess, but
each one has a precise length.”
IBM PC – 1981:
On August 12, 1981, IBM released its new computer, the IBM PC. The
“PC” stood for “personal computer,” making IBM responsible for popularizing the term.
The first IBM PC ran on a 4.77 MHz Intel 8088 microprocessor. The PC came equipped
with 16 kilobytes of memory, expandable to 256K, and came with one or two 160K
floppy disk drives and an optional color monitor. The price tag started at $1,565,
which would be nearly $4,000 today.
Apple Macintosh – 1984:
Apple introduced the Macintosh to the nation on January 22, 1984. The original
Macintosh had 128 kilobytes of RAM, although this first model was simply called
“Macintosh” until the 512K model came out in September 1984. The Macintosh retailed
for $2,495. It wasn’t until the Macintosh that the general population really became aware
of the mouse-driven graphical user interface.