C SC 100 Lecture Notes
Autumn 2009
Pete Sanderson
The Machine That Changed The World
PBS Video Series
A production of WGBH and the BBC
Topic Summaries & Notes
© 1997-8 by David J. Stucki
Many of the links provided below offer more detailed or in-depth information about the highlighted topic or person. Others may lead you to only loosely related sites that are worth exploring. (Thanks to John A. N. (JAN) Lee for providing links to many of the resources listed below.)
Giant Brains, Episode 1
Introduction
Writing was invented
5000 years ago as a means of expanding our ability to communicate. The computer
may come to rival the written word as a communication medium.
Most machines are special purpose; that is, they are designed to do only one thing. Computers, on the other hand, are universal
machines. They can transform themselves into a variety of other machines
through the use of software. As a result the computer is a uniquely versatile
machine.
This universal character of
computers allows us to describe their functioning as:
- the manipulation of ideas
- the creation of artificial universes
- mind-like
In general, computers perform tasks that in humans require intelligence to accomplish. Originally, computers were invented to provide a fast and effective means of performing arithmetic.
The Human Computer
Prior to World War II the word computer
referred to a person who computed arithmetic tables, such as for logarithms or
numerical constants. For example, William Shanks spent 28 years of his life
calculating the value of π, the ratio of the
circumference of a circle to its diameter, to 707 decimal places. A modern
desktop computer can solve the same task in 7 seconds or less.
The human computer relied primarily on tables
(prepared by other computers) and often made calculation errors, transcription
errors, or propagated errors located in the tables. For example, Shanks made an
error in the 528th decimal place and so the last few years of his
efforts were wasted.
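Shanks carried out his calculation by hand using Machin's arctangent formula, π/4 = 4·arctan(1/5) − arctan(1/239). As a rough illustration of the scale of his task, here is a sketch of the same formula evaluated with Python's arbitrary-precision integers (the function names and guard-digit count are choices made for this example, not anything from Shanks's method):

```python
# Compute pi to many decimal places with Machin's formula,
# pi/4 = 4*arctan(1/5) - arctan(1/239), using scaled integer arithmetic.

def arctan_inv(x: int, digits: int) -> int:
    """arctan(1/x) scaled by 10**(digits+10), via the Gregory series
    1/x - 1/(3x^3) + 1/(5x^5) - ...; the 10 extra guard digits absorb
    truncation error from the floor divisions."""
    scale = 10 ** (digits + 10)
    term = scale // x                  # 1/x, scaled
    total, n, sign = term, 1, 1
    while term:
        term //= x * x                 # next odd power of 1/x
        n += 2
        sign = -sign
        total += sign * term // n
    return total

def pi_digits(digits: int) -> str:
    pi = 4 * (4 * arctan_inv(5, digits) - arctan_inv(239, digits))
    s = str(pi // 10 ** 10)            # drop the guard digits
    return s[0] + "." + s[1:digits + 1]

print(pi_digits(50))
```

Each extra digit Shanks produced required re-deriving every arctangent term by long division and long addition, which is why 707 places took decades while the same series runs in milliseconds today.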
The industrial
revolution was built on numbers. As technology grew it created an increasing
demand for accurate and reliable mathematical tables.
Charles Babbage was a Victorian mathematician who became frustrated by the amount of
wasted time caused by computer errors. He independently commissioned two sets of
tables in one instance to allow them to be checked against one another for
accuracy. (See also The Babbage Pages and the History of Mathematics archive.)
The Difference Engine
Babbage conceived of an engine that would compute by steam or some other means of automation. He designed a machine he called a difference engine, so named because it computed using a technique called the method
of differences. Because of the practical usefulness of this machine, the British
government granted Babbage the financing he needed to build it. The machine,
however, was never completed, partially because he was a poor manager and
partially because he came up with a better idea.
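The method of differences works because the n-th finite differences of a degree-n polynomial are constant, so once a starting column of differences is set, every further table entry comes from cascaded additions alone, with no multiplication. A minimal sketch (the function name is my own; the example polynomial x² + x + 41 is the one Babbage himself reportedly used in demonstrations):

```python
# Tabulate a polynomial by the method of differences: carry a column
# [f(0), first difference, ..., constant n-th difference] and produce
# each new value purely by addition, as the difference engine did.

def difference_table(initial: list[int], count: int) -> list[int]:
    diffs = list(initial)
    values = []
    for _ in range(count):
        values.append(diffs[0])
        # Propagate additions down the difference column.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x^2 + x + 41: f(0) = 41, first difference f(1)-f(0) = 2,
# constant second difference 2.
print(difference_table([41, 2, 2], 5))   # [41, 43, 47, 53, 61]
```

Mechanically, each addition was a train of gear wheels, which is what made the technique practical to automate.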
The difference engine only could compute results by the single
method of differences, and was not, therefore, very versatile. Babbage
envisioned an alternative, general purpose machine that would be
programmable. He named this device the analytical engine. As a means of
encoding the programs, Babbage borrowed the idea of using a series of punched
cards from Jacquard, a French textile maker, who had used them to create looms
that could weave complex patterns.
This was the first machine designed not with a specific purpose or function in mind; rather, its function was left to the user to specify. The concept of
software was born.
The First Programmer
Augusta Ada Byron, Countess of
Lovelace, was the poet Lord Byron's daughter. She became fascinated with
Babbage's analytical engine and is largely regarded as the world's first
computer programmer (although there are some who disagree).
She formed a long-term association with Babbage after translating a paper
written about his work. She spent a considerable amount of time working on the
implications of a programmable machine and what it might be capable of.
A New Era Dawns
During the decade between 1935 and 1945 the definition of the word computer changed dramatically. A common dictionary from before
this period identified a computer as a person who performed calculations. By the
end of World War II the word computer referred to a machine that was used for
the same purpose.
Konrad Zuse
Konrad Zuse picked up where
Babbage left off, implementing several improvements and innovations along the
way. His computer was the first electromechanical computer, whereas Babbage's had been purely mechanical. He also pioneered the idea of using electric relays (common in telephone systems) as simple components, using binary arithmetic to simplify the engineering. This was a considerable improvement over
Babbage's decimal machines. By 1939 Zuse was the world's leading computer
designer. To conserve resources and money during the war, he used discarded
movie film instead of punched cards.
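Binary simplifies the engineering because a relay only needs two reliable states, open and closed, and all arithmetic then reduces to simple switching logic. As a sketch of the idea (not Zuse's actual circuit design), here is a one-bit full adder expressed as boolean operations, chained into a ripple-carry adder:

```python
# Each relay contributes one boolean switch; a full adder combines a few
# of them, and chaining adders bit by bit yields multi-bit addition.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    s = a ^ b ^ carry_in                        # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
    return s, carry_out

def add_binary(x: int, y: int, bits: int = 8) -> int:
    """Ripple-carry addition: one full adder per bit position."""
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_binary(0b1011, 0b0110))   # 11 + 6 = 17
```

A decimal machine like Babbage's needed mechanisms that could distinguish ten positions reliably; a binary relay machine only ever has to distinguish two.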
He
also conceived of the idea of building a completely electronic computer out of
vacuum tubes, but was denied funding by the German High Command because they
thought they would be able to win the war before he could finish building it.
The idea of the vacuum tube machine, however, was a breakthrough, since it could run 1000-2000 times as fast as the mechanical relay machines.
ENIAC and its Peers
The U.S. government was highly motivated
during the war to develop digital, electronic computers to help them calculate
artillery guidance trajectories. They had been utilizing the skill of human
computers, such as Kay Mauchly from the University of Pennsylvania, who
was employed building firing tables. Manually, it took about 4 years for one
human computer to complete a single set of tables.
John Mauchly and J. Presper Eckert, who were at the University of
Pennsylvania's Moore School of Electrical Engineering (founded by Benjamin
Franklin), developed a plan to build an electronic digital computer using vacuum
tube technology. Their design called for 18,000 tubes (which had a failure rate
of one every five seconds), which filled a room 50'x30'. The ENIAC (Electronic
Numerical Integrator and Calculator), in addition to its complexity,
incorporated several innovations. It was built according to a modular design
that allowed it to have built-in fault tolerance (redundancies and backups to
prevent overall system failure). The ENIAC was a technological triumph, but it
was too late for the war effort.
Kay Mauchly became an ENIAC programmer, which involved learning all the circuit
diagrams for the machine, since reprogramming it involved rewiring all its
circuitry. There were no user manuals in those days. Because of the work
involved in reprogramming the ENIAC it spent much of its time sitting
idle.
America's most distinguished mathematician, John von Neumann, joined the ENIAC project toward its end and authored the report
describing a new kind of machine that could store its programs in memory, rather
than in the physical configuration of the machine. This stored program
computer became the model for every modern computer designed and built
since.
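The essence of the stored-program design is that instructions live in the same memory as data, and the machine repeatedly fetches, decodes, and executes them. A toy sketch of that cycle (the three-instruction set here is invented purely for illustration, not any historical machine's):

```python
# A minimal fetch-decode-execute loop over a single shared memory.
# Program and data occupy the same list of cells, which is the key
# departure from ENIAC-style rewiring.

def run(memory: list) -> list:
    acc, pc = 0, 0                      # accumulator, program counter
    while True:
        op, arg = memory[pc]            # fetch and decode
        pc += 1
        if op == "LOAD":                # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":               # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":             # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(mem)[6])   # 2 + 3 -> 5
```

Because the program is just memory contents, changing what the machine does means loading different values rather than rewiring circuitry.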
Due to contractual and patent disagreements, Eckert and Mauchly left the University of Pennsylvania to start the first commercial computer company, where they developed the UNIVAC.
In July 1948, Freddie Williams designed and built the Mark 1, the first working stored-program machine, at the University of Manchester.
Maurice Wilkes designed and built the EDSAC (Electronic Delay Storage Automatic Calculator) with the idea of presenting a more user-friendly interface that would attract the interest of scientists who might not have been competent electrical engineers.
Predicting the Future
The invention of the computer ushered in the birth of entirely new disciplines of science and research. For example, the field of radio-astronomy could not exist without the computer's incredible speed and computational power.
In 1950, most
scientists might have been able to envision the continued innovations and
technological breakthroughs that led to the development of the supercomputer,
but few of them would have been able to predict the way in which personal
computers have become a commodity. In fact, the popularity of the personal
computer is very much a surprise given the ways in which the early computers
were utilized.
There were those,
however, who suspected that computers were far more significant a technology
than simply a fast means of automatic calculation.
Alan Turing
Alan Turing was a British
mathematician who was instrumental in the British military's code-breaking
efforts in World War II. He is also recognized by many as the founder of
computer science. With the publication of two important papers, one on the
limits of computation (what computers can and cannot do) published in 1936 and
the other somewhat related paper on whether machines can be built that are
capable of thought, published in 1950, he laid the theoretical groundwork for
much of the work done in computational theory and artificial
intelligence.
Turing recognized that
there is nothing special about arithmetic that allows it to be automated as
opposed to logic, code-breaking, or any other task that can be described
symbolically in a formal system. He was associated with the Colossus project
where he worked at Bletchley Park, the heart of the British code-breaking
effort, which helped to win the war.
Turing's main interest, though, was in designing a computer for
non-numerical calculations. He designed a machine called the Automatic
Computing Engine (ACE) in 1945 which he intended to use for symbolic
computations, but a working model wasn't constructed until 1950.
In a lecture at his alma mater, Turing described
computation as anything that can be described in symbols, and argued that the behavior of machines manipulating numbers is no different from that of machines manipulating characters. He suggested that computers are able to learn by
experience, and that as a result they don't really do only what they're
told to do. He also explained a kind of game (now called the Turing Test) that would
allow a computer to convince a person that it was intelligent.
Comments, feedback, or inquiries, please contact DStucki@otterbein.edu
Last updated:
Pete Sanderson (PSanderson@otterbein.edu)