EDVAC was the first stored-program computer to be designed, but it was not the first to run; Eckert and Mauchly left the project, and its construction floundered. The first working von Neumann machine was the Manchester "Baby" or
Small-Scale Experimental Machine, developed by
Frederic C. Williams and
Tom Kilburn at the
University of Manchester in 1948 as a test bed for the
Williams tube;
it was followed in 1949 by the
Manchester Mark 1 computer, a complete system, using Williams tubes and
magnetic drum memory, and introducing
index registers. The other contender for the title "first digital stored-program computer" had been
EDSAC, designed and constructed at the
University of Cambridge. Operational less than one year after the Manchester "Baby", it was also capable of tackling real problems. EDSAC was actually inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer), the successor to ENIAC; these plans were already in place by the time ENIAC was successfully operational. Unlike ENIAC, which used parallel processing, EDVAC used a single processing unit. This design was simpler, offered increased reliability, and was the first to be implemented in each succeeding wave of miniaturization. Some view Manchester Mark 1 / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture. Manchester University's machine became the prototype for the
Ferranti Mark 1. The first Ferranti Mark 1 machine was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.
The first universal programmable computer in the Soviet Union was created by a team of scientists under the direction of
Sergei Alekseyevich Lebedev from the
Kiev Institute of Electrotechnology,
Soviet Union (now
Ukraine). The computer
MESM (
МЭСМ,
Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was
CSIRAC, an Australian design that ran its first test program in 1949. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music.
Commercial computers
In October 1947, the directors of
J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The
LEO I computer became operational in April 1951 and ran the world's first regular routine office computer
job. On 17 November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business
application to go live on a stored-program computer.
In June 1951, the
UNIVAC I (Universal Automatic Computer) was delivered to the
U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than $1 million each ($8.38 million as of 2010). UNIVAC was the first "mass produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was
serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). A key feature of the UNIVAC system was a newly invented type of metal magnetic tape, and a high-speed tape unit, for non-volatile storage. Magnetic media are still used in many computers. In 1952, IBM publicly announced the
IBM 701 Electronic Data Processing Machine, the first in its successful
700/7000 series and its first
IBM mainframe computer. The
IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines. The first implemented high-level general purpose
programming language,
Fortran, was also being developed at IBM for the 704 during 1955 and 1956 and released in early 1957. (Konrad Zuse's 1945 design of the high-level language
Plankalkül was not implemented at that time.) A volunteer
user group, which exists to this day, was founded in 1955 to
share its members' software and experiences with the IBM 701.
IBM introduced a smaller, more affordable computer in 1954 that proved very popular. The
IBM 650 weighed over 900 kg, the attached power supply weighed around 1350 kg, and both were held in separate cabinets of roughly 1.5 meters by 0.9 meters by 1.8 meters. It cost $500,000 ($4.05 million as of 2010) or could be leased for $3,500 a month ($30 thousand as of 2010). Its drum memory was originally 2,000 ten-digit words, later expanded to 4,000 words. Memory limitations such as this were to dominate programming for decades afterward. The program instructions were fetched from the spinning drum as the code ran. Efficient execution using drum memory was provided by a combination of hardware architecture (the instruction format included the address of the next instruction) and software (the Symbolic Optimal Assembly Program, SOAP, assigned instructions to optimal addresses, to the extent possible by static analysis of the source program). Thus many instructions were, when needed, located in the next row of the drum to be read, and no additional wait time for drum rotation was required.
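The placement idea can be illustrated with a short sketch. The following Python snippet is a simplified illustration only: the drum geometry, word time, and timings are assumed round numbers rather than the actual IBM 650 parameters, and the function is not SOAP's real algorithm.

# Simplified illustration of "optimum programming" on a drum machine.
# The geometry and timing figures below are assumptions for the example,
# not the real IBM 650 values.
WORDS_PER_TRACK = 50     # assumed number of word positions on one drum track
WORD_TIME_US = 100       # assumed time, in microseconds, for one word to pass the read head

def best_next_address(current_address, execute_time_us):
    """Pick the drum address that arrives under the read head just as the
    current instruction finishes executing, avoiding an extra revolution."""
    # Word positions the drum advances while the instruction executes (rounded up).
    words_elapsed = -(-execute_time_us // WORD_TIME_US)
    # Place the next instruction that many positions further around the track.
    return (current_address + words_elapsed) % WORDS_PER_TRACK

# Example: an instruction at position 10 that takes about 300 microseconds is
# best followed by the instruction stored at position 13.
print(best_next_address(10, 300))

A real optimizer such as SOAP also had to work around drum addresses that were already occupied, which is why the placement was optimal only "to the extent possible".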
IBM introduced its
first magnetic disk system,
RAMAC (Random Access Method of Accounting and Control), in 1956. Using fifty 24-inch (610 mm) metal disks, with 100 tracks per side, it was able to store 5
megabytes of data at a cost of $10,000 per megabyte ($80 thousand as of 2010).
Second generation: transistors
The bipolar
transistor was invented in 1947. From 1955 onwards transistors replaced
vacuum tubes in computer designs, giving rise to the "second generation" of computers. Initially the only devices available were
germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power. The first
transistorised computer was built at the
University of Manchester and was operational by 1953; a second version was completed there in April 1955. The later machine used 200 transistors and 1,300
solid-state diodes and had a power consumption of 150 watts. However, it still required valves to generate the clock waveforms at 125 kHz and to read and write on the magnetic
drum memory, whereas the
Harwell CADET operated without any valves by using a lower clock frequency of 58 kHz when it became operational in February 1955. Problems with the reliability of early batches of point-contact and alloyed-junction transistors meant that the machine's
mean time between failures was about 90 minutes, but this improved once the more reliable
bipolar junction transistors became available.
Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and
operating cost. Typically, second-generation computers were composed of large numbers of
printed circuit boards such as the
IBM Standard Modular System each carrying one to four
logic gates or
flip-flops.
A second generation computer, the
IBM 1401, captured about one third of the world market. IBM installed more than one hundred thousand 1401s between 1960 and 1964.
Transistorized electronics improved not only the
CPU (Central Processing Unit), but also the
peripheral devices. The
IBM 350 RAMAC was introduced in 1956 and was the world's first disk drive. The second generation
disk data storage units were able to store tens of millions of letters and digits. Next to the
fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack could easily be exchanged with another stack in a few seconds. Even though the removable disks' capacity was smaller than that of fixed disks, their interchangeability guaranteed a nearly unlimited quantity of data close at hand.
Magnetic tape provided archival capability for this data, at a lower cost than disk.
Many second generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled
card reading and punching, the main CPU executed calculations and binary
branch instructions. One
data bus would carry data between the main CPU and core memory at the CPU's
fetch-execute cycle rate, and other data buses would typically serve the peripheral devices. On the
PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second) because most operations took at least two memory cycles: one for the instruction, one for the
operand data fetch.
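Using only the figures quoted above, the arithmetic behind that rate can be checked with a few lines of Python:

# Worked check of the PDP-1 timing figures quoted above.
core_cycle_us = 5            # core memory cycle time, in microseconds
cycles_per_instruction = 2   # one cycle for the instruction, one for the operand fetch

instruction_time_us = core_cycle_us * cycles_per_instruction   # 10 microseconds
ops_per_second = 1_000_000 // instruction_time_us              # 100,000 operations per second
print(instruction_time_us, ops_per_second)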
During the second generation,
remote terminal units (often in the form of
teletype machines like a
Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected
network of networks—the Internet.
Post-1960: third generation and beyond
The explosion in the use of computers began with "third-generation" computers, making use of
Jack St. Clair Kilby's and
Robert Noyce's independent invention of the integrated circuit (or microchip), which later led to the invention of the microprocessor, by
Ted Hoff,
Federico Faggin, and Stanley Mazor at
Intel. One example of such an integrated circuit, the
Intel 8742, is an 8-bit
microcontroller that includes a
CPU running at 12 MHz, 128 bytes of
RAM, 2048 bytes of
EPROM, and
I/O in the same chip.
During the 1960s there was considerable overlap between second and third generation technologies. IBM implemented its
IBM Solid Logic Technology modules in
hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494. The
Burroughs large systems such as the B5000 were
stack machines, which allowed for simpler programming. These
pushdown automata were later also implemented in minicomputers and microprocessors, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business, and universities. It became possible to simulate analog circuits with the
Simulation Program with Integrated Circuit Emphasis, or
SPICE (1971), on minicomputers, one of the programs for electronic design automation (
EDA). The microprocessor led to the development of the
microcomputer, small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond.
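The appeal of the stack-machine organization mentioned above, as in the Burroughs B5000, is that expressions are evaluated in postfix order with no explicit operand addresses. The following Python sketch of a toy stack evaluator is purely illustrative and is not based on any actual Burroughs instruction set.

# Purely illustrative toy stack machine, in the spirit of the stack-machine
# designs discussed above (not the actual Burroughs B5000 instruction set).
def run(program):
    """Evaluate a list of stack-machine instructions and return the result."""
    stack = []
    for op, *args in program:
        if op == "push":        # place a literal value on the stack
            stack.append(args[0])
        elif op == "add":       # replace the top two values with their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":       # replace the top two values with their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (2 + 3) * 4 translates directly into postfix order, with no operand addresses:
print(run([("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]))   # prints 20

This property is one reason stack organization influenced programming language design: compilers for expression-oriented languages could translate arithmetic into such instruction sequences very directly.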
Steve Wozniak, co-founder of
Apple Computer, is sometimes erroneously credited with developing the first mass-market
home computers. However, his first computer, the
Apple I, came out some time after the
MOS Technology KIM-1 and
Altair 8800, and the first Apple computer with graphics and sound capabilities came out well after the
Commodore PET. Computing has evolved with microcomputer architectures, which, with features adopted from their larger brethren, are now dominant in most market segments.
Systems as complicated as computers require very high
reliability. ENIAC remained in continuous operation from 1947 to 1955, eight years, before being shut down. Although a vacuum tube might fail, it would be replaced without bringing down the system. By the simple strategy of never shutting ENIAC down, failures were dramatically reduced.
Hot-pluggable hard disks, like the hot-pluggable vacuum tubes of yesteryear, continue the tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance is made even more stringent when
server farms are the delivery platform. Google has managed this by using fault-tolerant software to recover from hardware failures, and is even working on the concept of replacing entire server farms on-the-fly, during a service event.
In the twenty-first century,
multi-core CPUs became commercially available.
Content-addressable memory (CAM) has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. During the 1980s, CMOS
logic gates developed into devices that could be made as fast as other circuit types; computer power consumption could therefore be decreased dramatically. Unlike the continuous current draw of a gate based on other logic types, a
CMOS gate draws significant current only during the transition between logic states, apart from leakage.
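For comparison with the hardware CAMs mentioned above, the short Python sketch below shows a software associative array being used for lookup by content rather than by address; the networking-style table and names are illustrative assumptions only.

# Illustrative only: a software associative array doing the job a hardware CAM
# performs in a single parallel comparison: lookup by content, not by address.
# The forwarding-table example and names are assumptions for this sketch.
forwarding_table = {
    "10.0.0.0/8":     "eth0",
    "192.168.1.0/24": "eth1",
}

def lookup(prefix):
    """Return the output port stored for a prefix, or None if absent."""
    # A hardware CAM compares the key against every entry at once; a software
    # dictionary reaches the same answer through hashing.
    return forwarding_table.get(prefix)

print(lookup("192.168.1.0/24"))   # prints eth1
print(lookup("172.16.0.0/12"))    # prints None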
An indication of how rapidly this field developed can be inferred from the history of the seminal article: by the time anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC and immediately started implementing their own systems. To this day, the pace of development has continued, worldwide.