HISTORY OF COMPUTERS


The history of computing hardware covers the developments from early simple devices to aid calculation to modern-day computers.

"Who invented the computer?" is not a question with a simple answer. The real answer is that many inventors contributed to the history of computers, and that a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention.

This series covers many of the major milestones in computer history (but not all of them), with a concentration on the history of personal home computers.

Before the 20th century, most calculations were done by humans. Early mechanical tools to help humans with digital calculations were called "calculating machines", referred to by proprietary names, or, as they are now, calculators. The machine operator was called the computer.

The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result. Later, computers represented numbers in a continuous form, for instance distance along a scale, rotation of a shaft, or a voltage. Numbers could also be represented in the form of digits, automatically manipulated by a mechanical mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results. The invention of the transistor, and then the integrated circuit, was a breakthrough in computers. As a result, digital computers largely replaced analog computers. The price of computers gradually became so low that first personal computers and later mobile computers (smartphones and tablets) became ubiquitous.

Ancient era

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record-keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers. The use of counting rods is one example.

The abacus was used early on for arithmetic tasks. What we now call the Roman abacus was used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.

Several analog computers were constructed in ancient and medieval times to perform astronomical calculations. These include the Antikythera mechanism and the astrolabe from ancient Greece (c. 150–100 BC), which are generally regarded as the earliest known mechanical analog computers.[3] Hero of Alexandria (c. 10–70 AD) made many complex mechanical devices including automata and a programmable cart. Other early versions of mechanical devices used to perform one or another type of calculation include the planisphere and other mechanical computing devices invented by Abu Rayhan al-Biruni (c. AD 1000); the equatorium and universal latitude-independent astrolabe by Abu Ishaq Ibrahim al-Zarqali (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (c. AD 1090) during the Song Dynasty.

Medieval calculating tools

A set of John Napier's calculating tables from around 1680.

Scottish mathematician and physicist John Napier discovered that the multiplication and division of numbers could be performed by the addition and subtraction, respectively, of the logarithms of those numbers. While producing the first logarithmic tables, Napier needed to perform many tedious multiplications. It was at this point that he designed his "Napier's bones", an abacus-like device that greatly simplified calculations that involved multiplication and division.
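Napier's insight can be checked numerically. The sketch below is purely illustrative (the function names are ours, not Napier's): multiplication and division reduce to addition and subtraction of logarithms.

```python
import math

def multiply_via_logs(a, b):
    # log(a*b) = log(a) + log(b), so a*b = exp(log a + log b)
    return math.exp(math.log(a) + math.log(b))

def divide_via_logs(a, b):
    # log(a/b) = log(a) - log(b), so a/b = exp(log a - log b)
    return math.exp(math.log(a) - math.log(b))
```

A human computer with a table of logarithms did exactly this: look up two logarithms, add them, and look up the antilogarithm of the sum.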

A slide rule


Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620s, shortly after Napier's work, to allow multiplication and division operations to be carried out significantly faster than was previously possible.[6] Edmund Gunter built a calculating device with a single logarithmic scale at the University of Oxford. His device greatly simplified arithmetic calculations, including multiplication and division. William Oughtred greatly improved this in 1630 with his circular slide rule. He followed this up with the modern slide rule in 1632, essentially a combination of two Gunter rules, held together with the hands. Slide rules were used by generations of engineers and other mathematically involved professional workers, until the invention of the pocket calculator.

Mechanical calculators

Wilhelm Schickard, a German polymath, designed a calculating machine in 1623 which combined a mechanized form of Napier's rods with the world's first mechanical adding machine built into the base. Because it made use of a single-tooth gear, there were circumstances in which its carry mechanism would jam. A fire destroyed at least one of the machines in 1624, and it is believed Schickard was too disheartened to build another.


View through the back of Pascal's calculator. Pascal invented his machine in 1642.

In 1642, while still a teenager, Blaise Pascal started some pioneering work on calculating machines, and after three years of effort and 50 prototypes he invented a mechanical calculator. He built twenty of these machines (called Pascal's Calculator or Pascaline) in the following ten years. Nine Pascalines have survived, most of which are on display in European museums. A continuing debate exists over whether Schickard or Pascal should be regarded as the "inventor of the mechanical calculator", and the range of issues to be considered is discussed elsewhere.

Gottfried Wilhelm von Leibniz invented the Stepped Reckoner and his famous stepped drum mechanism around 1672. He attempted to create a machine that could be used not only for addition and subtraction but would utilise a moveable carriage to enable long multiplication and division. Leibniz once said "It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used." However, Leibniz did not incorporate a fully successful carry mechanism. Leibniz also described the binary numeral system, a central ingredient of all modern computers. However, up to the 1940s, many subsequent designs (including Charles Babbage's machines of 1822 and even ENIAC of 1945) were based on the decimal system.
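The binary system Leibniz described represents any number with only the digits 0 and 1, by repeated division by two. A small illustrative sketch (the function name is ours):

```python
def to_binary(n):
    # Collect remainders of repeated division by 2,
    # most significant bit first.
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits
        n //= 2
    return bits
```

For example, thirteen is 8 + 4 + 1, so its binary form is 1101 — exactly the representation later adopted by binary machines such as Zuse's.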

Around 1820, Charles Xavier Thomas de Colmar created what would over the rest of the century become the first successful, mass-produced mechanical calculator, the Thomas Arithmometer. It could be used to add and subtract, and with a moveable carriage the operator could also multiply, and divide by a process of long multiplication and long division. It utilised a stepped drum similar in conception to that invented by Leibniz. Mechanical calculators remained in use until the 1970s.

Punched card data processing

In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark achievement in programmability. His machine was an improvement over similar weaving looms. Punch cards were preceded by punch bands, as in the machine proposed by Basile Bouchon. These bands would inspire information recording for automatic pianos and, more recently, numerical control machine tools.

IBM punched card Accounting Machines, pictured in 1936.


In the late 1880s, the American Herman Hollerith invented data storage on punched cards that could then be read by a machine.[19] To process these punched cards he invented the tabulator and the key punch machine. His machines used mechanical relays (and solenoids) to increment mechanical counters. Hollerith's method was used in the 1890 United States Census, and the completed results were "... finished months ahead of schedule and far under budget". Indeed, the census was processed years faster than the prior census had been. Hollerith's company eventually became the core of IBM.

By 1920, electro-mechanical tabulating machines could add, subtract and print accumulated totals. Machines were programmed by inserting dozens of wire jumpers into removable control panels. When the United States instituted Social Security in 1935, IBM punched card systems were used to process records of 26 million workers. Punch cards became ubiquitous in industry and government for accounting and administration.

Leslie Comrie's articles on punched card methods and W. J. Eckert's publication of Punched Card Methods in Scientific Computation in 1940 described punch card techniques sufficiently advanced to solve some differential equations or perform multiplication and division using floating point representations, all on punched cards and unit record machines. Such machines were used during World War II for cryptographic statistical processing, as well as a vast number of administrative uses. The Astronomical Computing Bureau, Columbia University, performed astronomical calculations representing the state of the art in computing.


Calculators

The Curta calculator could also do multiplication and division.

By the 20th century, earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the representation for the state of a variable. The word "computer" was a job title assigned to people who used these calculators to perform mathematical calculations. By the 1920s, British scientist Lewis Fry Richardson's interest in weather prediction led him to propose human computers and numerical analysis to model the weather; to this day, the most powerful computers on Earth are needed to adequately model its weather using the Navier–Stokes equations.[26]

Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930s that could add, subtract, multiply and divide.[27] In 1948, the Curta was introduced by Austrian inventor Curt Herzstark. It was a small, hand-cranked mechanical calculator and, as such, a descendant of Gottfried Leibniz's Stepped Reckoner and Thomas's Arithmometer.

The world's first all-electronic desktop calculator was the British Bell Punch ANITA, released in 1961. It used vacuum tubes, cold-cathode tubes and Dekatrons in its circuits, with 12 cold-cathode "Nixie" tubes for its display. The ANITA sold well since it was the only electronic desktop calculator available, and was silent and quick. The tube technology was superseded in June 1963 by the U.S.-manufactured Friden EC-130, which had an all-transistor design, a stack of four 13-digit numbers displayed on a 5-inch (13 cm) CRT, and introduced reverse Polish notation (RPN).
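Reverse Polish notation, which the EC-130 brought to desktop calculators, places operators after their operands, so an expression can be evaluated with nothing more than a stack — much like the EC-130's stack of four numbers. A minimal illustrative sketch (the token format is our assumption):

```python
def eval_rpn(tokens):
    # Operands are pushed; each operator pops two operands
    # and pushes the result back onto the stack.
    stack = []
    for tok in tokens:
        if tok in ("+", "-", "*", "/"):
            b = stack.pop()
            a = stack.pop()
            stack.append({"+": a + b, "-": a - b,
                          "*": a * b, "/": a / b}[tok])
        else:
            stack.append(float(tok))
    return stack.pop()
```

For example, the infix expression (3 + 4) * 2 becomes "3 4 + 2 *" in RPN, and no parentheses are ever needed.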

First general-purpose computing device

Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary difference engine, designed to aid in navigational calculations, in 1833 he realized that a much more general design, an Analytical Engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic.

The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[31][32]

There was to be a store, or memory, capable of holding 1,000 numbers of 40 decimal digits each (ca. 16.7 kB). An arithmetical unit, called the "mill", would be able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially it was conceived as a difference engine curved back upon itself, in a generally circular layout, with the long store exiting off to one side. (Later drawings depict a regularized grid layout.) Like the central processing unit (CPU) in a modern computer, the mill would rely upon its own internal procedures, roughly equivalent to microcode in modern CPUs, stored in the form of pegs inserted into rotating drums called "barrels", to carry out some of the more complex instructions the user's program might specify.

Reconstruction of Babbage's Analytical Engine, the first general-purpose programmable computer.

The programming language to be employed by users was akin to modern-day assembly languages. Loops and conditional branching were possible, and so the language as conceived would have been Turing-complete as later defined by Alan Turing. Three different types of punch cards were used: one for arithmetical operations, one for numerical constants, and one for load and store operations, transferring numbers from the store to the arithmetical unit or back. There were three separate readers for the three types of cards.

The machine was about a century ahead of its time. However, the project was slowed by various problems, including disputes with the chief machinist building parts for it. All the parts for his machine had to be made by hand - this was a major problem for a machine with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to difficulties not only of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by Federico Luigi, Conte Menabrea. This appears to be the first published description of programming.

Following Babbage, although unaware of his earlier work, was Percy Ludgate, an accountant from Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work that was published in 1909.

Analog computers

Sir William Thomson's third tide-predicting machine design, 1879-81.

In the first half of the 20th century, analog computers were considered by many to be the future of computing. These devices used the continuously changeable aspects of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved, in contrast to digital computers that represented varying quantities symbolically, as their numerical values change. As an analog computer does not use discrete values, but rather continuous values, processes cannot be reliably repeated with exact equivalence, as they can with Turing machines.[37]

The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson, later Lord Kelvin, in 1872. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location, and was of great utility to navigation in shallow waters. His device was the foundation for further developments in analog computing.

The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the more famous Lord Kelvin. He explored the possible construction of such calculators, but was stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output.

A Mk. I Drift Sight. The lever just in front of the bomb aimer's fingertips sets the altitude; the wheels near his knuckles set the wind and airspeed.

An important advance in analog computing was the development of the first fire-control systems for long range ship gunlaying. When gunnery ranges increased dramatically in the late 19th century it was no longer a simple matter of calculating the proper aim point, given the flight times of the shells. Various spotters on board the ship would relay distance measures and observations to a central plotting station. There the fire direction teams fed in the location, speed and direction of the ship and its target, as well as various adjustments for Coriolis effect, weather effects on the air, and other adjustments; the computer would then output a firing solution, which would be fed to the turrets for laying. In 1912, British engineer Arthur Pollen developed the first electrically powered mechanical analogue computer (called at the time the Argo Clock). It was used by the Imperial Russian Navy in World War I. The alternative Dreyer Table fire control system was fitted to British capital ships by mid-1916.

Mechanical devices were also used to aid the accuracy of aerial bombing. Drift Sight was the first such aid, developed by Harry Wimperis in 1916 for the Royal Naval Air Service; it measured the wind speed from the air, and used that measurement to calculate the wind's effects on the trajectory of the bombs. The system was later improved with the Course Setting Bomb Sight, and reached a climax with World War II bomb sights, the Mark XIV bomb sight (RAF Bomber Command) and the Norden[41] (United States Army Air Forces).

The art of mechanical analog computing reached its zenith with the differential analyzer,[42] built by H. L. Hazen and Vannevar Bush at MIT starting in 1927, which built on the mechanical integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious; the most powerful was constructed at the University of Pennsylvania's Moore School of Electrical Engineering, where the ENIAC was built.

By the 1950s the success of digital electronic computers had spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and 1960s, and later in some specialized applications.

Advent of the digital computer

The principle of the modern computer was first described by computer scientist Alan Turing, who set out the idea in his seminal 1936 paper, On Computable Numbers. Turing reformulated Kurt Gödel's 1931 results on the limits of proof and computation, replacing Gödel's universal arithmetic-based formal language with the formal and simple hypothetical devices that became known as Turing machines. He proved that some such machine would be capable of performing any conceivable mathematical computation if it were representable as an algorithm. He went on to prove that there was no solution to the Entscheidungsproblem by first showing that the halting problem for Turing machines is undecidable: in general, it is not possible to decide algorithmically whether a given Turing machine will ever halt.

He also introduced the notion of a 'Universal Machine' (now known as a Universal Turing machine), with the idea that such a machine could perform the tasks of any other machine, or in other words, it is provably capable of computing anything that is computable by executing a program stored on tape, allowing the machine to be programmable. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.

Electromechanical computers

The era of modern computing began with a flurry of development before and during World War II. Most digital computers built in this period were electromechanical - electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes.

The Z2 was one of the earliest examples of an electromechanical relay computer, and was created by German engineer Konrad Zuse in 1939. It was an improvement on his earlier Z1; although it used the same mechanical memory, it replaced the arithmetic and control logic with electrical relay circuits.[45]

Replica of Zuse's Z3, the first fully automatic, digital (electromechanical) computer.

In the same year, the electro-mechanical bombes were built by British cryptologists to help decipher German Enigma-machine-encrypted secret messages during World War II. The initial design of the bombe was produced in 1939 at the UK Government Code and Cypher School (GC&CS) at Bletchley Park by Alan Turing, with an important refinement devised in 1940 by Gordon Welchman. The engineering design and construction was the work of Harold Keen of the British Tabulating Machine Company. It was a substantial development from a device that had been designed in 1938 by Polish Cipher Bureau cryptologist Marian Rejewski, and known as the "cryptologic bomb" (Polish: "bomba kryptologiczna").

In 1941, Zuse followed up his earlier machine with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code and data were stored on punched film. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating point numbers. Replacement of the hard-to-implement decimal system (used in Charles Babbage's earlier design) by the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.[51] The Z3 was probably a complete Turing machine. In two 1936 patent applications, Zuse also anticipated that machine instructions could be stored in the same storage used for data - the key insight of what became known as the von Neumann architecture, first implemented in the British SSEM of 1948.

Zuse suffered setbacks during World War II when some of his machines were destroyed in the course of Allied bombing campaigns. Apparently his work remained largely unknown to engineers in the UK and US until much later, although at least IBM was aware of it, as it financed his post-war startup company in 1946 in return for an option on Zuse's patents.


In 1944, the Harvard Mark I was constructed at IBM's Endicott laboratories; it was a general-purpose electro-mechanical computer similar to the Z3, and was not quite Turing-complete.

Digital computation

The mathematical basis of digital computing was established by the British mathematician George Boole in his work The Laws of Thought, published in 1854. His Boolean algebra was further refined in the 1860s by William Jevons and Charles Sanders Peirce, and was first presented systematically by Ernst Schröder and A. N. Whitehead.

In the 1930s, working independently, American electronic engineer Claude Shannon and Soviet logician Victor Shestakov both showed a one-to-one correspondence between the concepts of Boolean logic and certain electrical circuits, now called logic gates, which are now ubiquitous in digital computers. They showed that relays and switches can realize the expressions of Boolean algebra. This thesis essentially founded practical digital circuit design.
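The correspondence Shannon and Shestakov identified can be illustrated directly: any Boolean expression maps onto a network of gates, each of which a relay circuit can realize. A minimal sketch (the gate names are the standard modern ones, not taken from either paper):

```python
def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

def NOT(a):
    return not a

def xor(a, b):
    # XOR built purely from AND/OR/NOT gates, as a relay
    # circuit could be wired:
    # a XOR b = (a AND NOT b) OR (NOT a AND b)
    return OR(AND(a, NOT(b)), AND(NOT(a), b))
```

Composing simple gates this way is exactly how complex digital circuits - adders, comparators, and ultimately whole processors - are designed.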

Electronic data processing

Atanasoff–Berry Computer replica on the first floor of the Durham Center, Iowa State University.

Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. Machines such as the Z3, the Atanasoff–Berry Computer, the Colossus computers, and the ENIAC were built by hand, using circuits containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium.

The engineer Tommy Flowers joined the telecommunications branch of the General Post Office in 1926. While working at the research station in Dollis Hill in the 1930s, he began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes.[38]

In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942,[57] the first electronic digital calculating device. The design was all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. However, its paper card writer/reader was unreliable, and work on the machine was discontinued. The machine's special-purpose nature and lack of a changeable, stored program distinguish it from modern computers.[59]

The electronic programmable computer

Colossus was the first electronic digital programmable computing device, and was used to break German ciphers during World War II.

During World War II, the British at Bletchley Park (40 miles north of London) achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes. They ruled out possible Enigma settings by performing chains of logical deductions implemented electrically. Most possibilities led to a contradiction, and the few remaining could be tested by hand.

The Germans also developed a series of teleprinter encryption systems, quite different from Enigma. The Lorenz SZ 40/42 machine was used for high-level Army communications, termed "Tunny" by the British. The first intercepts of Lorenz messages began in 1941. As part of an attack on Tunny, Max Newman and his colleagues helped specify the Colossus.

Tommy Flowers, still a senior engineer at the Post Office Research Station, was recommended to Max Newman by Alan Turing, and spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February.

Colossus was the world's first electronic digital programmable computer.[38] It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Colossus Mark I contained 1500 thermionic valves (tubes), but Mark II, with 2400 valves, was both 5 times faster and simpler to operate than Mark I, greatly speeding the decoding process. Mark II was designed while Mark I was being constructed. Allen Coombs took over leadership of the Colossus Mark II project when Tommy Flowers moved on to other projects.

Colossus was able to process 5,000 characters per second, with the paper tape moving at 40 ft/s (12.2 m/s; 27.3 mph). Sometimes two or more Colossus computers tried different possibilities simultaneously, in what is now called parallel computing, perhaps doubling the rate of comparison and speeding the decoding process.

Colossus included the first-ever use of shift registers and systolic arrays, enabling five simultaneous tests, each involving up to 100 Boolean calculations, on each of the five channels on the punched tape (although in normal operation only one or two channels were examined in any run). Initially Colossus was only used to determine the initial wheel positions used for a particular message (termed wheel setting). The Mark II included mechanisms intended to help determine pin patterns (wheel breaking). Both models were programmable using switches and plug panels in a way the Robinsons had not been.

ENIAC was the first Turing-complete electronic device, and performed ballistics trajectory calculations for the United States Army.

Without the use of these machines, the Allies would have been deprived of the very valuable intelligence that was obtained from reading the vast quantity of encrypted high-level telegraphic messages between the German High Command (OKW) and their army commands throughout occupied Europe. Details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill personally issued an order for their destruction into pieces no larger than a man's hand, to keep secret that the British were capable of cracking Lorenz SZ cyphers (from German rotor stream cipher machines) during the oncoming Cold War. Two of the machines were transferred to the newly formed GCHQ and the others were destroyed. As a result, the machines were not included in many histories of computing.[69] A reconstructed working copy of one of the Colossus machines is now on display at Bletchley Park.

The US-built ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US. Although the ENIAC was similar to the Colossus, it was much faster and more flexible. It was unambiguously a Turing-complete device and could compute any problem that would fit into its memory. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored-program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches.

It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High-speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power, and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. One of its major engineering feats was to minimize the effects of tube burnout, which was a common problem in machine reliability at that time. The machine was in almost constant use for the next ten years.

The stored-program computer

Early computing machines had fixed programs. For example, a desk calculator is a fixed-program computer. It can do basic mathematics, but it cannot be used as a word processor or a gaming console. Changing the program of a fixed-program machine requires re-wiring, re-structuring, or re-designing the machine. The earliest computers were not so much "programmed" as they were "designed". "Reprogramming", when it was possible at all, was a laborious process, starting with flowcharts and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically re-wiring and re-building the machine.[71]

With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation.
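The idea can be illustrated with a toy machine (a minimal sketch in Python; the instruction names LOAD, ADD, STORE, HALT and the memory layout are invented for illustration, not any historical instruction set). The key point is that the program lives in the same memory as the data, so changing the computation means changing memory contents rather than rewiring:

```python
# A toy stored-program machine (illustrative only): program and data
# share one memory, so "reprogramming" is just writing new memory
# contents -- no physical re-wiring is needed.

def run(memory):
    """Execute instructions of the form (opcode, operand) held in memory."""
    acc = 0      # accumulator
    pc = 0       # program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Memory holds both the program (cells 0-3) and the data (cells 4-6).
mem = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", 0),
       4: 2, 5: 3, 6: 0}
print(run(mem)[6])  # 5
```

Replacing the tuples in cells 0-3 changes what the machine computes, which is the essence of the stored-program design.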


Theory

Design of the von Neumann architecture (1947)

The theoretical basis for the stored-program computer had been laid by Alan Turing in his 1936 paper. In 1945 Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report ‘Proposed Electronic Calculator’ was the first specification for such a device.


Meanwhile, John von Neumann at the Moore School of Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945. Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the "von Neumann architecture". Turing presented a more detailed paper to the National Physical Laboratory (NPL) Executive Committee in 1946, giving the first reasonably complete design of a stored-program computer, a device he called the Automatic Computing Engine (ACE). However, the better-known EDVAC design of John von Neumann, who knew of Turing's theoretical work, received more publicity, despite its incomplete nature and questionable lack of attribution of the sources of some of the ideas.

Turing felt that speed and size of memory were crucial, and he proposed a high-speed memory of what would today be called 25 KB, accessed at a speed of 1 MHz. The ACE implemented subroutine calls, whereas the EDVAC did not, and the ACE also used Abbreviated Computer Instructions, an early form of programming language.

Manchester "Baby"

The Manchester Small-Scale Experimental Machine, nicknamed Baby, was the world's first stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[72]

The machine was not intended to be a practical computer but was instead designed as a testbed for the Williams tube, the first random-access digital storage device.[73]

Invented by Freddie Williams and Tom Kilburn[74][75] at the University of Manchester in 1946 and 1947, it was a cathode ray tube that used an effect called secondary emission to temporarily store electronic binary data, and was used successfully in several early computers.

Although the computer was considered "small and primitive" by the standards of its time, it was the first working machine to contain all of the elements essential to a modern electronic computer.[76] As soon as the SSEM had demonstrated the feasibility of its design, a project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer.

The SSEM had a 32-bit word length and a memory of 32 words. As it was designed to be the simplest possible stored-program computer, the only arithmetic operations implemented in hardware were subtraction and negation; other arithmetic operations were implemented in software. The first of three programs written for the machine found the highest proper divisor of 2^18 (262,144), a calculation that was known would take a long time to run (and so prove the computer's reliability) by testing every integer from 2^18 − 1 downwards, as division was implemented by repeated subtraction of the divisor. The program consisted of 17 instructions and ran for 52 minutes before reaching the correct answer of 131,072, after the SSEM had performed 3.5 million operations (for an effective CPU speed of 1.1 kIPS).
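The task of that first program can be sketched in modern terms (an illustrative reconstruction in Python, not the original 17 machine instructions): try every integer downward from 2^18 − 1, testing divisibility by repeated subtraction, since the SSEM's hardware offered only subtraction and negation:

```python
# Sketch of the first SSEM program's task: find the highest proper
# divisor of 2**18 by trying candidates downward from 2**18 - 1,
# with division done by repeated subtraction (the SSEM had no
# hardware divide).  A modern illustration of the method only.

def divides_by_repeated_subtraction(n, d):
    """Return True if d divides n exactly, using only subtraction."""
    while n >= d:
        n -= d
    return n == 0

def highest_proper_divisor(n):
    candidate = n - 1
    while not divides_by_repeated_subtraction(n, candidate):
        candidate -= 1
    return candidate

print(highest_proper_divisor(2**18))  # 131072
```

Every candidate from 262,143 down to 131,073 fails before 131,072 (= 2^17) succeeds, which is why the run deliberately exercised the machine for millions of operations.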


Manchester Mark 1

The Experimental machine led on to the development of the Manchester Mark 1 at the University of Manchester.[78] Work began in August 1948, and the first version was operational by April 1949; a program written to search for Mersenne primes ran error-free for nine hours on the night of 16/17 June 1949. The machine's successful operation was widely reported in the British press, which used the phrase "electronic brain" in describing it to their readers.

The computer is especially historically significant because of its pioneering inclusion of index registers, an innovation which made it easier for a program to read sequentially through an array of words in memory. Thirty-four patents resulted from the machine's development, and many of the ideas behind its design were incorporated in subsequent commercial products such as the IBM 701 and 702 as well as the Ferranti Mark 1. The chief designers, Frederic C. Williams and Tom Kilburn, concluded from their experiences with the Mark 1 that computers would be used more in scientific roles than in pure mathematics. In 1951 they started development work on Meg, the Mark 1's successor, which would include a floating point unit.

EDSAC

The other contender for being the first recognizably modern digital stored-program computer[79] was the EDSAC,[80] designed and constructed by Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England in 1949. The machine was inspired by John von Neumann's seminal First Draft of a Report on the EDVAC and was one of the first usefully operational electronic digital stored-program computers.

EDSAC ran its first programs on 6 May 1949, when it calculated a table of squares[82] and a list of prime numbers. The EDSAC also served as the basis for the first commercially applied computer, the LEO I, used by food manufacturing company J. Lyons & Co. Ltd. EDSAC 1 was finally shut down on 11 July 1958, having been superseded by EDSAC 2, which stayed in use until 1965.

EDVAC

ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's construction in August 1944, and design work for the EDVAC commenced at the University of Pennsylvania's Moore School of Electrical Engineering, before the ENIAC was fully operational. The design would implement a number of important architectural and logical improvements conceived during the ENIAC's construction and would incorporate a high-speed serial-access memory. However, Eckert and Mauchly left the project and its construction floundered.

It was finally delivered to the U.S. Army's Ballistics Research Laboratory at the Aberdeen Proving Ground in August 1949, but due to a number of problems, the computer only began operation in 1951, and then only on a limited basis.

Commercial computers

The first commercial computer was the Ferranti Mark 1, built by Ferranti and delivered to the University of Manchester in February 1951. It was based on the Manchester Mark 1. The main improvements over the Manchester Mark 1 were in the size of the primary storage (using random-access Williams tubes), secondary storage (using a magnetic drum), a faster multiplier, and additional instructions. The basic cycle time was 1.2 milliseconds, and a multiplication could be completed in about 2.16 milliseconds. The multiplier used almost a quarter of the machine's 4,050 vacuum tubes (valves). A second machine was purchased by the University of Toronto, before the design was revised into the Mark 1 Star. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.

In October 1947, the directors of J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951[87] and ran the world's first regular routine office computer job. On 17 November 1951, the J. Lyons Company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business application to go live on a stored-program computer.


In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than US$1 million each ($9.09 million as of 2015). UNIVAC was the first "mass produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words).

IBM introduced a smaller, more affordable computer in 1954 that proved very popular.[90] The IBM 650 weighed over 900 kg, the attached power supply weighed around 1,350 kg, and both were held in separate cabinets of roughly 1.5 meters by 0.9 meters by 1.8 meters. It cost US$500,000 ($4.39 million as of 2015) or could be leased for US$3,500 a month ($30 thousand as of 2015). Its drum memory was originally 2,000 ten-digit words, later expanded to 4,000 words. Memory limitations such as this were to dominate programming for decades afterward. The program instructions were fetched from the spinning drum as the code ran. Efficient execution using drum memory was provided by a combination of hardware architecture (the instruction format included the address of the next instruction) and software: the Symbolic Optimal Assembly Program, SOAP, assigned instructions to the optimal addresses (to the extent possible by static analysis of the source program). Thus many instructions were, when needed, located in the next row of the drum to be read, and additional wait time for drum rotation was not required.
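The optimization SOAP performed can be illustrated with a simplified model (a sketch only: the 50-words-per-band figure matches the IBM 650 drum layout, but the execution times and the placement rule here are idealized assumptions, not the real assembler's analysis). Since each instruction names its successor's address, the assembler tries to place that successor at the drum position that will be passing under the read head when the current instruction finishes, avoiding a wait for a full rotation:

```python
# Illustrative model of SOAP-style optimal drum placement.  Each
# instruction stores its successor's address, so the assembler places
# the successor where the drum will be when execution finishes.

DRUM_WORDS = 50          # words per drum band (assumed for illustration)

def place_program(exec_times):
    """Assign a drum address to each instruction in sequence.

    exec_times[i] is how many word-times instruction i takes to
    execute after being read (hypothetical values).  Returns the
    list of assigned addresses.
    """
    addresses = []
    addr = 0
    for t in exec_times:
        addresses.append(addr)
        # While the instruction at `addr` executes, the drum advances
        # t word positions; the ideal successor address is right there.
        addr = (addr + t) % DRUM_WORDS
    return addresses

print(place_program([3, 5, 2, 7]))  # [0, 3, 8, 10]
```

A naive assembler placing instructions at consecutive addresses would instead wait almost a full rotation between most instructions, which is the behavior SOAP was written to avoid.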

Microprogramming

In 1951, British scientist Maurice Wilkes developed the concept of microprogramming from the realisation that the central processing unit of a computer could be controlled by a miniature, highly specialised computer program in high-speed ROM. Microprogramming allows the base instruction set to be defined or extended by built-in programs (now called firmware or microcode). This concept greatly simplified CPU development. He first described this at the University of Manchester Computer Inaugural Conference in 1951, then published it in expanded form in IEEE Spectrum in 1955.[citation needed]

It was widely used in the CPUs and floating-point units of mainframe and other computers; it was implemented for the first time in EDSAC 2, which also used multiple identical "bit slices" to simplify design. Interchangeable, replaceable tube assemblies were used for each bit of the processor.
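The idea of microcode can be sketched as follows (a minimal illustration; the opcodes, micro-operation names, and control-store layout are hypothetical, not those of EDSAC 2 or any real machine). Each machine-level instruction is defined by a short sequence of micro-operations held in a fast control store, so the instruction set can be changed by rewriting the microcode rather than the hardware:

```python
# Minimal sketch of a microprogrammed CPU: each machine opcode is
# defined by a micro-program in a read-only control store, executed
# by a simple fixed interpreter.

# Control store: opcode -> sequence of micro-operations (names invented).
CONTROL_STORE = {
    "ADD": ["fetch_operand", "alu_add"],
    "SUB": ["fetch_operand", "alu_sub"],
}

def execute(opcode, state, operand):
    """Run the micro-program that defines one machine instruction."""
    for micro_op in CONTROL_STORE[opcode]:
        if micro_op == "fetch_operand":
            state["tmp"] = operand          # latch operand into a register
        elif micro_op == "alu_add":
            state["acc"] += state["tmp"]    # ALU add into the accumulator
        elif micro_op == "alu_sub":
            state["acc"] -= state["tmp"]    # ALU subtract
    return state

s = execute("ADD", {"acc": 10, "tmp": 0}, 5)
print(s["acc"])  # 15
```

Adding a new machine instruction here means adding one entry to CONTROL_STORE, which is why microprogramming simplified CPU development.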

Magnetic storage

Magnetic core memory. Each core is one bit.

By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, including the Williams tube. It went on to dominate the field through the mid-1970s.

A key feature of the American UNIVAC I system of 1951 was the implementation of a newly invented type of metal magnetic tape, and a high-speed tape unit, for non-volatile storage. Magnetic tape is still used in many computers. In 1952, IBM publicly announced the IBM 701 Electronic Data Processing Machine, the first in its successful 700/7000 series and its first IBM mainframe computer. The IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines.

IBM introduced the first disk storage unit, the IBM 350 RAMAC (Random Access Method of Accounting and Control), in 1956. Using fifty 24-inch (610 mm) metal disks, with 100 tracks per side, it was able to store 5 megabytes of data at a cost of US$10,000 per megabyte ($90 thousand as of 2015).

Early computer characteristics

Defining characteristics of some early digital computers of the 1940s:

Zuse Z3 (Germany)
  First operational: May 1941
  Numeral system: binary floating point
  Computing mechanism: electro-mechanical
  Programming: program-controlled by punched 35 mm film stock (but no conditional branch)
  Turing complete: in theory (1998)

Atanasoff–Berry Computer (US)
  First operational: 1942
  Numeral system: binary
  Computing mechanism: electronic
  Programming: not programmable (single purpose)
  Turing complete: no

Colossus Mark 1 (UK)
  First operational: February 1944
  Numeral system: binary
  Computing mechanism: electronic
  Programming: program-controlled by patch cables and switches
  Turing complete: no

Harvard Mark I – IBM ASCC (US)
  First operational: May 1944
  Numeral system: decimal
  Computing mechanism: electro-mechanical
  Programming: program-controlled by 24-channel punched paper tape (but no conditional branch)
  Turing complete: debatable

Colossus Mark 2 (UK)
  First operational: June 1944
  Numeral system: binary
  Computing mechanism: electronic
  Programming: program-controlled by patch cables and switches
  Turing complete: in theory (2011)

Zuse Z4 (Germany)
  First operational: March 1945
  Numeral system: binary floating point
  Computing mechanism: electro-mechanical
  Programming: program-controlled by punched 35 mm film stock
  Turing complete: yes

ENIAC (US)
  First operational: July 1946
  Numeral system: decimal
  Computing mechanism: electronic
  Programming: program-controlled by patch cables and switches
  Turing complete: yes

Manchester Small-Scale Experimental Machine (Baby) (UK)
  First operational: June 1948
  Numeral system: binary
  Computing mechanism: electronic
  Programming: stored-program in Williams cathode ray tube memory
  Turing complete: yes

Modified ENIAC (US)
  First operational: September 1948
  Numeral system: decimal
  Computing mechanism: electronic
  Programming: read-only stored programming mechanism using the Function Tables as program ROM
  Turing complete: yes

Manchester Mark 1 (UK)
  First operational: April 1949
  Numeral system: binary
  Computing mechanism: electronic
  Programming: stored-program in Williams cathode ray tube memory and magnetic drum memory
  Turing complete: yes

EDSAC (UK)
  First operational: May 1949
  Numeral system: binary
  Computing mechanism: electronic
  Programming: stored-program in mercury delay line memory
  Turing complete: yes

CSIRAC (Australia)
  First operational: November 1949
  Numeral system: binary
  Computing mechanism: electronic
  Programming: stored-program in mercury delay line memory
  Turing complete: yes

Transistor computers

The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors. Transistors have many advantages: they are smaller and require less power than vacuum tubes, so give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost. Typically, second-generation computers were composed of large numbers of printed circuit boards, such as the IBM Standard Modular System, each carrying one to four logic gates or flip-flops.

At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Initially the only devices available were germanium point-contact transistors, less reliable than the valves they replaced but which consumed far less power.[102] Their first transistorised computer, and the first in the world, was operational by 1953,[103] and a second version was completed there in April 1955.[104]

The 1955 version used 200 transistors, 1,300 solid-state diodes, and had a power consumption of 150 watts. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer.

That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The design featured a 64-kilobyte magnetic drum memory store with multiple moving heads that had been designed at the National Physical Laboratory, UK. By 1953 the team had transistor circuits operating to read and write on a smaller magnetic drum from the Royal Radar Establishment. The machine used a low clock speed of only 58 kHz to avoid having to use any valves to generate the clock waveforms.

CADET used 324 point-contact transistors provided by the UK company Standard Telephones and Cables; 76 junction transistors were used for the first-stage amplifiers for data read from the drum, since point-contact transistors were too noisy. From August 1956 CADET was offering a regular computing service, during which it often executed continuous computing runs of 80 hours or more.[108][109] Problems with the reliability of early batches of point-contact and alloyed junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available.

The Transistor Computer's design was adopted by the local engineering firm of Metropolitan-Vickers in their Metrovick 950, the first commercial transistor computer anywhere.[111] Six Metrovick 950s were built, the first completed in 1956. They were successfully deployed within various departments of the company and were in use for about five years.

A second-generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than ten thousand 1401s between 1960 and 1964.

Transistorized peripherals

Transistorized electronics improved not only the CPU (central processing unit), but also the peripheral devices. The second-generation disk data storage units were able to store tens of millions of letters and digits. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk pack could be easily exchanged with another pack in a few seconds. Even though the removable disks' capacity was smaller than that of fixed disks, their interchangeability guaranteed a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.

Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One databus would bear data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other databusses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second) because most operations took at least two memory cycles: one for the instruction, one for the operand data fetch.

During the second generation remote terminal units (often in the form of teleprinters like a Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected network of networks: the Internet.

Supercomputers

The University of Manchester Atlas in January 1963

The early 1960s saw the advent of supercomputing. The Atlas Computer was a joint development between the University of Manchester, Ferranti, and Plessey, and was first installed at Manchester University and officially commissioned in 1962 as one of the world's first supercomputers, considered to be the most powerful computer in the world at that time. It was said that whenever Atlas went offline, half of the United Kingdom's computer capacity was lost. It was a second-generation machine, using discrete germanium transistors. Atlas also pioneered the Atlas Supervisor, "considered by many to be the first recognizable modern operating system".

In the US, a series of computers at Control Data Corporation (CDC) were designed by Seymour Cray to use innovative designs and parallelism to achieve superior computational peak performance. The CDC 6600, released in 1964, is generally considered the first supercomputer. The CDC 6600 outperformed its predecessor, the IBM 7030 Stretch, by about a factor of three. With performance of about 1 megaFLOPS, the CDC 6600 was the world's fastest computer from 1964 to 1969, when it relinquished that status to its successor, the CDC 7600.

The integrated circuit

The next great advance in computing power came with the advent of the integrated circuit. The idea of the integrated circuit was conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952:

“With the advent of the transistor and the work on semi-conductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires.[122] The block may consist of layers of insulating, conducting, rectifying and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers.”

The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as “a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated.” The first customer for the invention was the US Air Force.

Noyce also came up with his own idea of an integrated circuit, half a year later than Kilby. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.

Post-1960 (integrated circuit based)

The explosion in the use of computers began with "third-generation" computers, making use of Jack St. Clair Kilby's and Robert Noyce's independent invention of the integrated circuit (or microchip). This led to the invention of the microprocessor. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.

While the earliest microprocessor ICs literally contained only the processor, i.e. the central processing unit, of a computer, their progressive development naturally led to chips containing most or all of the internal electronic parts of a computer. The Intel 8742, for example, is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O, all in the same chip.


During the 1960s there was considerable overlap between second- and third-generation technologies. IBM implemented its IBM Solid Logic Technology modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494. The Burroughs large systems such as the B5000 were stack machines, which allowed for simpler programming. These pushdown automatons were also implemented in minicomputers and microprocessors later, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business and universities. It became possible to simulate analog circuits with the Simulation Program with Integrated Circuit Emphasis, or SPICE (1971), on minicomputers, one of the programs for electronic design automation (EDA). The microprocessor led to the development of the microcomputer: small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond.

In April 1975 at the Hannover Fair, Olivetti presented the P6060, the world's first personal computer with built-in floppy disk: a central processing unit on two cards, code-named PUCE1 and PUCE2, with TTL components. It had one or two 8" floppy disk drives, a 32-character plasma display, an 80-column graphical thermal printer, 48 Kbytes of RAM, and BASIC language. It weighed 40 kg (88 lb). It was in competition with a similar product by IBM that had an external floppy disk drive.

The MOS Technology KIM-1 and Altair 8800 were sold as kits for do-it-yourselfers, as was the Apple I soon afterward. The first Apple computer with graphic and sound capabilities came out well after the Commodore PET. Computing has evolved with microcomputer architectures, with features added from their larger brethren, now dominant in most market segments.

Systems as complicated as computers require very high reliability. ENIAC remained on, in continuous operation from 1947 to 1955, for eight years before being shut down. Although a vacuum tube might fail, it would be replaced without bringing down the system. By the simple strategy of never shutting down ENIAC, the failures were dramatically reduced. The vacuum-tube SAGE air-defense computers became remarkably reliable: installed in pairs, one off-line, tubes likely to fail did so when the computer was intentionally run at reduced power to find them. Hot-pluggable hard disks, like the hot-pluggable vacuum tubes of yesteryear, continue the tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance is made even more stringent when server farms are the delivery platform. Google has managed this by using fault-tolerant software to recover from hardware failures, and is even working on the concept of replacing entire server farms on-the-fly, during a service event.

In the 21st century, multi-core CPUs became commercially available. Content-addressable memory (CAM) has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. During the 1980s, CMOS logic gates developed into devices that could be made as fast as other circuit types; computer power consumption could therefore be decreased dramatically. Unlike the continuous current draw of a gate based on other logic types, a CMOS gate only draws significant current during the 'transition' between logic states, except for leakage.

This has allowed computing to become a commodity which is now ubiquitous, embedded in many forms, from greeting cards and telephones to satellites. The thermal design power which is dissipated during operation has become as essential as computing speed of operation. In 2006 servers consumed 1.5% of the total energy budget of the U.S. The energy consumption of computer data centers was expected to double to 3% of world consumption by 2011. The SoC (system on a chip) has compressed even more of the integrated circuitry into a single chip; SoCs are enabling phones and PCs to converge into single hand-held wireless mobile devices. Computing hardware and its software have even become a metaphor for the operation of the universe.

Future

Although DNA-based computing and quantum computing are years or decades in the future, the infrastructure is being laid today, for example, with DNA origami on photolithography and with quantum antennae for transferring information between ion traps. By 2011, researchers had entangled 14 qubits. Fast digital circuits (including those based on Josephson junctions and rapid single flux quantum technology) are becoming more nearly realizable with the discovery of nanoscale superconductors.

Fiber-optic and photonic devices, which have already been used to transport data over long distances, are now entering the data center, side by side with CPU and semiconductor memory components. This allows the separation of RAM from CPU by optical interconnects. IBM has created an integrated circuit with both electronic and optical (photonic) information processing in one chip, denoted "CMOS-integrated nanophotonics" (CINP). One benefit of optical interconnects is that motherboards which formerly required a certain kind of system on a chip (SoC) can now move formerly dedicated memory and network controllers off the motherboards, spreading the controllers out onto the rack. This allows standardization of backplane interconnects and motherboards for multiple types of SoCs, which allows more timely upgrades of CPUs.

An indication of the rapidity of development of this field can be inferred from the history of the seminal article. By the time anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC and immediately started implementing their own systems. To this day, the pace of development has continued worldwide.


The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage. He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based.

Generally speaking, computers can be classified into three generations. Each generation lasted for a certain period of time, and each gave us either a new and improved computer or an improvement to the existing computer.

First generation: 1937–1946. In 1937 the first electronic digital computer was built by Dr. John V. Atanasoff and Clifford Berry. It was called the Atanasoff-Berry Computer (ABC). In 1943 an electronic computer named the Colossus was built for the military. Other developments continued until in 1946 the first general-purpose digital computer, the Electronic Numerical Integrator and Computer (ENIAC), was built. It is said that this computer weighed 30 tons and had 18,000 vacuum tubes, which were used for processing. When this computer was turned on for the first time, lights dimmed in sections of Philadelphia. Computers of this generation could only perform a single task, and they had no operating system.

Second generation: 1947–1962. This generation of computers used transistors instead of vacuum tubes, which were more reliable. In 1951 the first computer for commercial use was introduced to the public: the Universal Automatic Computer (UNIVAC 1). In 1953 the International Business Machine (IBM) 650 and 700 series computers made their mark in the computer world. During this generation of computers over 100 computer programming languages were developed, and computers had memory and operating systems. Storage media such as tape and disk were in use, as were printers for output.

Third generation: 1963–present. The invention of the integrated circuit brought us the third generation of computers. With this invention computers became smaller, more powerful, and more reliable, and they are able to run many different programs at the same time. In 1980 the Microsoft Disk Operating System (MS-DOS) was born, and in 1981 IBM introduced the personal computer (PC) for home and office use. Three years later Apple gave us the Macintosh computer with its icon-driven interface, and the 1990s gave us the Windows operating system.

As a result of the various improvements to the development of the computer, we have seen the computer being used in all areas of life. It is a very useful tool that will continue to experience new development as time passes.
