Abaci and Counting Boards - The ancient world saw a proliferation of abaci and counting boards which used pebbles, beads, and other markers as aids to calculation. The earliest believed use is attributed to the Babylonians in the third millennium BC. The emergence of these devices represents a significant evolutionary leap from simple counting to calculation.
Astrolabes - There is some uncertainty surrounding the early development of the astrolabe, but the invention is often attributed to the Greek astronomer and mathematician Hipparchus of Nicaea ca. 150 BC. An astrolabe is an astronomical calculator made from layered movable discs that enabled the user to find local time and calculate a range of celestial events including sunrise and sunset, and the positions of stars. Astrolabes of various construction were in widespread use in Europe and the Islamic world well into the 18th century.
Antikythera Mechanism - Discovered in an ancient shipwreck and dated to between 200 and 100 BC, the Antikythera Mechanism is an astronomical calculator. Driven by a complex clockwork of about 30 gears, the device could calculate and display planetary movement and accurately predict eclipses centuries into the future.
Vitruvius' Odometer - Marcus Vitruvius Pollio, the Roman author, architect, and engineer, described an odometer in his work "The Ten Books on Architecture" ca. 25 BC. The device he described was likely in use about 200 years earlier, with its invention generally attributed to the Greek mathematician, inventor, and engineer Archimedes of Syracuse in the third century BC. The odometer used a pin on the hub of a chariot wheel to drive a pair of gears configured to drop a pebble into a bucket every 400 rotations of the wheel, thus measuring a Roman mile.
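The arithmetic behind the 400-rotation gearing can be checked directly. Vitruvius' text gives a chariot wheel four Roman feet in diameter; the figures below follow that historical reconstruction and should be read as an illustration, not an exact specification:

```python
import math

# Vitruvius described a chariot wheel 4 Roman feet in diameter
# (a figure from his text; a historical reconstruction).
wheel_diameter_ft = 4.0
rotations_per_pebble = 400

# Distance covered per pebble dropped: circumference x rotations.
distance_ft = math.pi * wheel_diameter_ft * rotations_per_pebble
print(f"{distance_ft:.0f} Roman feet per pebble")
# ~5027 Roman feet, close to the 5,000 Roman feet of one Roman mile.
```

The small overshoot suggests the nominal four-foot wheel was a convenient round figure rather than a precision-calibrated dimension.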
Mechanical Clock - Liang Lingzan, a Chinese Tang Dynasty military engineer, and Yi Xing, a monk and mathematician, invented the mechanical clock ca. 725 CE. Driven by a waterwheel engineered to make one complete rotation per day, a system of gears enabled the clock to provide both astronomical data and time.
The Banu Musa's Mechanical Organ - Published ca. 850 CE by three Persian brothers known as the Banu Musa, the Book of Ingenious Devices includes a description of automatic musical instruments that use interchangeable cylinders with raised teeth to play a variety of tunes.
Napier's Bones - Scottish mathematician, physicist, and astronomer John Napier published his treatise Rabdologiae in 1617, which includes a description of his invention, Napier's Bones. The invention utilized movable rods with inscribed numbers that facilitated multiplication, division, and the calculation of roots.
Slide Rule - The invention of the slide rule is credited to English mathematician William Oughtred, who in 1630 used the emerging knowledge of logarithms to create an analog calculator capable of directly solving multiplication and division problems. Oughtred first designed a circular device and two years later produced a linear version. The basic technique of sliding two logarithmic scales into alignment would remain in active use for centuries, until the arrival of the electronic scientific calculator in the 1970s.
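The principle the slide rule exploits is that adding lengths on logarithmic scales multiplies the underlying numbers, since log(a) + log(b) = log(ab). A minimal sketch of that identity in action:

```python
import math

def slide_rule_multiply(a, b):
    """Emulate a slide rule: align two logarithmic scales so their
    distances add, then read the product back off the scale."""
    # Sliding one scale against the other physically adds the two
    # logarithmic distances.
    combined_offset = math.log10(a) + math.log10(b)
    # Reading the aligned mark converts the distance back to a number.
    return 10 ** combined_offset

print(slide_rule_multiply(3, 4))  # ~12, to floating-point precision
```

A physical slide rule gives perhaps three significant figures; the computation here is exact only because floating point replaces the operator's eye.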
Pascal's Calculator - Starting work at the age of 19, French mathematician Blaise Pascal developed his mechanical calculator between the years 1642 and 1654, producing roughly 50 prototypes and selling 20 machines in various configurations. Borrowing heavily from advances in clock making, Pascal's Calculator or Pascaline used a system of gears to automatically add two numbers. The Pascaline had spoked wheels representing each digit, each marked with the numerals 0-9. The operator would input numbers using a stylus to rotate the wheels accordingly, and the result would appear in display windows associated with each digit. Among the calculator's significant features was the invention of the sautoir, a carry mechanism designed to effortlessly propagate multiple carries.
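The sautoir itself was a gravity-assisted mechanical linkage rather than arithmetic, but the behavior it implemented, each wheel passing a carry to its neighbor, is what a ripple-carry addition models. A sketch, assuming numbers held as lists of decimal digits:

```python
def pascaline_add(a_digits, b_digits):
    """Add two numbers held as lists of decimal digits (least
    significant first), propagating carries wheel to wheel."""
    result, carry = [], 0
    for a, b in zip(a_digits, b_digits):
        total = a + b + carry
        result.append(total % 10)   # this wheel's displayed digit
        carry = total // 10         # carry passed to the next wheel
    return result, carry

# 999 + 1: the worst case, where a single carry ripples through
# every wheel in turn, the case the sautoir was built to handle.
print(pascaline_add([9, 9, 9], [1, 0, 0]))  # ([0, 0, 0], 1)
```

On earlier machines this cascade demanded growing force from the operator; Pascal's mechanism stored the energy for each carry in advance, so a full ripple cost no extra effort.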
The Stepped Reckoner - Between 1672 and 1694, German mathematician Gottfried Wilhelm Leibniz developed the first calculator that could perform addition, subtraction, multiplication, and division. The device was called the Stepped Reckoner after its core innovation, a stepped geared drum now commonly called the Leibniz wheel.
The Jacquard Loom - French weaver Joseph Marie Jacquard first demonstrated his automatic loom in 1801. The Jacquard loom used a chain of punched cards, paper cards with a series of holes, to control the weaving of intricate patterns and designs. The punched card method of programming would prove to feature prominently in the advancement of computing, and was implemented in numerous machines from the earliest difference engines into the 1980s.
Babbage's Difference and Analytic Engines - From 1822 until his death in 1871, English mathematician Charles Babbage labored over the designs of two machines. His difference engine was intended to automate the construction of astronomical and mathematical tables, which at the time was a costly and error-prone enterprise. Programmed by punched cards, his analytic engine was a general purpose calculator with many elements of a modern computer: an arithmetic logic unit, conditional branching and looping, and integrated memory. The machines would have been room-sized mechanical behemoths, but neither was completed due to funding and other problems. Nonetheless, Babbage's work was well known and very influential, and he is generally regarded as one of the pioneers of modern computing.
Scheutzian Calculation Engine - In 1843 the Swedish father and son team of Per Georg Scheutz and Edvard Scheutz completed a mechanical calculator based on the Babbage difference engine.
Boolean Algebra - In 1847 English philosopher and logician George Boole published the pamphlet The Mathematical Analysis of Logic. In the pamphlet Boole presented a symbolic system in which logical propositions are expressed and manipulated as algebraic equations. Now known as Boolean Algebra, Boole's system is a critical technique in the design of digital logic circuits.
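Boole's insight, that logical propositions behave like algebraic equations, can be illustrated by verifying an identity such as De Morgan's law over every truth assignment, and by noting that the same algebra describes a logic gate:

```python
from itertools import product

# De Morgan's law, one of the identities Boole's algebra makes
# provable symbolically: not (a and b) == (not a) or (not b).
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))

# The same algebra underlies digital circuits: a NAND gate is
# just a Boolean function of two inputs.
def nand(a, b):
    return not (a and b)

print(nand(True, True))   # False
print(nand(True, False))  # True
```

Because every Boolean function can be built from NAND alone, this one-line function is, in principle, a complete basis for digital logic.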
Logic Piano - Inspired by Boole, English economist and logician William Stanley Jevons built a logic piano in 1869. Similar in look and operation to the musical instrument (down to the black and white keys), Jevons' invention was a mechanical logical computer that used a system of levers and pulleys to implement Boolean truth tables. The operator would enter truth table propositions via the keyboard, and untrue conditions would be automatically removed from the display.
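The machine's mode of operation, eliminating term combinations inconsistent with each entered proposition, amounts to filtering a truth table. A sketch with two terms; the premise "A implies B" is an illustrative example, not one of Jevons' own demonstrations:

```python
from itertools import product

# Start with every combination of the terms on "display",
# as the logic piano did.
combinations = set(product([False, True], repeat=2))

# Entering a proposition removes the combinations that
# contradict it. Here the premise is "A implies B".
def premise(a, b):
    return (not a) or b

remaining = {(a, b) for a, b in combinations if premise(a, b)}
print(sorted(remaining))
# (True, False) has been eliminated: A true with B false
# contradicts "A implies B".
```

Jevons' machine did this mechanically for four terms (16 combinations), with levers suppressing the eliminated rows from view.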
Vacuum Tube - English electrical engineer and physicist John Ambrose Fleming invented the vacuum tube (also called the valve in British English) in 1904. The invention improved the ability to control electric current, ushering in the electronics age. Key to the advancement of computer technology would be the development of vacuum tube switches and oscillators.
Digital Circuit Design Theory - American mathematician Claude Shannon, while at MIT in 1937, laid the foundation for digital circuit design in his master's thesis, demonstrating that Boolean algebra can be employed in the design of electronic circuits that can express any logical or mathematical relationship.
The Bell Laboratories Model I - Originally called the Complex Calculator, the Model I was demonstrated by George Stibitz to the American Mathematical Society in September 1940. Built from over 400 electromagnetic relays, the machine could perform calculations on two-part complex numbers. The demonstration was conducted remotely with a keyboard and teletype at Dartmouth College in New Hampshire, and the calculator itself at a Bell Laboratories office in New York.
Colossus - Led by English engineer Tommy Flowers, a sequence of 10 Colossus machines were built to decrypt German communications during World War II (1943-1945). Although a special-purpose machine, the Colossus is considered the first programmable electronic digital computer. It used vacuum tubes and a similar device called a thyratron for Boolean and counting operations, and was programmable by the setting of plugs and switches.
Harvard Mark I - Conceived by Harvard professor Howard Aiken and built by IBM, the Mark I was turned over to Harvard in 1944. Relying on electro-mechanical switches and relays, as well as shafts and clutches, the Mark I weighed 10,000 pounds. The most notable use of the machine was in assisting with the calculations needed to support the Manhattan Project and the construction of the first atomic bombs. (see Grace Hopper - Matriarch of Programming)
ENIAC - The Electronic Numerical Integrator And Computer (ENIAC), built between 1943 and 1946, was the first fully electronic general purpose computer. With more than 17,000 vacuum tubes the computer was 1,000 times faster than its mechanical predecessors. ENIAC was built in secret during the war years, but afterwards captured the public imagination. It was widely reported on in the popular press and was known as the Giant Brain.
The Transistor - American physicists John Bardeen, Walter Brattain, and William Shockley, working at Bell Labs in 1947, invented the transistor. Smaller, more efficient, and more reliable than the vacuum tube, the transistor would quickly become ubiquitous in electronics.
Manchester Baby - Built in 1948 at the University of Manchester, the Baby used a cathode ray tube as a 128-byte memory in which to store a program, making it the first computer to store its program electronically.
Whirlwind I - Built by MIT for the U.S. Navy, Whirlwind I was completed in 1951. The machine represented many technical advances related to memory and architecture. It is most notable for its user interface being the first device to use a keyboard for input and cathode ray tube monitor for output.
EDVAC - Built by the University of Pennsylvania's Moore School of Electrical Engineering for the U.S. Army's Ballistics Research Laboratory, the Electronic Discrete Variable Automatic Computer (EDVAC) became operational in 1951. EDVAC was one of the earliest fully electronic computers. Noteworthy features include its use of binary rather than decimal math, and its use of stored program methodologies.
The Integrated Circuit - In 1952, British engineer Geoffrey Dummer described the concept of fabricating multiple circuit components by cutting away from a single block composed of layers of various materials such as semiconductors and insulators. Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments were awarded patents for integrated circuit designs in 1961 and 1964 respectively. The two men are considered to have developed their methods independently.
Fortran - IBM delivered the first compiled, high level programming language, Fortran, in 1957. Still in use today, Fortran provided an efficient and effective alternative to the more cumbersome process of assembly or machine language programming.
ATLAS - Built as a joint venture between academe and industry at the University of Manchester, the ATLAS computer became operational in 1962. Considered to be one of the first true supercomputers, ATLAS brought together a number of leading edge hardware technologies, such as germanium transistor switching circuits and magnetic core memories, along with significant architectural innovations such as spooling, interrupts, and pipelining.
BASIC - Dartmouth College professors John Kemeny and Thomas Kurtz created the Beginner's All-purpose Symbolic Instruction Code (BASIC) programming language in 1964 with the express goal of making computer use accessible to a broader range of potential users.
DEC PDP-8 - In 1965, Digital Equipment Corporation (DEC) released the Programmed Data Processor (PDP)-8. About the size of a refrigerator and priced at about $16,000, the PDP-8 was the first commercially successful machine of a generation of minicomputers. DEC sold more than 50,000 PDP-8s, launching an era in which access to computing technology exploded among smaller business, government, scientific and academic enterprises.
ARPANET - In 1969 the first message was transmitted using the U.S. military's Advanced Research Projects Agency Network (ARPANET). Designed for robust communications among widely distributed computers, the system used a packet-switching methodology and various communications protocols which would eventually mature into the Internet.
Intel 4004 - Intel produced the first commercially available microprocessor in 1971. The 4004 was a complete 4-bit central processing unit (CPU) in one integrated circuit.
Altair 8800 - Released in conjunction with a 1975 article in Popular Electronics, the Altair 8800 was sold as a kit for hobbyists. The computer was built around the Intel 8080, and designed by Ed Roberts and Forrest M. Mims III. Far exceeding expectations and selling about 10,000 units, the machine was the first commercially successful home computer.
Microsoft - Paul Allen and Bill Gates developed a version of BASIC for the Altair 8800 in 1975 and founded their company, Micro-Soft, later that year. From such humble beginnings, Microsoft would become a software juggernaut of the microcomputer revolution.
Apple I - Designed and hand-built by Steve Wozniak, the first Apple computer went on sale in 1976. The Apple I was built around the MOS Technology 6502 8-bit microprocessor. Unlike previous home/hobby computers, the circuit board was fully assembled, but the computer did still require additional components and assembly to provide a working system.
Commodore PET, Apple II and TRS-80 - 1977 saw a proliferation of computers designed for home use, including the Commodore PET and Apple II based on the MOS 6502, and Tandy's TRS-80 based on Zilog's Z80 8-bit microprocessor. These computers were ready to go with keyboards, monitors, and cassette drive storage.
IBM Personal Computer - IBM released its IBM Personal Computer in 1981, selling about 100,000 in the first year. Although considered by some as late to market, the machines were powerful and well-equipped, effectively bridging the gap between hobbyist and business machines. IBM introduced an open architecture that would eventually see it lose significant market share to clone manufacturers, but that nonetheless helped the platform establish a dominant position and serve as the basis for computer design for decades to come.
World Wide Web - English computer scientist Tim Berners-Lee and a team at the European Organization for Nuclear Research (CERN) are generally credited with the invention of the World Wide Web in the 1989-1990 timeframe. Encompassing the development of the HyperText Markup Language (HTML) and the Hypertext Transfer Protocol (HTTP), the Web would become the primary method of both Internet authorship and access. Although originally centered on textual document sharing, the Web ultimately redefined modern computing as a platform for communication and entertainment, providing global access to diverse media including music and video.
iPhone - Apple Inc. released the first iPhone in 2007. The device combined cell phone communications capabilities with a fully functional Internet-centric computer. The usability and portability of the device marked a watershed moment in the trend toward ubiquitous computing. Additionally, the advent of so-called smartphones opened personal computing to entirely new populations around the world.