It’s not like the cochlear implant or black box flight recorder. The Internet had a long, varied and occasionally ad hoc genesis, as Drew Turney learns.
Befitting the tool that revolutionised human communication in the latter years of the 20th century, the Internet wasn’t the result of a lone, dedicated inventor sitting in a lab or bathtub before leaping to his feet shouting ‘Eureka!’
Few inventions really emerge according to that mythical motif anyway, but even if they do, the iteration that overturns an industry or changes the world is usually very different from the one that generated the light bulb moment. More often than not it’s also in the hands of different owners — inventors aren’t usually very good at marketing.
The technologies that underpin the Internet not only came from very different places for very different reasons, but they were established (in some cases after being held back) over the course of three decades before they combined to give us the doorway to the world we know from our PCs and phones today.
The Network
As any purist will tell you, the Internet isn’t a collection of computers, it’s the infrastructure that connects them, the network itself.
Joseph Carl Robnett Licklider was a computer scientist who specialised in physics, mathematics, psychology and — most importantly — systems interface theory. One of his most influential papers was 1960’s Man-Computer Symbiosis, which called for much simpler interaction between people and computers.
Licklider (or ‘Lick’, as he was known) was also a pioneer in the field of psychoacoustics — sound perception — and one of the earliest to theorise the possibility of networking computers together to share data.
Not only did he fund early network research that led to the creation of the ARPANet, Licklider also outlined his ideas for a connected world in a series of memos in 1962 in which he (clearly a sci-fi geek of his time) discussed the ‘Intergalactic Computer Network’. His 1968 paper The Computer as a Communication Device described many of the applications he saw for computer networks — many of which have since come to pass.
Bomb proof
At the height of the Cold War in the 1960s, the US defence brass was scared. If the Soviets knocked out one critical node of military infrastructure with a nuclear strike, it could seriously hamper emergency measures or the rebuilding of the economy.
The answer was a network of conduits between terminals rather than a single line linking each pair. The data could then travel as far as it needed in any direction through any number of interchanges to reach its destination. To destroy the network, you’d have to locate and cut every single conduit, a big job considering the millions of miles of copper cabling in the ground or (as we have today) electromagnetic signals in the air.
The computer infrastructure of the US military was particularly vulnerable in the days when a single computer weighed three hundred tons and took up 20,000 square feet of floor space, as the IBM AN/FSQ-7 did (the US Air Force bought 56 of them). That was the conundrum facing a RAND Corporation researcher named Paul Baran in the early ’60s.
Baran came up with the ingenious idea of breaking information into packets and sending them through a device — the router — that could send each packet off in any direction through any number of interchanges to be reassembled at the other end.
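To get a feel for the idea, here is a toy sketch (nothing like a real router or the protocols that later grew out of Baran’s work) of a message being split into numbered packets, routed independently so they can arrive in any order, and reassembled at the destination:

```python
import random

def split_into_packets(message, size=8):
    """Break a message into numbered packets, as a packet-switched network does."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"seq": n, "total": len(chunks), "data": chunk}
            for n, chunk in enumerate(chunks)]

def route(packets):
    """Each packet can take its own path through the network,
    so they may turn up at the far end in any order."""
    arrived = list(packets)
    random.shuffle(arrived)
    return arrived

def reassemble(packets):
    """The receiver sorts the packets back into sequence to recover the message."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    assert len(ordered) == ordered[0]["total"], "a packet went missing"
    return "".join(p["data"] for p in ordered)

packets = split_into_packets("Cutting one line no longer cuts the conversation.")
print(reassemble(route(packets)))
```

Because no packet cares which path it took, knocking out any single link simply means the traffic finds another way through.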
Packet switching might have come about far earlier, but at the time the US telecommunications monopoly AT&T owned the nation’s long-distance telephone network, and even with the Air Force offering to pay for an experimental trial, AT&T saw Baran’s idea as a grave threat to its iron-fisted control of the information flow.
Maschinensprache
In 1973, there were three data networks running on AT&T lines — ARPANet, a private radio network and a private satellite network. Each one had emerged independently of the others and used very different underlying programming. Computer scientists Vint Cerf and Robert Kahn were given the task of creating a language that could make those three networks talk to each other.
Little did Cerf or Kahn realise, but one day billions of networks and systems of every conceivable type would be talking on the same network. At the time they just needed a stop-gap solution for the problem in front of them.
The task called for a new paradigm in which information could be transmitted across wires or infrastructure that didn’t belong to those doing the transmitting. Cerf and Kahn’s language had to be commercially, linguistically and technologically neutral. The result was the Transmission Control Protocol, or TCP — a method of wrapping information from a local network in a platform-agnostic ‘envelope’ the rest of the network can recognise and send to the right place.
It’s much like the way the post office can deliver a real envelope according to the address on the front, regardless of the language of the contents. It’s also the reason why, whether you’re on a Mac, a PC, a Sun workstation running Solaris or a Commodore VIC-20, you can send data to every other Internet-connected device there is.
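To make the envelope analogy concrete, here is a simplified sketch (the real TCP and IP headers carry far more than this, and the field names below are invented purely for illustration) of local data being sealed in a neutral wrapper that any network in between can forward without understanding the contents:

```python
import json

def wrap(payload: bytes, src: str, dst: str) -> bytes:
    """Seal local data inside a platform-agnostic 'envelope'.
    The header only says where the data comes from, where it is going and
    how big it is; the networks in between never read the payload itself."""
    header = {"from": src, "to": dst, "length": len(payload)}
    return json.dumps(header).encode() + b"\n" + payload

def unwrap(envelope: bytes):
    """At the destination, open the envelope and hand back the original data."""
    header_line, payload = envelope.split(b"\n", 1)
    header = json.loads(header_line)
    assert header["length"] == len(payload), "payload arrived damaged"
    return header, payload

envelope = wrap(b"hello from one network to another", src="10.0.0.7", dst="192.0.2.20")
print(unwrap(envelope))
```

Every hop along the way only has to read the address on the front of the envelope, which is exactly the kind of neutrality Cerf and Kahn were after.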
Going into hyperdrive
As conventional wisdom contends, the Internet has been around since the 1960s. But for decades the sending and receiving of files or data was the job of specialised technicians, scientists or academics. Never mind that the data was crushingly boring military or academic information; the average citizen could no more operate a computer than they could sprout wings and fly.
What the world needed was a file type that could take advantage of interface technologies like the mouse to point a user to information on another system in the network. No calling up long strings of numbers or arcane programming languages — just point and click.
Tim Berners-Lee, a British physicist working at the European Organization for Nuclear Research (CERN) in Geneva, wanted to put together a resource for himself and his fellow scientists to share and update information about projects. He took an existing technology called hypertext — which encodes a link between data in two separate files — and quite literally put it online. CERN had a large intra-organisational network of documentation much like a modern wiki, and on Christmas Day 1990, Berners-Lee and his collaborators sent the first successful message from a networked computer to a client-based hypertext reader on another system.
The client-based hypertext reader would morph into the application we now know as a browser, and today following the links that point you to other files on the network — anything from web pages (HTML files) to music tracks, PDF files and movies — is as easy as clicking your mouse.
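Stripped right back, that is still what a browser does on every click: fetch a document and find the hypertext links it points to. Here is a rough sketch using Python’s standard library, pointed at info.cern.ch, the address of the very first web server, which CERN still keeps online (any address will do):

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkFinder(HTMLParser):
    """Collect the href target of every <a> tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

# Fetch a page and list the documents its hypertext points to,
# the same follow-the-link step a browser performs whenever you click.
with urlopen("http://info.cern.ch/") as response:
    finder = LinkFinder()
    finder.feed(response.read().decode("utf-8", errors="replace"))

print(finder.links)
```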