I picked this book up wanting a broader understanding of the historical foundations of the web and the internet. It's a comprehensive introduction, although Isaacson does a fair amount of glorification and Grand Narrative storytelling without much nuanced critique in the mix.
It's obvious we're getting the rose-tinted version of history. Which is fine, as long as we're aware it's a particular kind of narrative, and not the only version of this story.
These notes are only from chapters 7, 10, and 11 that delve into the Internet and the Web. The rest of the book covers other computing innovations like transistors, microchips, and personal computing.
Vannevar Bush laid the institutional groundwork for The Internet. A professor at MIT at the time, he helped corral together military, academic, and industrial corporate interests.
Bush was deeply involved in US military projects. He personally convinced Franklin Roosevelt to establish the National Defense Research Committee, then oversaw The Manhattan Project's development of the atomic bomb. He also co-founded Raytheon - an enormous defence contractor based in Boston.
Bush strongly believed in government funding of scientific research and innovation. His reports to the White House had titles like “Science, The Endless Frontier”, echoing the quintessentially American belief in Manifest Destiny. Quite the fan of your classic notions of "progress", in the mode of the early 20th century white dude America.
The Department of Defense and the National Science Foundation became the primary funders of The Internet. Most of the research work was done inside a handful of corporate and academic centres - Bell Labs, the RAND Corporation, the Stanford Research Institute, Lincoln Laboratory, and Xerox PARC.
JCR Licklider is a critical figure in the history of the internet. He championed two key ideas: Decentralised networks that enabled information distribution, and interfaces that facilitated human-machine interaction in real time. The beginnings of Human-Computer Interaction.
JCR Licklider ended up at MIT studying with Norbert Wiener and his newly established field of Cybernetics - “any system... that learned through communications, control, and feedback loops”
“Norbert Wiener believed the most promising path for computer science was to devise machines that would work well with human minds rather than try to replace them" #Collaborative AI
In 1960 JCR Licklider published a paper called “Man-Computer Symbiosis” that essentially described a Neuralink-like Brain-Computer Interface.
He hoped that “human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today."
A very Cyborg vision, that carries on today in Tools for Thought and Collaborative AI
JCR Licklider and Norbert Wiener saw Human-Computer Interaction as a collaborative effort leading to Cyborg-like beings, while their MIT colleagues Marvin Minsky and John McCarthy framed the future as one where Artificial Intelligence replaced human minds altogether.
The difference is collaboration versus replacement. Our ongoing anxieties around Automation and the “robots are stealing our jobs” narrative come back to this divide.
In 1957 JCR Licklider joined the research firm Bolt, Beranek, and Newman (BBN), where he outlined a vision for “Libraries of the Future”, a concept that resembled the modern web - “a huge database of information that was curated and added to so that it doesn’t get too diffuse, overwhelming, or unreliable”
Interestingly, JCR Licklider recognised that digital screens lack many of the wonderful affordances and qualities of the printed page. He pointed out that print “affords enough resolution to meet the eye's demand. It presents enough information to occupy the reader for a convenient quantum of time. It offers great flexibility of font and format. It lets the reader control the mode and rate of inspection. It is small, light, moveable, cuttable, clippable, pastable, replicable, disposable, and inexpensive”
In 1962 he joined ARPA - the Defense Department's Advanced Research Projects Agency, later renamed DARPA. It was a university-military collaboration.
ARPANET, the precursor to the internet, originally used leased phone lines and favoured a Decentralised approach where information could pass between nodes until it reached its destination. #Decentralised Web
ARPANET morphed into The Internet in the 1980s and 1990s
The first message sent over ARPANET, in 1969, travelled from UCLA to the Stanford Research Institute. It was meant to be “LOGIN”, but the system crashed after the first two characters, making the actual first message “Lo” - as in “Lo and behold”.
The Internet was born out of the need to time-share expensive and scarce computational power.
To save the connected computers the hassle of routing traffic in and out, they designed “mini computers” at each location that would be responsible for it - aka “routers”.
This first iteration broke messages down into “packets” - bite-sized units of the same size. In Packet Switching, each little packet is assigned its destination address, then jumps from node to node along the most efficient available path. The packets are reassembled at the destination.
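That split-address-route-reassemble cycle can be sketched in a few lines of Python. This is a toy illustration only - real packets carry binary headers defined by the IP specification, and the packet size, field names, and "SRI" destination here are invented for the example:

```python
import random

PACKET_SIZE = 4  # bytes of payload per packet; an arbitrary toy value

def to_packets(message: str, dest: str) -> list[dict]:
    """Break a message into same-sized chunks, each tagged with a
    destination address and a sequence number."""
    return [
        {"dest": dest, "seq": i, "data": message[i:i + PACKET_SIZE]}
        for i in range(0, len(message), PACKET_SIZE)
    ]

def reassemble(packets: list[dict]) -> str:
    """Sort by sequence number and stitch the payloads back together."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = to_packets("Lo and behold", dest="SRI")
random.shuffle(packets)    # each packet may take its own route and arrive out of order
print(reassemble(packets))  # → Lo and behold
```

Because every packet carries its own address and sequence number, no single path through the network needs to stay up for the whole message - which is exactly what makes the scheme robust.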
From the beginning, The Internet was designed as a Decentralised and Distributed system - each node has equal power to switch and route the flow of data. This made it robust in the face of attacks or network failures.
“This would become the defining trait of the internet, the ingrained attribute that would allow it to empower individuals and make it resistant to centralised control"
RFC - Request for Comments - was a format for soliciting feedback in a friendly, collaborative way that Stephen Crocker initiated among early internet working groups in 1969.
“the RFC process pioneered open source development of software, protocols, and content”
ARPANET only connected a limited number of computers, and other similar small networks sprang up alongside it. In 1973 Robert Kahn and Vint Cerf set out to find a way to connect all of them, calling the result the “internetwork”, and later the “internet”.
They developed common Protocols that every computer could use to seamlessly plug in to the network. A shared template for addressing packets - the Internet Protocol (IP) - specified what each packet header should look like and how it should travel through the network.
The next layer up was TCP - the Transmission Control Protocol - which instructed the system to put packets back together in the right order and checked for missing packets.
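TCP's job at the receiving end can be sketched as follows. The function name and packet format are invented for illustration - this mimics TCP's reordering and gap detection, not its actual header layout or retransmission machinery:

```python
def tcp_receive(packets: list[dict], total: int) -> str:
    """Reassemble packets by sequence number, flagging any gaps.
    Real TCP would ask the sender to retransmit the missing packets
    rather than raise an error."""
    received = {p["seq"]: p["data"] for p in packets}
    missing = [i for i in range(total) if i not in received]
    if missing:
        raise ValueError(f"missing packets: {missing}")
    return "".join(received[i] for i in range(total))

# Packets arrive out of order but are stitched back in sequence:
arrived = [
    {"seq": 2, "data": "ld"},
    {"seq": 0, "data": "Lo and "},
    {"seq": 1, "data": "beho"},
]
print(tcp_receive(arrived, total=3))  # → Lo and behold
```

The layering is the point: IP only worries about getting individual packets to the right address, while TCP sits above it and worries about order and completeness.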