The last chapter of ValueWeb talks about what comes after the third-generation internet: the internet of value. This is the internet being built today, based on shared ledgers, cloud, apps, APIs and analytics. Of course, the next generation is the internet of things. But what comes after that? There is an answer, but you need to understand the shape-shifting of technology to really absorb the state of today and tomorrow. Therefore, this week I thought I’d talk about the past, present and future of the internet, beginning with ‘Internet 1.0: Building the Web’.
The origins of the Web start with the beginnings of building computers. I’m not going to linger on that too long, because hopefully you’ve seen The Imitation Game, with Benedict Cumberbatch in the role of Alan Turing, breaking the Enigma code during the Second World War (although Polish cryptographers had broken an earlier version of Enigma almost a decade before). Wars often stimulate progress – just look at how fast aircraft design and development moved during the First and Second World Wars – and modern computing was born of the Second World War.
This was the ENIAC – the Electronic Numerical Integrator and Computer – developed for the US Army to calculate artillery firing tables (it was later used for some of the first numerical weather forecasts). It was delivered in 1946 – as always with large computer projects in their early days, late and over budget – and formed the origins of the first commercial computer company, EMCC.
After building the ENIAC at the University of Pennsylvania, the inventors, J Presper Eckert and John Mauchly, formed EMCC to build new computer designs for commercial and military applications. The company was initially called the Electronic Control Company, changing its name to Eckert-Mauchly Computer Corporation by the time it launched.
Eventually, their firm offered the Univac – the UNIVersal Automatic Computer – a machine family that Nasa used in its ground operations during the 1960s moon programme. Bearing in mind Moore’s Law – transistor counts double roughly every two years, so computing power grows exponentially while its cost falls – those systems were pretty basic. In fact, you have more compute power in your Apple Watch today than was carried on the Apollo moon shots, which is why we’re now talking about colonising Mars as a real possibility.
The personal computer
It was during this period that computer power in private companies began to take off, with a spray of other firms entering the fray. IBM became the biggest of them – having purchased the magnetic-core memory patents of an incredible inventor, Dr An Wang, who went on to build Wang Laboratories with the money IBM paid him – and became the safe choice to buy from. By the 1980s, the saying was that nobody ever got fired for buying IBM, and the result was that lots of its competitors – DEC, Wang, ICL, Burroughs and Univac – all went by the wayside. Interesting for a firm whose president supposedly dismissed computing as limited to a worldwide market of just five systems (though it’s doubtful Thomas Watson ever actually said this).
However, IBM did dismiss another rising technology as irrelevant – the personal computer (PC) operating system – even though its PC would come to define that space. A little-known fact is that it was Bill Gates’ mother, Mary, who helped make Microsoft what it is today. Mary Gates was one of the first women to serve as a director of a bank – First Interstate Bank of Washington – and was later appointed to the board of the United Way of America, becoming the first woman to chair it in 1983.
Her tenure on the national board’s executive committee helped her son’s company at a crucial time. In 1980, she discussed her son’s company with John Opel, a fellow United Way of America committee member and the president of IBM, telling him that her son’s firm might be able to help with the new business IBM was developing. A few weeks later, IBM took a chance and hired Microsoft to develop an operating system for its first personal computer.
The success of the IBM PC gave Microsoft a lift that made it the world’s largest software company. Funny how things turn out.
Anyway, another giant of technology at the time – Ken Olsen, founder of Digital Equipment Corporation (DEC) – dismissed the PC, claiming that “there is no reason anyone would want a computer in their home”. This is the same man who dismissed Unix as “snake oil”, even as he ran one of the largest computer companies of the 1980s (Olsen was forced out of DEC in 1992, and the firm was acquired by Compaq in 1998). No wonder his firm went by the wayside, as mentioned earlier.
In the meantime, other technologies were developing around the PC, including the Transmission Control Protocol (TCP), the Internet Protocol (IP) and modulator-demodulators, otherwise known as modems. Several leading figures were key to developing the modern internet: Vint Cerf and Bob Kahn, who designed TCP/IP; Ivan Sutherland and Robert Taylor, whose work at ARPA drove Arpanet, the Advanced Research Projects Agency Network; and Kevin Kelly, whose work on The Well fed into the founding of Wired magazine. But the standout figure has to be Sir Tim Berners-Lee (yes, I’m British and biased).
Tim is viewed by many as the founding father of the modern Web, due to his development of the foundations that we use today: HTML, URIs and HTTP.
- HTML: HyperText Markup Language. The markup (formatting) language for the Web.
- URI: Uniform Resource Identifier. A unique ‘address’ used to identify each resource on the Web. It’s also commonly called a URL (Uniform Resource Locator).
- HTTP: Hypertext Transfer Protocol. Allows for the retrieval of linked resources from across the Web.
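To make those three building blocks concrete, here’s a minimal sketch in Python (my illustration, not from the original article; the addresses are placeholder values): it splits a URI into its parts, shows what an HTTP request for that resource looks like on the wire, and the kind of HTML document that comes back.

```python
from urllib.parse import urlparse

# A URI/URL is the unique address of a resource on the Web.
uri = "http://example.com/index.html"  # placeholder address
parts = urlparse(uri)
print(parts.scheme)  # 'http'        -> the protocol used to fetch it
print(parts.netloc)  # 'example.com' -> the host that serves it
print(parts.path)    # '/index.html' -> the resource on that host

# HTTP is the request/response protocol spoken to that host.
# A minimal GET request for the resource looks like this on the wire:
request = f"GET {parts.path} HTTP/1.1\r\nHost: {parts.netloc}\r\n\r\n"

# HTML is the markup language of the document that comes back,
# with hyperlinks (more URIs) pointing at further resources:
html = '<html><body><a href="http://example.com/next.html">a link</a></body></html>'
```

Notice how the three depend on each other: the URI names the resource, HTTP fetches it, and the HTML it returns contains more URIs – which is what makes it a web.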
Tim proposed these in a formal proposal in late 1990 at Cern, the European particle physics laboratory in Geneva, where he had first worked in 1980 before returning in 1984. That 1990 paper was an extension of what many consider the founding document of today’s Web: ‘Information Management: A Proposal’, presented to Cern in March 1989. Believe it or not, Tim’s initial proposal wasn’t immediately accepted. In fact, his boss at the time, Mike Sendall, wrote the words “vague but exciting” on the cover. The Web was never an official Cern project, but Mike managed to give Tim time to work on it, and that led to the breakthrough in 1990.
So the first generation of the modern internet was born in 1990, some 45 years after the birth of computing. Since then, each generation of the internet has lasted about ten years. In the 2000s, we saw Web 2.0; now we’re developing Web 3.0, the internet of value. Soon we will be entering the era of the internet of things, Web 4.0, and then in the 2030s we will be immersed in the Semantic Web, Web 5.0.
These are the five areas I’ll investigate in more depth over the course of this week, starting with Web 1.0: the first-generation internet.
READ NEXT: The Internet of Things, explained
– This article is reproduced with kind permission. Some minor changes have been made to reflect BankNXT style considerations. Read more here. Main image: ramcreations, Shutterstock.com