Monday, August 18, 2025

Hyperlinks vs. the web

In the previous post on the 1968 Mother of all Demos by Doug Engelbart and company, I mentioned stumbling on the fact that Andries van Dam and Ted Nelson had put together HES, which is generally regarded as the first hypertext system, the year before.

All the parts of that were familiar. I've written extensively, though not always favorably, about Nelson's later project, Xanadu. Foley, van Dam et al.'s Computer Graphics: Principles and Practice was on the shelf at the first place I coded for money, and I'm pretty sure I've run across the idea of hypertext at some point over the years. What I hadn't realized was that Nelson and van Dam had worked together, and that hypertext went back quite that far.

To be fair, I wasn't exactly shocked that hypertext dated to 1967, particularly in the context of Engelbart & co.'s demo, which included what were recognizably hyperlinks. What did catch my attention was that all the building blocks of Web 1.0 were in place in 1968, twenty-three years before the first actual web servers appeared in 1991:

  • the concept of hyperlinks and hypertext
  • the realization of that concept in running code
  • connectivity between computers in different physical locations (ARPANET itself would come along a bit later, but computers were already talking to each other)
  • interactive graphic displays
  • the mouse

The bullet point I didn't quite add is file servers, but FTP dates to 1971. I haven't dug up solid evidence that there was such a thing as a file server in 1968, but I certainly wouldn't bet against it. If you prefer to put the "Web 1.0 could just as well have happened" point at 1971 and say it was only twenty years later that it actually happened, fine.

What took so long? According to van Dam's account, the people who saw the Mother of all Demos were generally impressed, but the overall reaction was to say "Wow, that was some demo" and then get back to work. Was this a failure of the imagination, that the computer researchers of the time couldn't wrap their heads around something that wasn't 80-column punched cards, even if they'd seen it with their own eyes?

Possibly, but there were also more mundane concerns. That list of pieces in place comes with a few disclaimers:
  • Connectivity was generally at 1200 baud, or roughly one millionth of a gigabit per second. That will deliver text faster than you can read it, but it amounts to a megabyte every couple of hours. You can fit quite a bit of information into a megabyte, and 1.5 Mbit/s T1 lines were available (for a hefty charge, generally to institutions or large corporations), but you weren't going to run YouTube or Netflix on the bandwidth available at the time.
  • Interactive graphic displays were a thing, but they were normally vector-based (think Asteroids, if you've heard of that), and in any case they were expensive, specialized equipment. Bitmapped graphic displays didn't become commonplace until the 1980s, and even then they weren't cheap and they looked absolutely primitive by today's standards.
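The 1200-baud numbers above can be checked with some back-of-the-envelope arithmetic. The framing details below (one bit per symbol, 8N1 serial framing, so 10 line bits per character) are assumptions typical of modems of the era, not figures from the post:

```python
# Back-of-the-envelope check of the 1200-baud figures.
# Assumptions: 1 bit per symbol, 8N1 framing (start bit + 8 data bits
# + stop bit = 10 line bits per 8-bit character).

BAUD = 1200            # symbols (here: bits) per second
BITS_PER_CHAR = 10     # line bits needed per character

chars_per_sec = BAUD / BITS_PER_CHAR       # 120 characters/s
gigabit_fraction = BAUD / 1e9              # fraction of 1 Gbit/s

megabyte_secs = 1_000_000 / chars_per_sec  # seconds to move a megabyte

print(f"{chars_per_sec:.0f} characters/s")
print(f"{gigabit_fraction:.1e} of a gigabit per second")
print(f"one megabyte in ~{megabyte_secs / 3600:.1f} hours")
```

At 120 characters per second you comfortably outrun a human reader, but a megabyte still takes a bit over two hours.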

I think it's fair to say that in 1968 you could have put something together that looked quite a bit like Web 1.0, but it would have been a curiosity: slow, expensive and without anywhere near enough content to make for a compelling experience.

The first actual web servers were intended as an easier way to get at the content that had accumulated on various FTP/Gopher/UUCP/... servers over the years; by then it was getting hard just to know where something you were interested in was located. Not long after that (AltaVista came along in 1995), there were enough index pages that it was hard to keep track of where a good index for what you were looking for was, much less the data itself, and web search was born.

So it wasn't simply a matter of the world turning its back on the wonderful potential of Engelbart's NLS and Nelson and van Dam's HES and then suddenly waking up in 1991 to realize what it should have known all along. Things were happening in the meantime:
  • Computing power, storage and bandwidth were increasing exponentially (as in, actually exponentially, at a more-or-less constant proportion per unit time, and not just "by a lot")
  • As both a driving cause and an effect, the number of people with access to computing power, and the amount of data they wanted stored, also increased exponentially
  • People continued to experiment with ways of organizing information and navigating complex webs of connections (1987's HyperCard comes to mind, but it's not the only example)
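"Actually exponentially" is worth making concrete: a constant proportional growth rate compounds dramatically over the gap between the demo and the web. The two-year doubling time below is an illustrative Moore's-law-style assumption, not a figure from the post:

```python
# What constant-proportion-per-unit-time growth adds up to.
# DOUBLING_TIME_YEARS is an illustrative assumption (Moore's-law-style),
# not a measured figure.

DOUBLING_TIME_YEARS = 2.0

def growth_factor(years, doubling_time=DOUBLING_TIME_YEARS):
    """Multiplicative growth after `years` at a fixed doubling time."""
    return 2 ** (years / doubling_time)

# From the 1968 demo to the first web servers in 1991:
factor = growth_factor(1991 - 1968)
print(f"~{factor:,.0f}x in 23 years")
```

Under that assumption, the hardware available in 1991 was on the order of a few thousand times more capable than what the Mother of all Demos ran on, which goes a long way toward explaining why the same idea landed so differently.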

I think this is a common thread in technology in general: things take a while to develop. It's not enough just to have a good concept, or even a working realization of it. You generally need a certain amount of infrastructure, a growing need that your new concept will meet, and some partial successes along the way.

Some other examples that come to mind:

  • Sketchpad, which anticipated several major developments, particularly the graphical user interface, was written in 1963, but GUIs weren't really widespread until the 1980s
  • The object-oriented language Simula came out in 1962 and Smalltalk in 1972. Objective-C was introduced in 1984, but OO languages didn't really get major traction until the mid-1990s
  • The concept of a neural network dates back at least to 1943. The perceptron was introduced in 1958, and Minsky and Papert's Perceptrons critique appeared in 1969. Hopfield networks were a topic of research in the 1980s. Transformers were proposed in 2017, now eight years ago, and ChatGPT came out five years later.

It shouldn't be surprising that concepts run ahead of implementation. Concepts are a lot easier to come up with. More interesting is how we get from (some) concepts to widespread implementation, and which concepts get there.
