This is one of those posts that started as one thing, trying to make some sort of Larger Point, but ended up as ... something. It started out on the long-running theme of not-so-disruptive technology, then devolved into a technical exploration as I tried to back that point up, and then went a somewhat different direction because of what I actually found when I went researching, before sorta circling back to the general vicinity of the original theme and pulling together some threads from some of the first posts on this blog from, oh, a minute or two ago. Rather than try to polish all this up into some sort of coherent essay, I've decided to leave it pretty much as written. Perhaps as some sort of compensation, I've included a lot more links than I usually do.
Looking back I see that in 2024, I've already doubled my output from 2023 (by a score of two posts to one), so maybe I should quit while I'm ahead. But I had an idea for a post, and after re-reading back to July of 2020 (that is, seven posts), I'm pretty sure I haven't explored this particular point before, at least not recently. Or rather, I have, given that the not-so-disruptive technology tag is in second place behind annoyances, but if I've stepped back and surveyed it from a broader point of view, it hasn't been in the last four years.
(I also notice that the link to Intermittent Conjecture is for a four-year-old post, probably because that particular feature is no longer particularly supported, because of course it's not. Grandpa, what's a "blogroll"?)
I considered editing that last bit of snark out, especially since annoyances is already well represented, but I think that it's probably in line with the rest of this post, though maybe in a roundabout way.
It's almost an axiom that newly-developed technology will Change the World. I say "almost" because technically an axiom is a statement that you assume to be true because it's essential to the rest of your logical framework, but you don't have any other way to prove it to be true, so you have to just assume it. I'm thinking of mathematical axioms like "a thing is equal to itself" or, more esoterically, "if you have a collection of sets, you can form a new set by choosing one element from each" (it took quite a bit of work to figure out that you can't prove that from other axioms like "two sets are equal if you can match up their elements one-to-one in both directions").
"New technology changes everything" is a statement that people often assume to be true, and it's essential to at least some people's logical frameworks, but I wouldn't call it an axiom because you can actually look at any given new technology and, I claim, come to a reasonable conclusion as to whether it changed everything. And then, maybe, as a followup question, by how much?
To take a couple of easy, well-known examples, it's not hard to argue that, say, agriculture changed everything, or antibiotics changed everything. Except ... depending on what you call "agriculture", you could argue that agriculture was around for thousands of years before cities like Shuruppak or Dholavira arose. On a smaller timescale, the first modern antibiotic was extracted from mold growing on a bacterial culture in 1928, but it wasn't available in useful quantities until the early 1940s.
It's not the discovery of a technology that makes the difference. There wasn't even any one event that you could call "the discovery of agriculture." There was an event that could be called "the discovery of (modern) antibiotics (that were known to work by killing microbes)", but that in itself didn't change anybody's life greatly.
The point here is that simple statements like "agriculture/antibiotics changed everything" turn a bit mushy after even a little prodding. More accurate versions might be "over the millennia, developments in agriculture have had a significant impact on human population and living patterns" or "the development, mass manufacture and widespread deployment of several types of antibiotics in the latter half of the 1900s had a significant impact on human health outcomes."
Clearly there have been significant changes in how people live, and clearly developments in agriculture and medicine, including the development of antibiotics, have played a significant role in that, but it's not a simple matter of "agriculture happened" or "antibiotics happened" followed by "everything changed". The actual stories are full of false starts, backtracks, accidental discoveries, social upheavals, twists of fate and all sorts of other seemingly extraneous factors. Which is the interesting part.
What got me started on all this was thinking about how the web has changed communication, and in particular telecommunication. Except, as soon as I wrote that, I realized that it's more a matter of the internet changing communication, since I've already argued that it's the web of links that makes the web webby, and I'll just claim here that this webbiness hasn't had a large impact on how we communicate with each other.
We could just as well have Skype and Zoom without the web. For that matter, to a large extent each social media platform is its own web, and not "the" web. But that way lies yet another round of fretting over what exactly am I blogging about here ... For now, let's file communication technology under "the web at large" or something and get on with it.
For most of human existence, the only way to communicate detailed information over a long distance was by people moving around. Travelers would bring stories and knowledge and trade items with them and information would diffuse across large areas, but if that traveler wanted to send a specific message to someone they'd met years ago while traveling someplace far from their current location, well, good luck with that. It may not have been impossible, but it couldn't have been commonplace.
Several thousand years ago, digital communication came along and changed this. With writing came the option of moving a written message with the sender's exact words (there wasn't any single "invention of writing", either, but let's just roll with it). Messages could be sealed so that their contents couldn't be easily changed, signed so that you could tell who they came from, and even encrypted so that only the intended reader could read them, or at least that was the idea.
Digital telegraph systems, also dating back thousands of years, could transmit text from point A to point B without even needing a person to carry it. The Greek phryctoria, a system of towers on mountaintops with torches, is a good example but not the only one.
Two key measures of telecommunication are bandwidth, which is how many bits can be transmitted in a given amount of time, and latency, which is how long it takes to transmit any particular bit from sender to receiver. As usual, the actual definitions are more subtle, particularly for bandwidth, but these will do here. If you're feeling technical, feel free to read bandwidth as bitrate.
For example, if it takes three seconds to switch the torches in a telegraph tower around to show a new letter, and there are 24 possible letters, then the bandwidth is about 4.6/3 bits per second, or about 1.5 bps. The latency from one tower to the next, around 30km away, is negligible (about 0.1 milliseconds).
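That arithmetic, spelled out (the 24-letter alphabet and the three-second switching time are the guesses from above, not historical measurements):

```python
import math

LETTERS = 24            # letters in the alphabet shown on the towers (a guess)
SECONDS_PER_LETTER = 3  # guessed time to rearrange the torches

bits_per_letter = math.log2(LETTERS)   # ~4.6 bits of information per letter
bandwidth_bps = bits_per_letter / SECONDS_PER_LETTER

print(f"{bits_per_letter:.2f} bits/letter -> {bandwidth_bps:.2f} bps")
```

The log base 2 is just "how many yes/no questions does it take to pin down one letter out of 24."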
If the message is supposed to be relayed to the next tower in a series of towers, it will take some amount of time for someone to read the arrangement of torches in the sending tower and put the same torches up so the next tower can see them. Let's say there are two people in the tower, one reading and one putting up torches, and it takes an extra second for the reader to read and announce the next letter, on top of three seconds to arrange the torches. Latency is then four seconds per tower. That is, if the first tower is sending a message and the second is relaying it to the third, the third tower is getting the message four seconds after it is sent. A fourth tower would be eight seconds behind, and so forth.
Suppose I want to send a message to someone ten towers away. Latency is still pretty good, relatively speaking. The last tower will be 36 seconds behind the sender (nine relays for ten towers). If that receiver sends a reply, I can get it just over a minute after sending my message (in more technical terms, round-trip latency is on the order of a minute). While this is glacial by today's standards, it's outstanding in comparison to a multi-day journey to get from where I am to where the receiver is, and I don't have to worry about someone waylaying my messenger along the way (or my messenger deciding they have better things to do with their time).
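The relay math as a tiny function, using the guessed one second to read plus three seconds to shuffle torches; the sender's own tower displays for free, and every tower after it, including the receiver's, has to read and re-display:

```python
SECONDS_TO_READ = 1      # reader announces the letter
SECONDS_TO_ARRANGE = 3   # torch-shuffler puts up the new letter
SECONDS_PER_RELAY = SECONDS_TO_READ + SECONDS_TO_ARRANGE

def one_way_latency(towers: int) -> int:
    """Seconds until the last tower is displaying what the first tower showed."""
    return (towers - 1) * SECONDS_PER_RELAY

print(one_way_latency(10), "seconds one way,",
      2 * one_way_latency(10), "seconds round trip")
```

Ten towers gives 36 seconds one way and 72 seconds round trip — the "just over a minute" above.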
Bandwidth, though, is not so great. If I'm sending a short message like "Prepare for attack from the north," that's not a problem. Transmitting that message will take a couple of minutes and my receiver will have the whole thing half a minute after I finish sending it. But suppose I'm sending a trade agreement proposal that amounts to 12,000 bits -- still tiny by today's standards. That will take a couple of hours, which is still doable, though not a lot of fun for anyone involved.
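The same torch-telegraph numbers applied to both messages (the ~1.5 bps rate and ~4.6 bits per letter are carried over from the estimate above; counting the short message as 33 characters including spaces is my own tally):

```python
BANDWIDTH_BPS = 1.53    # torch-telegraph estimate from earlier
BITS_PER_LETTER = 4.58  # log2(24), rounded

def transmit_seconds(bits: float) -> float:
    return bits / BANDWIDTH_BPS

short = 33 * BITS_PER_LETTER  # "Prepare for attack from the north"
proposal = 12_000             # the trade agreement, in bits

print(f"short message: {transmit_seconds(short) / 60:.1f} minutes")
print(f"trade proposal: {transmit_seconds(proposal) / 3600:.1f} hours")
```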
But the people on the other end will want to respond with their own counterproposals, and so on. Pretty soon we're into days, and spare a thought for the twenty people up in the towers shuffling torches around and looking out for torches at other towers through the night (I'm going to go out on a limb and say this system works better at night).
Probably better to send a trusted emissary with the text of my proposal and maybe some other written instructions. And while they're at it, they could carry messages from other people in my area to people in the receiver's area, or anywhere along the way, and we have ourselves the beginnings of a postal system. The latency of a postal system is measured in days, but the bandwidth is essentially limited only by how fast people can actually write and read and how many people are sending and receiving messages -- you can fit a lot of sheets of paper onto a horsecart. Not to mention that you can also send drawings and diagrams easily on a sheet of paper.
This may seem like a lot of speculative detail about ancient systems of communication, and it probably is, but it covers the bulk of human history (the written-down part, as opposed to prehistory, which is most of human existence). From ancient times until the late 1800s, long-distance communication was mainly a matter of moving physical texts around, with limited use of alternatives that were much faster (in latency) but also much, much slower (in bandwidth), and quite a bit more expensive. This includes the era of the modern optical telegraph (late 1700s) and electrical telegraph (mid 1800s).
What happens next is interesting. I originally wrote "then came along the telephone," with the idea that it was a major leap to have the bandwidth to carry voice instead of the dots and dashes of morse code. Fortunately, I did a little double-checking and discovered that
- The bandwidth of a telegraph was not that low. A punched-tape system around the time of the telephone's invention could transmit upwards of 400 words per minute. At roughly 12 bits per word, that comes out to about 80 bits per second. That's nothing by modern standards, but it's about 50 times my guess for the phryctoria. Some of that is because Morse code encodes text more efficiently than torches, but most of it is due to the switch to electromagnetic transmission (um, light from torches is also electromagnetic ...).
- The bandwidth of human speech is not that high. In this old post I cited a world record of 10 words per second, or about 120 bits per second, but normal speech is much slower.
In other words, a telephone and a high-speed telegraph are transmitting words at about the same rate, though the telephone has the advantage of carrying tone of voice and not requiring someone to transcribe words onto a paper tape. I suppose this shouldn't be too surprising since both the telephone and telegraph are using the same underlying transmission medium of electromagnetic waves traveling along copper wires or, a little later, over the air.
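Putting those two bullet points side by side, using the rough 12 bits per word from above (the ~150 words per minute for ordinary conversation is my added assumption):

```python
BITS_PER_WORD = 12  # the rough figure used above

telegraph_bps = 400 * BITS_PER_WORD / 60  # punched tape, 400 words/minute
record_speech_bps = 10 * BITS_PER_WORD    # world-record 10 words/second
normal_speech_bps = 2.5 * BITS_PER_WORD   # assuming ~150 words/minute conversation

print(telegraph_bps, record_speech_bps, normal_speech_bps)
```

So the high-speed telegraph sits comfortably between ordinary speech and record-pace speech.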
The same technology could also transmit images. The first facsimile machine (perhaps you've heard of "faxes"?) was developed around the same time as the telephone. Later, in the 1920s, a number of inventors on a number of continents (including Leon Theremin, better known for the musical instrument) developed various systems for transmitting moving images. Early television station WRGB ("RGB" can't be a coincidence, can it?) transmitted 40-line images at 20 frames per second. Let's guess that a 40-line image equates to 1600 8-bit pixels. That comes out to about 260 thousand bits per second (260kbps).
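The guess, spelled out (the 40x40 pixel grid and 8 bits per pixel are the guesses from the text, not historical specs — early TV was nowhere near this tidy):

```python
LINES = 40
PIXELS_PER_LINE = 40  # guessing a square-ish image
BITS_PER_PIXEL = 8    # the guess from the text
FPS = 20

bits_per_frame = LINES * PIXELS_PER_LINE * BITS_PER_PIXEL
bandwidth_bps = bits_per_frame * FPS

print(f"{bandwidth_bps / 1000:.0f} kbps")  # 256 kbps, call it 260
```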
This is already a remarkable increase in bandwidth, from a hundred or so bits per second in the mid 1800s to hundreds of thousands in the early 1900s. By the dawn of the internet, let's say 1974 -- fifty years ago -- when the proposal for TCP was published, a leased telephone line could carry around 50 kbps (56 kbps as I recall and Wikipedia seems to confirm). That was the basic unit -- it was entirely possible, and typical, to lease more than one. By the mid 1980s, NSFNET was using 1.5 Mbps T1 lines. Later came T3 lines at 45 Mbps (so a T3 is worth 30 T1, go figure), and today we're talking gigabits or more.
This is all a matter of how bandwidth is sold. The actual transmission cables are much heftier. Fiber optic cables can carry petabits per second (Pbps). A peta is a million gigas; that is, a petabit per second is a quadrillion bits per second, or about 125 thousand bits per second for every person on the planet. Commercially available cables are somewhat smaller, but not much, measured in hundreds of terabits, that is, hundreds of trillions of bits per second.
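The per-person arithmetic, using a rough 8 billion for world population (an assumption, but close enough for this kind of estimate):

```python
PETABIT_PER_SECOND = 10**15   # bits per second
WORLD_POPULATION = 8 * 10**9  # rough current figure

per_person_bps = PETABIT_PER_SECOND / WORLD_POPULATION
print(f"{per_person_bps:,.0f} bits per second for every person on the planet")
```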
There are still some specialized applications that can give that much bandwidth a workout, but in human terms the amount of bandwidth available is absolutely ridiculous ("available to whom?" is a fair question). Which brings me back to one of the earliest themes on this blog: limits on human bandwidth. That is, how much information can any individual person deal with? I discussed several aspects of this in this post about, oh, seventeen years ago.
In terms of bits per second, our highest use of bandwidth is probably the visual system, which processes somewhere around a gigabit per second considered as raw pixels, but there's a lot of redundancy in there. A good MP4-compressed video stream, which includes audio, is more like 10 Mbps. Since a format like MP4 is tuned to provide only the information we actually process, it's probably a better measure of how much data the visual system is actually processing.
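To see where that roughly 100-to-1 gap comes from, here's a sketch comparing a raw pixel stream to a good compressed one. The 1920x1080, 24-bit, 30 frames-per-second parameters are my stand-in for "raw visual input", not a measured property of the eye:

```python
# Raw pixel rate of a 1080p, 24-bit color, 30 fps stream
raw_bps = 1920 * 1080 * 24 * 30  # ~1.5 Gbps of raw pixels
mp4_bps = 10 * 10**6             # the ~10 Mbps for a good MP4 stream

print(f"raw: {raw_bps / 1e9:.1f} Gbps, compressed: {mp4_bps / 1e6:.0f} Mbps, "
      f"ratio roughly {raw_bps // mp4_bps}:1")
```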
There's a lot we don't know about our other sensory input -- touch, smell, proprioception and whatever else -- but it's clearly operating at a much lower bandwidth (for example, a walking robot does not need a fiber optic cable to tell the CPU how far its knee is bent or how much pressure its foot is exerting).
In other words, there are many, many ordinary houses with much more than enough bandwidth to saturate the sensory input of all the humans in them, if said sensory inputs could all be magically connected to a stream of bits. In practice, it means that there's enough bandwidth for everyone in the place to spend all their time watching video.
But -- and maybe this really is leading to some sort of point about technology changing everything -- that's been true for quite a while, at least since the advent of 24-hour cable TV, which is to say, also about 50 years ago, which I've just called the dawn of the internet. I don't think this is at all a coincidence. Let's try to boil all the stuff about bandwidth down to a few bullet points:
- For most of human existence, long-distance, low-latency bandwidth was zero -- there was no way to get a specific message across a long distance quickly. You could interact with someone directly at short distance with high bandwidth and low latency, but that was about it.
- For most of human history, long-distance, low-latency bandwidth has been very low. In some times and places it was possible to quickly transmit a short message over a long distance, but even then, latency was measured in minutes and bandwidth in single-digit bits per second.
- Starting in the 1800s, electromagnetic transmission led to huge increases in low-latency, long-distance bandwidth, from single-digit bits per second to current rates, which are enough to enable video calls between any two internet-connected points.
- In the mid to late 1900s, bandwidth was high enough and cheap enough to enable two innovations:
  - Cable TV carrying over a hundred channels 24/7
  - Wide-area digital networking
Of the two, digital networking was by far the slower. Early networks mainly transmitted text, whether in human or computer languages. If you had a terminal at home, you could typically connect to your local network at speeds of 110 to 2400 baud (in general a different unit from bits per second, but in this case the same), and hope that you'd remembered to turn off call waiting on your landline. Then, after a long day of hacking, you could flip on the TV and watch at something like a megabit (resolution was lower in those days).
Even backbone connections were very slow by today's standards. This doesn't seem like a technical limitation, since ordinary coax cable could handle megabits, but more a matter of there not being that much digital information to send. If I wanted to talk to a colleague on the other side of the country, I wouldn't have tried to set up a call over the internet at the time. I would just pick up the phone.
The digital convergence that happened gradually over the next couple of decades consisted largely of building up the internet backbone, which was based on telephone and cable technology (mostly telephone, I believe), to the point where it could carry digital information at a rate comparable to the analog technologies that had been around since the beginning of the whole exercise.
Technically, this was revolutionary. For most intents and purposes, anything that was analog in the mid 1900s, particularly television, telephone and radio, is now carried digitally on the same network infrastructure that you can use to send purely digital information like ... text and emails? Source code?
This is a kind of interesting way to look at it. Hiding inside the massive digital network that delivers sound and video to us is a tiny replica of the original internet, albeit expanded from a few thousand researchers to a significant slice of the world's population. Billions are bigger than thousands, of course, a million times bigger, in fact, but overall digital bandwidth has increased by much more than a factor of a million.
(The early internet wasn't just used for email and source or object code. It was also used to transmit scientific data. Some datasets can be quite large, particularly in astronomy and particle physics, large enough to saturate even the modern backbone. But in such cases data is generally transmitted by putting it on physical media, which is then shipped. The postal service still wins on bandwidth. And yes, I am proudly using both data and media as mass nouns here.)
I think what I'm trying to sort out here is that the digital convergence can be looked at two ways. The original vision was to bring the intelligence of the internet to existing audio and video media. A TV cable brings a fixed set of channels into your house and very little back out. An analog phone circuit delivers voice traffic from point A to point B. A digital network can carry information from any number of senders to any number of receivers and do any kind of processing along the way.
On the other hand, technically, the digital convergence was a shift from sending analog data over analog lines (or over the air) to sending the same data over the same lines, or at least the same types of lines plus the cell network (also fundamentally analog), but encoded digitally, then re-encoded into analog signals and likewise decoded and re-decoded on the other end.
Why do that?
The wilder speculations of the 1990s haven't really panned out. A phone call is still a phone call. True, most of the time it's easier just to text, but texting needs much less bandwidth than calling. It certainly does not require a huge buildout of digital bandwidth. All the texts you send in a year would probably amount to a few seconds of audio.
TV shows are still TV shows and movies are still movies. Exciting new possibilities like interactive choose-your-own-adventure TV are an occasional novelty. Live streams allow viewers to interact with the presenter/performer, but so did call-in TV shows.
The difference is control. Outside the occasional news program or sporting event, I'm not sure I can remember the last time I watched something at the same time it was broadcast, if it was ever broadcast at all. I haven't bought an album in years, even in digital form. I stream what I want to watch or listen to, and I'm hardly a bleeding-edge early adopter. If I want to participate in a livestream, I can choose that. More importantly, if a creator wants to put on a live stream, they can easily do that. If I want to set up a video call with some people at work (or not at work), that's easy, too.
Some of these might be possible with the old technology. I could imagine a high-bandwidth phone service that would allow you to call a special number to connect to a video server and pick out what to watch on your video-enabled phone terminal, but putting everything on a digital network that handles data as bits regardless of its content or where it's going has made all of this much easier.
This is all sliced finely enough that individual people can decide which individual people to communicate with, from friend group to celebrity influencers to major organizations and whatever else. I'm personally not sure how much the behavior that this has enabled is new and how much is stuff that people were doing anyway. I explored that theme fairly early on, here, here and here for example, but I don't really do much with social media, even if you count blogging and the occasional visit to LinkedIn.
I think "Digital communication has changed everything" is true in about the same way as "Agriculture has changed everything". On the one hand, it has to be true. Being able to communicate instantly with any of billions of people has to be different from only being able to communicate instantly with the people around you. Being able to transmit high-resolution video across the world with negligible delay has to be different from being able to send a letter across a continent in days or weeks.
Being able to stream from a wide collection of audio and video is certainly different from having to buy or borrow books, records/CDs and videotapes/DVDs, and since that shift has happened well within living memory, it can certainly seem like things are changing rapidly.
But on the other hand, digital technology, including digital telecommunication, has been around for thousands of years. Analog telecommunication has been around for about a century and a half. What we might call the digital revolution is a change in how we transmit and access information, primarily audio and video, that had previously been analog, sitting on top of a huge increase in overall telecommunication bandwidth that began happening over a hundred years ago.
Just as there is no particular beginning of agriculture, there is no particular beginning of digital communication. Even if you could pinpoint the first time a person deliberately planted a seed with the intention of harvesting food later, or the first time a person deliberately made marks to represent words with the intention of someone else reading them later, it wouldn't tell you much. What matters isn't the particular starting point, but the long history of development and use over the millennia.
So far, advances in communication have been about people communicating with people. Machines do communicate with other machines without direct human involvement, but this is mainly in service of people communicating with people. This may change, but that's for another blog.
As far as people communicating with people, the limiting factor is mainly the people themselves. There are only so many conversations one can have and so many people to have them with. The whole point of a video conversation is to make the call as much like talking face to face as possible, that is, to accommodate our limitations in how we communicate. There are now ways of broadcasting a message from one person to millions of people, or even a billion, but even if one person can broadcast a message to a billion people instantly, those billion people will make sense of it in terms of their own lives, their own views and their own desires.
The how of communicating with other people has changed greatly over the millennia, and particularly greatly in recent decades. This in turn has significantly affected whom we can communicate with. But what we talk about, even if we're talking about how quickly things appear to be changing, doesn't really seem to have changed much at all.
One of the earliest themes of this blog was trying to understand what effect the web and the internet would have on how we talk to each other. My instinct has generally been to push back against "It's all different now" narratives, and I think my instinct has largely been borne out (but then, I would think that, wouldn't I?).
And yet, I can't believe that nothing has changed. A lot has changed. Some part of me wishes that, after nearly two decades, I could arrive at some sort of grand summing-up of What The Web Is About and what effect it's had, but after all this time, I'm not sure I have much beyond my original take: "It's not nothing, but I'm not sure what it is, except whatever it is doesn't line up that well with the hype."