Wednesday, December 28, 2022

Goblin Mode McGoblin Modeface

Each year, Oxford Languages, which produces the Oxford English Dictionary among other things, selects a word of the year, "a word or expression reflecting the ethos, mood, or preoccupations of the past twelve months, one that has potential as a term of lasting cultural significance."  This year, the choice was opened up to online voting.  Over 300,000 people cast their votes over the course of two weeks and the winner was goblin mode:

'Goblin mode' – a slang term, often used in the expressions ‘in goblin mode’ or ‘to go goblin mode’ – is ‘a type of behaviour which is unapologetically self-indulgent, lazy, slovenly, or greedy, typically in a way that rejects social norms or expectations.’

The runners-up were metaverse and #IStandWith.

The press I've seen about this tends to emphasize the online voting aspect of the selection, with the suggestion that those rowdy internet folks got one over on the stodgy old OED, but I think that misses a couple of important points.

First, the OED as an institution isn't particularly stodgy.  While Oxford might suggest the British power structure or Tom Lehrer's indelible image of "ivy-covered professors in ivy-covered halls" (leaving aside that that's more a US reference), the dictionary itself has historically been concerned with documenting how people actually use the English language, rather than trying to dictate what "proper English usage" might be.  It is descriptive rather than prescriptive.

The dictionary styles itself "the definitive record of the English language".  This is meant to cover everything: dialects from all over the world, terms of art for all kinds of trades and professions, archaic words from a thousand years ago, and all manner of other English usage, including today's internet slang.

From the OED's point of view, goblin mode is a perfectly good term to research and define, as is anything else that people actually use.   If a bunch of internet trolls had decided to vote for glurglebyte or some other made-up word, and the OED actually went with it, that would have been a different matter, but there are plenty of examples of people using goblin mode prior to the online vote.   The word of the year page even gives a couple of examples from The Grauniad and The Times.

One might argue that people weren't using goblin mode all that much, and some other term, whether metaverse, #IStandWith or something else, might have made a better word of the year, but the fact that hundreds of thousands of people voted for it suggests that, even if the votes were meant ironically, there's something there that led people to coalesce around that particular word.  You could even argue that the online vote gives an otherwise ordinary bit of internet slang a much better chance of becoming "a term of lasting cultural significance".

The word of the year page goes further and argues that goblin mode is indeed a good word for a year in which people are finding their way out of a world of lockdowns and overflowing hospitals and questioning just which pre-pandemic norms are really worth keeping.   Sure, the Oxford folks may just be trying to put a brave face on being pwned, but to me it seems more like they saw the results and weren't particularly bothered.

I think there's another important point to note here.  While there have been plenty of examples of internet-driven crowds doing bad things, or even horrible things, it's worth remembering that this particular crowd of net.denizens was operating from a completely different mindset: As with Boaty McBoatface, they did it because it was fun, for sheer hack value.

While it would be a mistake to ignore bad behavior, it would also be a mistake to over-focus on it.  Like anywhere else, bad things can happen on the web without making the whole place a cesspit.  There's lots of questionable content out there and a certain amount of outright lies and hate, but there's also a lot of good information and not a little outright goofiness.  However much there are people out there trying to steer us toward conflict and anger, we still have choices about what we browse and what we post.

A few hundred thousand people upvoting a random bit of slang may be a drop in the bucket, but there are a lot more drops like it.  That says something about who's out there and what they want, just as surely as the nastiness elsewhere does.

Friday, November 25, 2022

Is it the end of the Web as we know it?

Or maybe a better question is "What is this Web we speak of, anyway?"  My default answer: dunno, I'm figuring it out as I go along.

I think the last time I mulled that second question over, in the context of "Web 2.0" (Remember Web 2.0? I think it was one of the exits on the Information Superhighway), my opinion was that the big division was between everything that came before and "the Web", or "Web 1.0" as I don't recall anyone calling it very much.  In other words, that first time someone chased a link from one web page to another using a graphical browser was an epochal event, even if hardly anyone noticed at the time, and what's come after has been a steady stream of technical improvements and services founded on that base.

Two types of service in particular have been prominent over the last decade or so: social media and cryptocurrencies, and both seem to be in questionable shape at the moment.  I've cast a somewhat skeptical eye on both over the years, but that hasn't stopped them from intersecting with the lives of billions of people.

Billions in the case of social media, at least.  I don't actually know how many people own cryptocurrencies, directly or indirectly, but who among us hasn't seen an ad for one or another, or read about the latest crash or rug pull?  Add in the millions of people living in countries that have made cryptocurrencies a significant part of their monetary system, and I'd say billions there, too, depending on how you count.

But the past year has not been particularly kind to either.  This is all over the news at the moment, but just for later reference, let me list a few items of note:

  • Elon Musk's takeover of Twitter is off to a rocky start.  My guess is that the new ownership will find some way to keep the servers running and reach some sort of new equilibrium, but with a sizable majority of the workforce either forcibly terminated or choosing "take the severance and get on with my life" over hardcore intensity, it's safe to say there will be a period of adjustment.  Major advertisers seem to be sitting on the sidelines in the meantime and, thanks to the billions in debt that came with the leveraged buyout, the burn rate has increased from "we'll be out of cash on hand in a couple of years if nothing changes" to "we'll owe more in interest this year than we have in the bank."
  • Facebook seems to have wandered off into the Metaverse.  This seems to me to be a classic case of optimistic extrapolation run amok.  Virtual reality is interesting technology, and it clearly has, at the very least, good potential for useful applications in areas like design and education.  Getting from there to a world where people spend amounts of time in the virtual world comparable to what they currently spend scrolling through their feeds seems like a stretch.  Personally, I've tried out an Oculus, and there were definitely some cool things on offer, from a deeply moving immersive art piece on refugees to super slow-mo of a couple of guys making showers of sparks that you can walk around in.  But the age of those links should tell you how long ago that was.
  • No less than Ian Bogost, of Cow Clicker fame among many other things, has written an article entitled The age of social media is ending.  It should never have begun.  I'm incorrigibly skeptical about proclamations of the End of an Age, or the beginning of one for that matter, but Bogost makes some good points about the crucial distinction between social networking (good, and computers can be very helpful) and social media (the never-ending pursuit of clicks, shares, followers, content and so forth, not so good in Bogost's estimation).
  • Crypto exchange FTX has imploded, taking SBF (its colorful founder Sam Bankman-Fried) down with it, the latest of many crypto plays that turned out, shockingly, to have been a house of cards all along.
  • Bitcoin, the granddaddy of them all, has fallen from its all-time high of close to $69,000 to, at this writing, around $16,000, down over 75%.  Interestingly, the price of BTC had pretty closely tracked the S&P 500, leveraged about 3:1, until the recent FTX fiasco sent it further down.  What it didn't do was rise as reserve currencies hit a round of inflation, which, as I dimly understand it, was what was supposed to happen.
  • The whole advent of crypto exchanges has only emphasized the disconnect between cryptocurrency in theory -- decentralized, anonymous, free from government interference -- and practice -- centralized by exchanges and mining pools, generally tied to bank accounts in reserve currencies and subject to government regulation from several directions.
Plenty of cold water to be thrown on social media and cryptocurrency enthusiasts, but does this mean the whole thing is coming to an end?

Social media doesn't seem to be going away.  There's even been a rush of activity on Twitter, speculating about the demise of Twitter and what to do next, and if you want to use that as a jumping-off point for a rant about modern culture eating itself, be my guest.

Even if cryptocurrency is dead as an alternative to reserve currencies and more conventional payment systems -- I'm not saying it is or isn't, but even if -- I doubt it's going to stop trading anytime soon.  My personal benchmark for "crypto is dead" would be something on the order of "I can personally mine and take ownership of 1 BTC using my phone at a nominal cost".  We're quite a ways from that, but on the other hand there's still plenty of time left before the mining reward rounds down to zero sometime around the year 2140 at current rates.
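
For the curious, the 2140 figure is easy to sanity-check.  Here's a back-of-the-envelope sketch in Python, assuming the current rules stay put: the block subsidy started at 50 BTC, gets cut in half (with integer rounding, in satoshis) every 210,000 blocks, and blocks arrive roughly every ten minutes on average.

    # Back-of-the-envelope check of the "around 2140" figure, assuming
    # the current rules: the subsidy starts at 50 BTC (5,000,000,000
    # satoshis), halves by integer division every 210,000 blocks, and
    # blocks arrive roughly every 10 minutes on average.
    GENESIS_YEAR = 2009
    BLOCKS_PER_HALVING = 210_000
    MINUTES_PER_BLOCK = 10  # target average, not a guarantee

    subsidy = 50 * 100_000_000  # in satoshis
    halvings = 0
    while subsidy > 0:
        subsidy //= 2           # integer halving, as the protocol does
        halvings += 1

    final_block = halvings * BLOCKS_PER_HALVING
    years = final_block * MINUTES_PER_BLOCK / (60 * 24 * 365.25)
    print(f"Subsidy rounds to zero after {halvings} halvings "
          f"(block {final_block:,}), around the year {GENESIS_YEAR + years:.0f}")

It comes out to 33 halvings and a date in the early 2140s, give or take the usual caveats about block times.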

In short, there are certainly some major disruptions going on in some of the major features of the Web landscape, but, in answer to the question in the title, they seem more like the kind of shakeup or reining in of excess that seems to happen fairly regularly, rather than some sort of deathblow to the Web itself.  Webvan, anyone?


But then, as I asked at the top of the post, what is this Web we speak of, anyway?

Apart from the time constraints of a busy life, I've been less apt to post here, and in fact started a whole other blog (which I also don't post on very frequently), because I had come to the conclusion that a lot of things I wanted to post about weren't really related to the Web.  Even here, one of my more recent posts was me fretting about what the Web even is any more and why I'm not writing about it.

That post, though, mainly talked about what the Web means day to day.  For better or worse, a lot of that has to do with social media, and I have no interest in devoting a large chunk of my time to what's going on in social media.  Plenty of other people do want to do that and do a better job than I would.  But what is it that makes the Web webby, and how does that relate to the Web as it impacts our lives?

If you peel back all the layers, all the way back to that first link chased on that first graphical browser, the Web is about links.  If you've ever meandered from one Wikipedia article to the next, following links in the page or the "see also", you've been using the Web at its webbiest.  Likewise, I think, if you've browsed your favorite magazine and followed the links from one article to the next, within that publication or outside.  The web of interconnections is what makes the Web.

That primordial web is still around and likely isn't going anywhere, because this sort of browsing from one topic to the next is probably pretty tightly wired in to the way our brains work.  What has happened is that a couple of layers have grown on top of it.

One is search.  You can find all sorts of interesting things by browsing, but often you just want to know where to find, say, a replacement battery for your cordless vacuum.  Browsing would be a horrible way to go about that, but you don't have to.  Just type some likely terms into your search bar and there you are.  This is useful enough that companies can make quite a bit of money by running ads on a search platform, and I doubt this business model is going away, whatever the fortunes of the particular companies providing it.

Social media constitutes a different layer on top of the web.  As I've mentioned before, I'm not active on social media, but it seems to me that while you can certainly browse the links of your social network to find people that people you know know, and you can follow links from a post/tweet/story/whatever to more things that you might be interested in, the main innovation in social media is the feed, which brings content to you without your having to search for it or stumble onto it.

This isn't limited to social media.  I spend quite a bit of time reading my news feed, anti-social though that may be.  In any case, I think there is a distinction to be made between information you actively seek out and information that some person you're following, or some algorithm, or some combination of the two, brings to you.  I doubt that this is going anywhere either, but it looks like there is some rethinking going on about how to control the feed of incoming information, and, to some extent, how much attention to pay to it at all.

Interestingly, there was a burst of enthusiasm a while back for social search, where you could ask questions of the crowd, people would dig up answers and get paid, and various companies would take various cuts, one way or another.  I think that fell by the wayside because automated search does a better job in many cases, and when it doesn't, asking someone you know without anyone in the middle generally works fine, or at least no worse than trying to ask a pool of random people.

Also interesting: Nothing in those last few paragraphs involves cryptocurrencies, even though I implied earlier that upheaval in that world might have something to do with "the end of the Web as we know it".  I think that's because, even if stories about cryptocurrency have been all over the web, cryptocurrency itself doesn't have much to do with the Web, because it just isn't webby in that primordial sense.  Following some sort of network of transactions, link to link, is not exactly played up as a major use case.


I've actually found working through this pretty encouraging.  A few posts ago (that is, over a year ago), I was ruminating on whether there was anything webby left that I might want to talk about.  Going back to first principles about what makes the Web the Web immediately revealed a view in which the very basis for the Web is alive and well, and aspects of it that are prominent now, like search and feeds, can at least be understood in relation to it.

Saturday, July 30, 2022

Dear screenwriters: Bits can be copied

There's a new thriller movie out on one of the major streaming services.  I don't think it matters which movie or which service.  If you're reading this years from now, that statement will probably still be true, at least to the extent that there are still streaming services.  If you're pretty sure you know which 2022 movie this is referring to, but haven't seen it yet and want to, be warned.  There are mild spoilers ahead.

As with many such films, the plot revolves around a MacGuffin, a term apparently coined by Angus MacPhail, which Alfred Hitchcock famously glossed as "the thing that the spies are after, but the audience doesn't care."  In other words, it doesn't really matter what the MacGuffin actually is, only that the characters do care who gets it and so spend the whole film trying to make sure it ends up in the right place and doesn't fall into the wrong hands.

The plot device of a MacGuffin is much older than the term itself, of course.  The Holy Grail of Arthurian legend is one, and the oldest recorded story known so far, The Epic of Gilgamesh, sends its protagonist to the Underworld in search of one.

Clearly there's something in the human brain that likes stories about finding a magic item and keeping it away from the baddies, and in that sense the MacGuffin in the big streaming service movie is a perfectly good MacGuffin.  The protagonists and antagonists vie over it, it changes hands a few times, lots of things explode and eventually the MacGuffin is destroyed, ending its magic powers.

Except ...

The MacGuffin in this case is basically a gussied-up thumb drive containing information certain people do not want to become known.  Our protagonist receives the item early in the film (with suitable explosions all around) and promptly sends it off to a trusted colleague for safekeeping and decipherment.  Later we learn that the trusted colleague has, in fact, received the drive and cracked its encryption, revealing the damning information.

In real life, this is when you would make a backup copy.  Or a few.  Maybe hidden in the insignificant bits of JPEGs of cute kittens on fake cloud accounts with several different services.  Maybe on some confederate's anonymous server somewhere on the dark web.  Or at least on a couple more thumb drives.  For bonus points, swap out contents of the original thumb drive for a clip of the Dancing Baby or some similar slice of cheese.
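
If the kitten-picture trick sounds exotic, it isn't.  Here's a toy sketch of the basic idea in Python: least-significant-bit embedding over a buffer of raw pixel bytes.  (Real JPEGs are lossy-compressed, so in practice you'd hide the data in the compressed coefficients or just use a lossless format; the point is only how little machinery it takes to squirrel a copy away.)

    # Toy least-significant-bit steganography: hide a secret in the lowest
    # bit of each "pixel" byte.  Purely illustrative -- the cover data here
    # is a made-up byte buffer standing in for decoded image pixels.

    def embed(pixels, secret):
        bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
        if len(bits) > len(pixels):
            raise ValueError("cover data too small for this secret")
        out = bytearray(pixels)
        for i, bit in enumerate(bits):
            out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
        return out

    def extract(pixels, length):
        secret = bytearray()
        for b in range(length):
            secret.append(sum((pixels[b * 8 + i] & 1) << i for i in range(8)))
        return bytes(secret)

    cover = bytearray(range(256)) * 8        # stand-in for kitten-photo pixels
    message = b"the damning files"
    stego = embed(cover, message)
    assert extract(stego, len(message)) == message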

(As I understand it, there are some encrypted devices that are tamper-resistant and designed not to be readable without some sort of key, so you can't easily copy the encrypted bits and try to crack the encryption offline, but here we're told that the encryption has already been cracked, so they have the plaintext and can copy it at will.)

The problem with that, of course, is that the drive would then cease to be a MacGuffin.  Why send teams of mercenaries and a few truckloads of explosives after something that might, at best, be one copy of the damning information?  The only real reason is that it makes for an entertaining way to spend an hour or two and screenwriters know all about writing MacGuffin-driven thriller plots.

Which is fine, except ...

If you think about the practicalities, there's still plenty of tension to be had even if the bits are copied.  Our protagonist has reason to want the secret information to remain secret except in case of a dire emergency, but they also want to be able to preserve it so that it can be released even if something happens to them.  How to do this?

If you've uploaded the bits to one of the major services, then who gets access to them?  Do you keep the information in a private file, memorize the account password and hope for the best?  What if you're captured and coerced into giving up the password?  On the other hand, if you die without revealing the information, it will just sit there until the account is closed, unless someone can figure out enough to subpoena the major service into handing over access to a bunch of cat pictures hiding the real information.  Which you encrypted, of course, so who has the key?

Maybe you share the encrypted bits with a journalist (or two, or three ...) with an "in case of my death" cover letter saying where to get the encryption key.  But what if they decide to go public with it anyway?  The more journalists, the better the chance one of them will publish if something happens to you, but also the better the chance that one of them will publish anyway.

Maybe you put the encrypted bits someplace public but write the encryption key on a piece of paper and lock it away in a safe deposit box in a Swiss bank.  Now you've traded one MacGuffin for another.  But maybe someone at a different spy agency has a backdoor into your encryption.  The baddies at your own agency are going to keep the contents to themselves, but maybe one of them has a change of heart, or gets double-crossed and decides to go public as revenge, and they need your copy since they no longer have access to the original bits and didn't make their own copy.

And so forth.  The point is that information doesn't really act like a physical object, even if you have a copy in physical form, but even so there are lots of ways to go, each with its own dramatic possibilities depending on the abilities and motivations of the various characters.  Most of these possibilities are pretty well-used themselves.  Plots driven by who has access to what information have been around forever, though some have paid more attention to the current technology than others -- "Did you destroy the negatives?" "Yes, but I didn't realize they'd left another copy of the photographs in a locker at the bus station ..."

Opting for a bit more realism here gives up the possibility of a "destroy the magic item, destroy the magic" plot, but it opens up a host of other ones that could have been just as interesting.  On the other hand, the movie in question doesn't seem to blink at the possibility of a full-on gun battle and massive explosions in the middle of a European capital in broad daylight.  Maybe realism was never the point to begin with, since that seems pretty unlikely.

Oh, wait ...


Thursday, June 2, 2022

Check out this new kitchen hack!

In case that title somehow clickbaited you to this quiet backwater, no, this isn't really about cooking, but for your trouble: The easiest and least tearful way I know to slice onions is to cut them in half lengthwise, so each half has a little piece of the roots holding it together.  If you think of the roots as the South Pole and the stem end as the North Pole, the first slice is from pole to pole.

Chop off the stem end and peel off the outer layers, under cold running water if that seems to help (I find this order a little easier than slicing off the stem end before halving, but your mileage may vary).  Put the halves down on the flat side and slice vertically with the slices parallel, also running north-south.  Julia Child recommends another pass, horizontally, still slicing north-south, and who am I to argue?  At this point, the root and the shape of the onion layers are still holding everything together.  Finally, slice vertically, but with the slices running east-west.  Each cut slices off a little pile of nicely diced pieces.

This isn't new -- I first heard about it on a Chef Tell segment many years ago, Mastering the Art of French Cooking came out in 1961, and I'm sure it's been around much longer -- but it works a charm.  Bon appétit, and remember that a dull kitchen knife is more dangerous than a sharp one.


So it's not new, but is it a hack?  And what's with all these "life hack" articles that have nothing to do with writing clever code?

For my money, the onion-dicing method is absolutely a nice hack.  A hack, really, is an unexpected way of using something to solve a problem.  The usual way to dice something is to slice it, then cut the slices crosswise into strips, then cut the strips crosswise into little dice.  If you try that with an onion, the root is in the way of the north-south slices described above, and the easy way to start is to slice it east-west, into rings.  You then have to dice up the rings, which are hard to stack since they're already separated, and like to slide around and separate into individual rings, and have a lot of exposed surface area to give off tear-producing onion fumes.  In short, you have a mess.

The chef's method takes advantage of the two things that otherwise cause problems:  It uses the root end to hold things in place and keep the exposed area to a minimum, and it uses the layering of the onion to save on cutting (if you omit the horizontal slices, as I usually do, you still get decently-diced pieces, good for most purposes, just a bit coarser).  This is the essence of a hack: using something in a non-obvious way to get the result you want.  It's particularly hackish to take advantage of something that seems to be an obstacle.

Not every hack is nice, of course.  The other popular meaning of hacking -- the computing analog of breaking and entering or vandalizing someone's property, and one that many geeks, including myself, find annoying -- stems from a particular type of hacking: finding unexpected vulnerabilities in a system and taking advantage of them to break the system's security.  As I've discussed at length elsewhere, this isn't necessarily bad.  White hat hackers do just this in order to find and patch vulnerabilities and make systems more secure.  The annoying part isn't so much that hack is associated with breaking and entering, but that it's associated with any kind of breaking and entering, regardless of whether there's any skill or actual hacking -- in the sense of making unexpected use of something -- involved.

I should note somewhere that hack often has negative connotations in software engineering for a completely different reason: If you take advantage of some undocumented feature of a system just to get something working, you have a fragile solution that is liable to break if the system you're hacking around changes in a future update.  In widely-used systems this leads to Hyrum's law, which basically says that people will write to what your system does, regardless of what you say it does, and with enough people using it, any externally visible change in behavior will break someone's code, even if it's not supposed to.
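
Here's a small, entirely made-up illustration of the kind of thing Hyrum's law bites.  The function and its caller are hypothetical, but the pattern is the classic one: the documented contract says one thing, the implementation happens to do a bit more, and somebody comes to depend on the "bit more".

    # Hypothetical example of Hyrum's law in action.  The documented
    # contract only says "return the matching records"; the first version
    # just happens to return them in insertion order.

    def find_records(db, min_score):
        """Return all (name, score) pairs with score >= min_score."""
        # v1 filters the dict directly, so results come back in insertion
        # order -- an accident of the implementation, not a promise.
        return [(name, score) for name, score in db.items() if score >= min_score]

    db = {"alice": 90, "bob": 75, "carol": 88}

    # A caller quietly starts depending on that accident: "the first
    # result is the one that was added first."  Nothing in the docs says so.
    oldest_qualifier = find_records(db, 80)[0]

    # If a later version returns results sorted by score instead -- still a
    # perfectly valid reading of the contract -- oldest_qualifier silently
    # changes meaning, and somebody's code breaks.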

Hacking lives in gray areas, where behavior isn't clearly specified.  "Dice this onion with this knife" doesn't say exactly how to dice the onion.  Someone taking advantage of a quirk in an API can usually say "nothing said I couldn't do this".  There's nothing wrong with unspecified behavior in and of itself.  It's actively helpful if it gives people latitude to implement something in a new and better way.  The trick is to be very specific about what can happen, but put as few restrictions as possible on how.

There's an art to this.  If you're writing a sorting library, you could say "It's an error to try to sort an empty collection of things".  Then you have to make sure to check that, and raise an error if the input is empty, and whoever's using your library has to be careful never to give it an empty collection.  But why should it be an error?  A collection with only one thing in it is always sorted, since there's nothing else for it to get out of order with.  By that reasoning, so is an empty collection.  If you define sorted as "everything in order", that raises the question "but what if there isn't anything?".

If you define sorted as "nothing out of order -- no places where a bigger thing comes before a smaller thing", then the question goes away.  If there isn't anything in the collection, nothing's out of order and it's already sorted.  In math, something is vacuously true if there's no way to make it false.  "Nothing out of order" is vacuously true for an empty collection.  Often, allowing things to be vacuously true makes life easier by sidestepping special cases.
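
In code, the "nothing out of order" definition is barely longer than the English.  A minimal sketch (the names here are illustrative, not from any particular library):

    def is_sorted(xs):
        # True if no adjacent pair is out of order.  An empty or
        # one-element collection has no adjacent pairs at all, so the
        # condition is vacuously true -- no special case needed.
        return all(a <= b for a, b in zip(xs, xs[1:]))

    assert is_sorted([])            # vacuously sorted
    assert is_sorted([42])          # also vacuously sorted
    assert is_sorted([1, 2, 2, 5])
    assert not is_sorted([3, 1, 2])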

As a general rule, the fewer special cases you need to specify what happens, the easier a system is to write and maintain, the more secure it is against unwanted forms of hacking like security exploits and Hyrum's law, and the friendlier it is to good kinds of hacking, like people finding clever new ways to improve the implementation or to use the system.


So what about all this "life hacking"?  Should people use computing jargon for things that have nothing to do with computing?  I have two answers.

First, the term hack isn't really about computing.  It's about problem solving.  The first definition in the Jargon File (aka the Hacker's Dictionary) is "Originally, a quick job that produces what is needed, but not well", with no mention of computing, and elsewhere it attributes early use of the term to ham radio hobbyists.  As it happens, the actual definitions of hack in the Jargon File don't really include "using something in a non-obvious way to get the result you want", but I'd argue that the definition I gave is consistent with the file's "The Meaning of 'Hack'" section.

Second, though, even if hack was originally only applied to coding hacks, so what?  Language evolves and adapts.  Extending hack to other clever tricks reveals something new about what people are trying to get at by using the word, and in my view it's a lot better than restricting it to security exploits, clever or not.  Sure, not every "kitchen hack" or "life hack" is really that hackish, and headline writers are notoriously pressed for time (or lazy, if you're feeling less generous, or more apt to make money with clickbait, if you're feeling cynical), but there are plenty of non-computing hacks floating around now that are just as hackish as anything I've ever done with code.