Showing posts with label movies. Show all posts

Saturday, July 30, 2022

Dear screenwriters: Bits can be copied

There's a new thriller movie out on one of the major streaming services.  I don't think it matters which movie or which service.  If you're reading this years from now, that statement will probably still be true, at least to the extent there are still streaming services.  If you're pretty sure you know which 2022 movie this is referring to, but haven't seen it yet and want to, be warned.  There are mild spoilers ahead.

As with many such films, the plot revolves around a MacGuffin, a term apparently coined by Angus MacPhail, which Alfred Hitchcock famously glossed as "the thing that the spies are after, but the audience doesn't care."  In other words, it doesn't really matter what the MacGuffin actually is, only that the characters do care who gets it and so spend the whole film trying to make sure it ends up in the right place and doesn't fall into the wrong hands.

The plot device of a MacGuffin is much older than the term itself, of course.  The Holy Grail of Arthurian legend is one, and the oldest recorded story known so far, The Epic of Gilgamesh, sends its protagonist on a journey to the ends of the earth in search of one.

Clearly there's something in the human brain that likes stories about finding a magic item and keeping it away from the baddies, and in that sense the MacGuffin in the big streaming service movie is a perfectly good MacGuffin.  The protagonists and antagonists vie over it, it changes hands a few times, lots of things explode and eventually the MacGuffin is destroyed, ending its magic powers.

Except ...

The MacGuffin in this case is basically a gussied-up thumb drive containing information certain people do not want to become known.  Our protagonist receives the item early in the film (with suitable explosions all around) and promptly sends it off to a trusted colleague for safekeeping and decipherment.  Later we learn that the trusted colleague has, in fact, received the drive and cracked its encryption, revealing the damning information.

In real life, this is when you would make a backup copy.  Or a few.  Maybe hidden in the insignificant bits of JPEGs of cute kittens on fake cloud accounts with several different services.  Maybe on some confederate's anonymous server somewhere on the dark web.  Or at least on a couple more thumb drives.  For bonus points, swap out the contents of the original thumb drive for a clip of the Dancing Baby or some similar slice of cheese.
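For the curious, the kitten-picture trick is classic least-significant-bit steganography.  Here's a minimal sketch in Python, operating on raw pixel bytes; the function names are mine, and note that real JPEG compression is lossy and would mangle a naive LSB scheme, so in practice you'd use a lossless format like PNG or hide the bits in the JPEG's DCT coefficients.

```python
def embed(carrier: bytearray, payload: bytes) -> bytearray:
    """Hide payload in the least-significant bits of carrier bytes.
    Needs 8 carrier bytes per payload byte."""
    if len(carrier) < 8 * len(payload):
        raise ValueError("carrier too small for payload")
    out = bytearray(carrier)
    for i, byte in enumerate(payload):
        for bit in range(8):
            j = i * 8 + bit
            # Clear the low bit, then set it to the payload bit.
            out[j] = (out[j] & 0xFE) | ((byte >> bit) & 1)
    return out

def extract(carrier: bytes, n: int) -> bytes:
    """Recover n payload bytes hidden by embed()."""
    out = bytearray()
    for i in range(n):
        byte = 0
        for bit in range(8):
            byte |= (carrier[i * 8 + bit] & 1) << bit
        out.append(byte)
    return bytes(out)
```

Each payload byte costs eight carrier bytes, so even a modest image can hide a respectable amount of text, and to a casual viewer the cat pictures look unchanged.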

(As I understand it, there are some encrypted devices that are tamper-resistant and designed not to be readable without some sort of key, so you can't easily copy the encrypted bits and try to crack the encryption offline, but here we're told that the encryption has already been cracked, so they have the plaintext and can copy it at will.)

The problem with that, of course, is that the drive would then cease to be a MacGuffin.  Why send teams of mercenaries and a few truckloads of explosives after something that might, at best, be one copy of the damning information?  The only real reason is that it makes for an entertaining way to spend an hour or two and screenwriters know all about writing MacGuffin-driven thriller plots.

Which is fine, except ...

If you think about the practicalities, there's still plenty of tension to be had even if the bits are copied.  Our protagonist has reason to want the secret information to remain secret except in case of a dire emergency, but they also want to be able to preserve it so that it can be released even if something happens to them.  How to do this?

If you've uploaded the bits to one of the major services, then who gets access to them?  Do you keep the information in a private file, memorize the account password and hope for the best?  What if you're captured and coerced into giving up the password?  On the other hand, if you die without revealing the information, it will just sit there until the account is closed, unless someone can figure out enough to subpoena the major service into handing over access to a bunch of cat pictures hiding the real information.  Which you encrypted, of course, so who has the key?

Maybe you share the encrypted bits with a journalist (or two, or three ...) with an "in case of my death" cover letter saying where to get the encryption key.  But what if they decide to go public with it anyway?  The more journalists, the better the chance one of them will publish if something happens to you, but also the better the chance that one of them will publish anyway.
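There's a standard cryptographic answer to "the more journalists, the better the chance one publishes prematurely": split the key so that no one of them can act alone.  A minimal sketch of an n-of-n XOR split in Python (the names are mine; for an "any k of n" threshold you'd want something like Shamir's secret sharing instead):

```python
import secrets

def split_secret(secret: bytes, n: int) -> list[bytes]:
    """Split secret into n shares; all n are required to recover it.
    Any n-1 shares together reveal nothing: each is random noise."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = bytearray(secret)
    for share in shares:
        for i, b in enumerate(share):
            last[i] ^= b  # final share is secret XOR all random shares
    shares.append(bytes(last))
    return shares

def recover_secret(shares: list[bytes]) -> bytes:
    """XOR all shares together to reconstruct the secret."""
    out = bytearray(len(shares[0]))
    for share in shares:
        for i, b in enumerate(share):
            out[i] ^= b
    return bytes(out)
```

Hand one share to each journalist and the story can only run if all of them cooperate -- or if something happens to you and the "in case of my death" letters release the shares.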

Maybe you put the encrypted bits someplace public but write the encryption key on a piece of paper and lock it away in a safe deposit box in a Swiss bank.  Now you've traded one MacGuffin for another.  But maybe someone at a different spy agency has a backdoor into your encryption.  The baddies at your own agency are going to keep the contents to themselves, but maybe one of them has a change of heart, or gets double-crossed and decides to go public as revenge, and they need your copy since they no longer have access to the original bits and didn't make their own copy.

And so forth.  The point is that information doesn't really act like a physical object, even if you have a copy in physical form, but even so there are lots of ways to go, each with its own dramatic possibilities depending on the abilities and motivations of the various characters.  Most of these possibilities are pretty well-used themselves.  Plots driven by who has access to what information have been around forever, though some have paid more attention to the current technology than others -- "Did you destroy the negatives?" "Yes, but I didn't realize they'd left another copy of the photographs in a locker at the bus station ..."

Opting for a bit more realism here gives up the possibility of a "destroy the magic item, destroy the magic" plot, but it opens up a host of other ones that could have been just as interesting.  On the other hand, the movie in question doesn't seem to blink at the possibility of a full-on gun battle and massive explosions in the middle of a European capital in broad daylight.  Maybe realism was never the point to begin with, since that seems pretty unlikely.

Oh, wait ...


Thursday, March 28, 2013

Film studios ... not dead yet.


A few years ago I ran a series of posts (starting with this one) questioning a 60 Minutes piece on online video piracy.  My take was that 60 Minutes was parroting the MPAA's stand on piracy at the time without critically examining it as one might expect from an investigative news program.

I stand by that.

In one segment, director Steven Soderbergh doubted whether films like The Matrix could be made any more, since piracy was putting the studios out of business and keeping them from financing original works from outsiders.  At the time of that interview, Avatar was on its way to grossing an all-time record $2.8 billion on a budget of $237 million.  Granted, James Cameron is not exactly a Hollywood outsider (more on that below), but if the studios aren't financing new faces, it doesn't appear to be for lack of money.  Six of the top ten highest-grossing films have been made since that interview, and ten of the top twenty.  Comcast (owner of Universal) has nearly tripled its stock price.  Disney, Time-Warner and Viacom (owner of Paramount) have approximately doubled.

Overall box-office grosses have been basically flat since that interview, which would indeed be bad news for the studios if that were the only way they made money.  But it isn't.  Video on demand and DVD/Blu-ray releases, with much lower overhead than the box office, have been a standard part of movie releases since before that interview was done.

Home video numbers seem harder to come by than box office grosses, but there's no doubt that, however much illegal copying may be going on, there's plenty of legal rental going on as well.  It doesn't look like the ability to copy bits online is hurting the film industry any more than the ability to copy them on videotape did.

In fact, the folks at South by Southwest seem to think that video on demand is actually helping get original films from outsiders made and seen.  The title of the panel, How I Learned To Stop Worrying and Love VOD, is itself instructive.

Nor do I think anyone seriously sees this as a triumph of the brave heroes at the MPAA against the evil pirates.  Rather, the industry has adjusted to the new technology and figured out how to make money off of it.  Which is their job.

Sunday, February 6, 2011

You joined the social network ... now see the movie

To be clear right off the bat: This is about the movie The Social Network.  It's not about Facebook, the company, or Mark Zuckerberg, the CEO, or any other actual person, place or thing.  True, there's a person called Mark Zuckerberg in the movie, there is a university called Harvard and a substance called beer, and probably a bit more than the usual amount of care was taken to align those depictions with their real-world counterparts, but it's a movie.  Likewise, my comments here are about the movie.

I liked it.  It's not a bad movie.  But then, I liked Hackers when I finally saw it.

The techspeak is reasonably believable.  In particular, the rapid-fire voiceover as Zuckerberg puts together Facemash is taken directly from the real-life Zuckerberg's online diary (which, however, gets tarted up a bit for the camera).  Using wget to fetch pictures off a web site with an index page full of them is not exactly cutting-edge, but the Zuckerberg character acknowledges as much.  Hacking a perl script with emacs -- or vi, if you prefer -- is a bit more like it.  None of it's neurosurgery, but this is a quick hack.  Judging by the timestamps, he is hacking reasonably quickly, so the guy knows how to get under the hood and get his hands dirty.
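For anyone who hasn't seen it, the wget trick boils down to scraping an index page for image links and then fetching each one.  A rough Python equivalent of the scraping half (the names here are mine; the movie's version, per the diary voiceover, was shell plus perl):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImageLinkParser(HTMLParser):
    """Collect href/src attributes that look like image files."""
    EXTS = (".jpg", ".jpeg", ".gif", ".png")

    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.found: list[str] = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and value.lower().endswith(self.EXTS):
                # Resolve relative links against the index page's URL.
                self.found.append(urljoin(self.base_url, value))

def image_links(html: str, base_url: str) -> list[str]:
    p = ImageLinkParser(base_url)
    p.feed(html)
    return p.found
```

Feed it the index page's HTML and download each resulting URL however you like; wget itself can do the whole job in one go with its recursive-fetch and accept-list options, which is presumably the point of reaching for it in the first place.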

What's more interesting is not the coding but the engineering.  In the process of pulling together mugshots of as many Harvard students as he can, Zuckerberg runs across a house whose particular setup makes the task difficult.  What does he do?  Does he down four cans of Jolt Cola and miraculously come up with a superhuman hack to break in?  No.  He punts.

Absolutely the right call.

To make Facemash work, he just needs a bunch of pictures.  He doesn't need every single one, which is fortunate because many aren't online at all.  So why waste time trying to pick the high-hanging fruit when the low-hanging fruit will do?  That little bit of realism actually making it into a big-budget film, even if it goes by so fast you have to think back to realize it happened, and the lack of the usually obligatory thirty-seconds-of-typing-and-the-magic-ACCESS-GRANTED-popup-fills-the-screen scene (yeah, I'm talking about you, Iron Man 2), make a refreshing change, to say the least.

Now, when the site actually goes up and the kids start having fun with it, the resulting traffic apparently brings the Harvard intranet to its knees.  Seriously?  According to the script there were on the order of twenty thousand hits in two hours, if I remember right.  That's about three hits a second, probably more at peak, but not a lot more, and these are fairly small pages -- a couple of mugshot images and some HTML.  It's all going to Zuckerberg's server, and that's not falling over.  The network can't keep up with one Linux box in someone's dorm room?  Sounds like a bit of dramatic license to me.

Similarly, what does the security chief care if some student was snarfing images off the other dorms' servers?  That's not a security threat, it's an annoyance for whoever's administrating those servers and as I understand it a breach of the undergraduate code of conduct.  Judging by the complete mishmash of setups, the sysadmins are probably students themselves, not the university's IT department.  The security guy's job is to keep outside people from causing mischief, and probably to keep everyone from messing with the more sensitive administrative data, particularly grades.  But I digress somewhat.

Actually, one more bit of geekery: Zuckerberg sits, preoccupied, in an OS class while the professor talks about memory management, page tables and such.  Zuckerberg walks out.  The professor taunts him for giving up, at which point Zuckerberg rattles off the answer the professor was looking for.  Except the correct answer was "Sixteen bit virtual address space?  Do what now?  All you've got is 64K and you're going to swap some of it to disk?  It's 2003.  My phone can eat 64K for a light snack."

But starting a site with hundreds of millions of members isn't about coding, nor is it primarily about software engineering in the larger sense.  It's about pulling together the right ideas and getting the word out.  As Zuckerberg points out later, it's also important to have reliable servers, and (as the movie character doesn't mention but the real-life CEO probably would) things like an extensible platform for third parties, but none of that matters if you don't have something of interest running on them in the first place.

Which is why, at least in the movie version, it seems to me that the Winklevoss twins and their partner were more than fairly compensated for their trouble.  Did Zuckerberg deal badly with them by neglecting to mention that he wasn't really working on their site but was in fact working on his own take on a similar idea?  Of course.  Does that mean they invented facebook and he stole it from them?  Not so much.

It's quite clear that the (movie) Winklevosses would have done the site differently.  For starters, "exclusivity" is not a great way to get to a hundred million members.  Nor did they seem to like the look of the site, though that might have been sour grapes.  For all the talk about the first mover advantage -- and one of these days I'd like to have a look at whether such a thing really exists -- MySpace was already around and known to all involved.  For that matter the Harvard house pages that Zuckerberg cribbed called themselves "facebooks".

If Zuckerberg was stealing anything, it wasn't the idea of a social networking site, but the Winklevosses' vision of it.  But that wasn't the vision that Zuckerberg implemented.  Facebook (in the movie or in real life) isn't just a social network.  It's a collection of features, like relationship status, the wall, the ability to tag photos, a privacy policy, and so forth.  The bulk of these features were implemented well after the split.

The Winklevosses had a concept of a social networking site and they wanted to hire the job done.  They hired the wrong guy and it cost them the few weeks it took them to realize their mistake.  $60 million seems ample compensation for that, even taking into account that their hired hand at least passively misled them.  It's not like Zuckerberg was the only techie in the Cambridge area in 2003 that could have put up a server, or the twins would have had to scrounge for funding to hire someone new.

Again, just going by the movie account.  The real-life version has been hashed out in court.

That's probably deeper into that tar pit than I should have gone.  What's more interesting here is a larger point: which counts for more, the original idea or the implementation?  There are certainly egregious cases of unscrupulous operators outright stealing an idea and passing it off as their own, but the scenario put forth in the movie isn't such a case.  All other things equal, it's the implementation that counts, just as you can't copyright an idea, only its expression, and can't patent an abstract idea, only a concrete invention.

At the end of the day, it's not the people with original ideas that tend to go on to business success.  I could rattle off a long list of computing pioneers who didn't become gazillionaires in startups, either because they didn't found startups or the startups didn't succeed.  It's the people who make those ideas into something that people actually use.  Of the mix that goes into that -- design, coding, financing, marketing, knowing the right people (social networking, that is), relentlessness, sheer dumb luck and whatever I left out -- the technical ingredient is arguably one of the most replaceable.

Which brings me back to the mugshot hacking.  The whole hack was nothing but pulling together existing pieces -- the pictures, Apache, wget, perl and, yes, emacs -- to synthesize something new that people wanted.  Nicely foreshadowed.

Saturday, March 13, 2010

More cheesy movie goodness

Since I've admitted to watching Ghostbusters II recently, I suppose it will do no harm to admit to having watched Hackers as well. Hey, I missed it the first time around. Skipped it, actually, on the grounds that I would have a hard time accepting its depiction of computers and networks or the inevitable Markov chain of random technical terms.

I was right about the cargo cult computing but, being perhaps older and wiser, much more able to bear it. It's a fun movie to look at, with plenty of whizzy graphics. Come to find out much of it was done with motion-controlled models, as the CGI of the time would have looked too artificial. In movie logic, dancing equations and morphing false-colored talking virus heads on 90s-era hardware are the most natural thing in the world, because they provide atmosphere.

The movie does a surprisingly good job of capturing the gray hat hacker ethos, however much it fails to convey the paint-drying dullness of most forms of hacking to (non-hackish) spectators. I've always found it unfortunate that "hacker" has come generally to mean not hacker programmer but black hat hacker or even script kiddie (fortunately, we have "geek" these days covering roughly the same territory that "hacker" used to). On that front Hackers may have done more good than harm. They also manage to mention the Dragon Book, so props for that.

I didn't pay much attention to the plot because, well, it didn't seem like that kind of movie, but I did react to two of the film's more famous howlers. One was the RISC/CISC confusion in the Pentium scene (if you want to know more detail than that, you should probably just watch the film). The other was when the main characters gush over a 28.8bps modem.

Ah yes, this is a blog about the web, isn't it? I'm getting there ...

As IMDB duly points out, they meant 28.8kbps, not 28.8bps. But that's just a typo and I heard it as 28.8kbps at the time. What caught me was that, more than the cassette tapes, minifloppy disks, haircuts and rollerblades, it's the bandwidth that sends the whole thing horribly off the rails. Never mind the rest of the implausibilities. They're supposed to be doing all this over 14.4 or slower? Oy. Guessing passwords, sure, if they're weak and easily guessed, but if they ever have to download more than a few hundred K, they're doomed.
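To put rough numbers on "doomed": here's a back-of-envelope calculation of transfer time over a modem, using the old rule of thumb of ten line bits per byte to cover framing overhead (this ignores compression and protocol details, so treat it as an estimate, not a spec):

```python
def transfer_seconds(size_bytes: int, line_bps: int) -> float:
    """Idealized transfer time, assuming ~10 line bits per byte
    (8 data bits plus start/stop framing overhead)."""
    return size_bytes * 10 / line_bps

# A 300 KB download over a 14.4k modem takes about three and a
# half minutes; at 28.8k it's still close to two minutes.
minutes_at_14_4 = transfer_seconds(300_000, 14_400) / 60
```

And the streamed villain-face video of the finale would be another matter entirely.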

It's the bandwidth that makes the modern web what it is. Thanks to YouTube and company, the video of the villain's face on the hero's laptop screen seems perfectly plausible until you remember what size pipe it's all supposed to be going through. In fact, much of the cyberstuff, except for the phreaking (the technology has moved on) and the social engineering (ageless and evergreen), is probably more plausible now than it was then, thanks to the web.

Sunday, November 22, 2009

Today is yesterday's tomorrow (sort of)

The other night I was watching Ghostbusters II (oh, don't ask why) and right in the middle of it Harold Ramis' character uses The Computer to look up information on a historical figure. I'll use GBII for reference here since it's handy, but I could have picked any number of others.

The Computer has been a staple of science fiction for decades. It's interesting that its role in such movies is very often not to compute but to look something up, as was the case here. Our hero gives the computer the name, and back comes a neatly formatted 80-column by 24-row answer, with underlines and everything, saying who the person is.

Of all the technological devices in such movies, The Computer always seemed among the less plausible. I'm not counting the ghost-zapping equipment as technology; it's magic and falls firmly under suspension of disbelief. The Computer counts as technology because it's assumed just to be there. At some point in the future, super-powerful all-knowing computers will be generally available. How do we know? Just look at the movies ...

There were a couple of reasons The Computer always seemed particularly implausible. First, knowing a bit about real computers makes it harder for me to gloss over the technical hurdles. Force fields? Jet packs? Sure, why not? That's physics. Physics is what you major in if you're too smart for anything else. They'll figure it out. But a computer you can just type some vague query into and get a sensible answer? Come on. Like that'll happen.

Second, it always seemed like a computer smart enough to, essentially, act like the Encyclopedia Galactica would surely have all kinds of other powers that the careful scriptwriter would have to take into account. If The Computer can tell you who the bad guy in the painting is, why can't it tell you how to take him out?

You can probably tell where I'm going with this. Today, about twenty years after GBII, you can sit down at your home computer, type in the name of a historical figure and very likely come up with a concise, well-formatted description of who the person was, thanks to the now ubiquitous browser/search-engine/Wikipedia setup.

As powerful as it is, though, the system is an idiot savant. It won't tell you how to neutralize a malevolent spirit (or rather, it won't tell you a single, clear way to do so) and it won't do a lot of other things. It just allows you to quickly locate useful information that's already been discovered and made publicly available. It's powerful, but not magic.

What particularly strikes me about the description above is the presence of Wikipedia. Large, fast networks of computers were already building out in the years after GBII was released, and Mosaic came out just a few years after the film. The missing piece, and one that I don't recall very many people predicting, was the massively-collaborative human-powered Wikipedia, not a technical advance in itself, but something very much enabled by several technical advances.

The Internet, HTTP, browsers, scripting languages, broadband, email, databases, server farms, cell phones, etc. -- these are all technologies. Wikipedia isn't, and yet it fits easily and comfortably into the list of advances from the last few decades. It fills a niche that's been anticipated for decades, but -- fascinatingly -- not by the anticipated means of using sheer computing power to somehow divine the history of the world.