Before I get into details, let me give my gut reaction: The concerns are legitimate. Several aspects of TC are potentially obnoxious, open to abuse, or both, and those who promote it don't necessarily have their interests aligned with mine or yours. On the other hand, the problems in question are really legal and ethical problems, and they're neither particularly new nor unique to TC.
As such, there are already significant counterbalances that will tend to prevent the nightmare scenarios -- particularly if people are generally informed of the dangers. For this reason, I'm very happy to see people like Vixie and Stallman voicing their concerns. If it turns out they're wrong, it will be in part because they -- and many others in the FSF, the EFF and elsewhere -- raised the alarm. And if they're right, we need every bit of firepower we can get working on the solution.
With that in mind, here are some of the objections I've seen:
Constant contact with the mothership: In order to make sure that the hardware is running the right firmware/software and not some hacked version, to push out updates and enhancements and to update access rules, a given hardware device will communicate frequently with the vendors of the various software it runs. This communication can potentially include all sorts of information stored on the device, whether it's any of the vendors' business or not.
But this is pretty much what's happening now. The major OSs have been converging for some time now, and one thing they've converged on is frequent automatic updates. Updates that call back to the mothership, send you digitally signed modifications to your software -- including your kernel -- and ask you to give an administrator password or otherwise grant permission. It's not just OSs either. Browsers update themselves and ask you to install plugins. So do at least some of those neat desktop widgets.
Does this communication include sensitive information about what's on my disk and what I do with my computer? To the best of my knowledge, no. How do I know this? Basically, the vendors say so and no one seems to say they're lying. If someone does step over the line, we hear about it. Just as we're hearing about the potential for abuse of TC.
Unique ids on everything: In order to make sure that a given piece of content can only be accessed on a particular device, you need to know what exact device is trying to read it. In the TC world, every device includes a unique "endorsement key" -- a private cryptographic key that allows anyone who knows the corresponding public key (which will generally be anyone) to send it messages only it can read. It also allows the device to prove that it has that key, and since no one else can do so, this uniquely identifies the device.
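For the curious, here's a rough sketch in Python (using the `cryptography` package) of the challenge-response idea behind this: the device signs a random challenge with its private key, and anyone holding the public key can check the result. The key size and names are illustrative only -- this is the cryptographic gist, not the actual TPM protocol.

```python
# A sketch of proving possession of an endorsement key, assuming the
# "cryptography" package. Key size and names are illustrative only.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The device's endorsement key, burned in at manufacture time.
device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = device_key.public_key()  # published; anyone may know it

# A verifier sends a random challenge...
challenge = os.urandom(32)

# ...and the device signs it, which only the holder of the private key can do.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = device_key.sign(challenge, pss, hashes.SHA256())

# Anyone with the public key can check the proof. A valid signature could
# only have come from this device, so it identifies the device uniquely.
public_key.verify(signature, challenge, pss, hashes.SHA256())
print("device identified")  # verify() raises InvalidSignature on failure
```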
Unique ids have been around for some time now. In the physical world they're called serial numbers. Network interfaces have MAC addresses. Every Bluetooth device has a unique address, and so on.
Neither are digital keys and signatures new. The problem is more that, with an unforgeable ID, a given device -- and its usage history, if the vendor chooses to record it -- can be strongly tied to a given system and from that generally to a given person. If the system is in frequent contact with the mothership, then that information can easily be uploaded and used for any purpose the vendor sees fit.
Again, the problem is not the unique ID per se, but what may and may not be done with the information associated with it. This is a legal and ethical problem, and analogous cases already exist, for example, cell phone records (and land line records before them).
Disappearing ink: If I send you a file encrypted to your public key, once you decrypt it with your private key it's yours to do with as you please. If I send you a file encrypted with some secret key and give you the key, it's the same deal. But suppose I give you a file encrypted to some key that I don't know, and you don't know, but some black box installed on your system does know. That box is set up never to reveal the actual bits I sent you directly, but it will allow you to view the contents on your screen (or play them on your headphones, or whatever).
Then when and whether you can read that file is up to the black box. If the black box is under my control, then so is the file, effectively. This leads to several worrisome possibilities (there's a code sketch of the black-box model after the list). In order of (IMHO) decreasing likelihood:
- Lock-in: If that black box is the only way you can view the content I sent you, then it's going to be hard to switch to a different brand of black box. You would most likely need me to either re-send the file, or give your black box permission to hand it over to the new black box (it seems possible that you could require me to include some sort of transfer permission with the file when I send it, but that has its own problems).
- Plausible deniability run amok. I'm your boss. I send you an order to do something bad. You do it. I tell your black box to get rid of the order. Effectively, I never told you anything. The authorities come a-knocking. You hang and twist in the wind. You sent a copy of the order to the press? Too bad they can't read it. I retire to a sunny tropical isle.
- Outright censorship: If all systems are "trusted", then anything published is under the control of whoever controls the black boxes. You write something I don't like. I tell every black box I control to refuse to show it. If I'm an OS vendor or the government, that could be quite a few boxes. In the worst case, it could be every legal box. You never said anything, and Oceania has always been at war with Eurasia.
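To make the black-box model concrete, here's a minimal sketch in Python, using the `cryptography` package's Fernet cipher. The `BlackBox` class is a hypothetical stand-in for trusted hardware, not any real TC API; the point is just that the box holds the key, shows you the content, and can take it away again.

```python
# A minimal sketch of the black-box model, assuming the "cryptography"
# package. BlackBox is a hypothetical stand-in for trusted hardware.
from cryptography.fernet import Fernet

class BlackBox:
    """Holds the content key; shows you content but never hands back the bits."""

    def __init__(self):
        self._fernet = Fernet(Fernet.generate_key())  # key known only to the box
        self._revoked = set()

    def seal(self, plaintext: bytes) -> bytes:
        # Encrypt under the box's internal key; neither sender nor
        # recipient ever learns the key itself.
        return self._fernet.encrypt(plaintext)

    def revoke(self, ciphertext: bytes) -> None:
        # The sender (or vendor) can withdraw access after the fact.
        self._revoked.add(ciphertext)

    def display(self, ciphertext: bytes) -> None:
        if ciphertext in self._revoked:
            raise PermissionError("access withdrawn")
        # "View" the content without ever exporting the raw bytes.
        print(self._fernet.decrypt(ciphertext).decode())

box = BlackBox()
order = box.seal(b"Handle that problem quietly. -- The Boss")
box.display(order)   # readable today
box.revoke(order)
try:
    box.display(order)  # gone: the ink has disappeared
except PermissionError as e:
    print(e)
```

The lock-in point falls out of the same structure: the sealed file is only useful through this particular box, so moving to another brand means getting everything re-sealed.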
In cases where protecting data is important, there will be market pressure against closed solutions. If I'm, say, a bank or insurance company, I don't want to depend on a particular vendor's device still working twenty years from now -- or that vendor not having folded and taken its keys with it -- when I want to retrieve data from a storage medium that doesn't even exist today.
The plausible deniability problem is ancient. There have always been and almost certainly always will be ways of telling someone to do something without actually coming out and saying it. There will always be a legitimate need for secure, untraceable communication, whether it's sending a protected email or calling someone into an empty room out of sight and earshot. This ability will always be open to abuse.
In order for the worst nightmare scenarios to happen, TC has to be pervasive and mandatory. Right now we're several steps from that. In the US, at least one bill trying to require TC for internet access has died in committee, due in part to the efforts of the FSF, EFF and others, but also to the ability of at least some senators to understand the free speech implications. Again, the problems are more legal and ethical than technical.
Except ...
No unauthorized modifications: If you want to make sure that a system can be trusted (for whatever we're trusting it to do), you also have to make sure it stays that way, particularly when you have a mechanism for automatically delivering updates. Asking the user for authorization is not enough. Very, very few people have the expertise and time to examine every single update that comes across.
A trusted system will only accept updates with the proper credentials, using strong digital signatures. If I try to install my own word smasher or OS, the system will refuse.
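As a sketch of what "proper credentials" means in practice -- illustrative Python using the `cryptography` package, not any vendor's actual update mechanism -- the system ships with the vendor's public key baked in and refuses anything that key didn't sign:

```python
# A sketch of signature-gated updates, assuming Ed25519 from the
# "cryptography" package. The vendor key and update blobs are illustrative.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

vendor_key = ed25519.Ed25519PrivateKey.generate()
BAKED_IN_VENDOR_PUBKEY = vendor_key.public_key()  # shipped inside the device

def install_update(update: bytes, signature: bytes) -> None:
    """Apply an update only if it carries the vendor's signature."""
    try:
        BAKED_IN_VENDOR_PUBKEY.verify(signature, update)
    except InvalidSignature:
        raise PermissionError("unsigned or tampered update refused")
    print("update accepted")  # a real system would now apply it

official = b"word-smasher v2.0"
install_update(official, vendor_key.sign(official))  # vendor-signed: accepted

try:
    install_update(b"my own word smasher", b"\x00" * 64)  # forged: refused
except PermissionError as e:
    print(e)
```

The refusal in that second call is the whole point -- and the whole problem.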
This is, of course, a topic near and dear to Stallman and the free software community in general. The whole reason that free software exists outside a small group of dedicated individuals is that it fulfills an important need. It's a counterbalance to commercial software's tendency to lock people in.
Free software says "Here's exactly how the system you're running works. If you don't like it, you're free to rewrite it or (more likely) find a better version or hire someone who can write you one." TC says, "Hey, buddy, you can't change that unless the people who made the system know they can trust you," and that, of course, directly conflicts with the whole "if you don't like it, fix it" principle.
There's a case to be made for TC or something like it in special-purpose content-delivery systems like music players and set-top boxes. The question is whether this is the first step toward TC everywhere whether we want it or not.
One of the main sources of concern is whether we will reach some sort of tipping point, where so many systems are TC-enabled that you effectively have to protect your data in order to exchange it at all.
This is not a problem on the sending end. If I'm sending from an open system to a TC system, I can always make my own copy of whatever I send before I armor-plate it. Or can I? The trusted system I'm sending to might be so heinous as to accept only messages produced by a particular trusted word smasher, and not just random files that happen to be properly encrypted. That would be pretty extreme, but who knows?
It's definitely a problem on the receiving end. Just as people sometimes insist on sending attachments in proprietary formats, they could choose to send messages in protected form, so that I need a particular trusted black box in order to view them. If practically everyone they correspond with has such a box, then it's most convenient for them to send everything that way, and conversely there will be great pressure on anyone without such a box to get one. In other words, I have the choice of receiving a message written in (potentially) disappearing ink, or no message at all.
The $64,000 question is, will people stand for this? I can see such a system being put in place inside a large company in order to, say, enforce document retention policies. It's harder to see it evolving among companies or groups of individuals. Postel's principle that one should be liberal in what one accepts and conservative in what one produces seems particularly applicable here.
In short, the situation definitely bears watching. My guess is that better alternatives will arise to solve the problems TC is trying to address, such as copyright infringement and malware. The climate doesn't seem particularly threatening at the moment, but if this should change, we should all be ready.