The other day someone asked me whether it was supposed to be cold out that week. I didn't know offhand. "That's OK," they said, "I'll check the computer. The computer knows."
It occurred to me that if someone were trying to convince a skeptical public back in the 80s that this whole "personal computer" thing was really going places, and that person were allowed just one ten-second glimpse into the faraway world of 2010 to show the audience, they would probably give their eyeteeth for that particular glimpse. Ditto for a budding AI researcher.
Except ... the viewer from thirty years ago would naturally take "the computer knows" at face value. Computers in the 21st century would be so fast and so smart that the personal computer in the kitchen could predict the weather.
Today, by contrast, we don't generally assume that computers "know" much of anything, but we do assume that they can easily direct us to someone who does, in this case the people at a weather service. Granted, said forecasters are making use of computers that, in terms of raw computing power, could swallow an 80s-era supercomputer whole without a hiccup. Nonetheless, we don't assume that our own computers could do any such thing, or even that a supercomputer is so omniscient as to make weather forecasters redundant.
That's the difference between having a PC and being on the web. The primary function of most computing devices -- personal computers, phones, netbooks, routers, etc. -- is communication. That's not to say that computers aren't essential in producing and cataloging data, but data is only useful if you can get to it.