Peer-to-Patent is designed to crowdsource the task of finding prior art for pending patents. As I understand it (and once again, I'm not a lawyer), an invention must be original to be patented. If it was publicly known before you invented it, then it's not original, even if you didn't happen to know about it. You're expected to do your homework. On the other hand, if someone already invented it but kept it a secret, you can still patent your independent discovery.
This makes searching for prior art a crucial and time-consuming part of the patent application process, and the US patent office is heavily backlogged. Enter Peer-to-Patent, which publicizes patent applications and asks the world at large to turn up relevant examples of prior art.
Again leaving aside the policy issues here, a couple of things jump out. One is that crowdsourcing does not necessarily involve the "wisdom of crowds". Wisdom has to do with estimation and judgment. In a typical wisdom of crowds example, the aggregate result of the crowd's responses is more accurate than any particular individual's result. A hundred people can pick Oscar(TM) winners better than any particular person in the group can. As far as I can tell, this is more a matter of the central limit theorem (not to be confused with the law of large numbers) than anything else.
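To make that concrete, here's a minimal simulation of the averaging effect (Python, with entirely made-up numbers: the true value, noise level, and crowd size are arbitrary choices for illustration). Each simulated guesser is unbiased but noisy, and the crowd's mean guess lands much closer to the truth than a typical individual's does.

```python
import random
import statistics

# A minimal sketch of the averaging effect, with made-up numbers: each
# "crowd member" makes a noisy but unbiased guess at some true quantity,
# and the crowd's mean guess lands much closer to the truth than a
# typical individual's guess does.
random.seed(1)
TRUE_VALUE = 100.0
NOISE = 25.0       # assumed spread of individual guesses (illustrative)
CROWD_SIZE = 100

guesses = [random.gauss(TRUE_VALUE, NOISE) for _ in range(CROWD_SIZE)]

typical_individual_error = statistics.mean(abs(g - TRUE_VALUE) for g in guesses)
crowd_error = abs(statistics.mean(guesses) - TRUE_VALUE)

print(f"typical individual error: {typical_individual_error:.1f}")
print(f"crowd-average error:      {crowd_error:.1f}")
```

With independent, unbiased guesses, the error of the average shrinks roughly like 1/√n as the crowd grows. No individual needs to be especially good for the aggregate to be.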
A case like prior art is more a matter of resources. If the aim is to find things known to "persons skilled in the art", then a crowd of people skilled in the art ought to turn up more than a patent examiner can, simply because there are more of them. In the typical "wisdom of crowds" scenario, the crowd members need not be particularly skilled in the task at hand. In Peer-to-Patent, they had better be. Having hordes of people send in piles of junk to sort through would be worse than useless.
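To put a rough, invented number on the resource point: suppose each skilled searcher independently has a small chance p of knowing about or finding a given piece of prior art. The chance that at least one of n searchers turns it up is 1 - (1 - p)^n, which climbs quickly with n. A quick sketch (the value of p is pure assumption):

```python
# Rough sketch of the resource argument. If each skilled searcher
# independently has probability p of turning up a given reference, the
# chance that at least one of n searchers finds it is 1 - (1 - p)**n.
# p = 0.02 is an invented number, purely for illustration.
p = 0.02

for n in (1, 10, 100, 1000):
    print(f"{n:>4} searchers: {1 - (1 - p) ** n:.1%} chance someone finds it")
```

The independence assumption is generous, but it captures why sheer numbers help with search in a way they don't help with judgment, and why the searchers' skill (keeping p above zero and the junk out of the pile) matters.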
A crowdsourced task needn't be difficult. In fact, it should be relatively easy for the participants. As a corollary, you may have to select your participants, whereas the whole point of the "wisdom of crowds" is that you make no particular effort to find skillful participants. So:
- Harnessing the "wisdom of crowds" means aggregating the results of a large number of random people to get a better answer than any particular person might give. Everyone gets the same task.
- Crowdsourcing means divvying up a large task amongst a large number of qualified participants. They get different instances of a particular kind of task, though they may pick them from a common set.
- The Oxford English Dictionary relied on volunteers to find citations of words in print.
- The Galaxy Zoo project asks volunteers to classify galaxies by shape (mainly clockwise spiral, anticlockwise spiral, elliptical and "other") -- not a hard task for humans, but one that the participants are assumed to be skillful at. In fact, you have to take a couple of little tests before getting real live galaxies to classify.
- Projects like GIMPS and SETI@home split large tasks across masses of mostly-idle processors.
- Wikipedia asks everyone to do a bit of editing. The aim is not to average everyone's opinions on a topic but to turn up and aggregate facts from independent sources.
- And now Peer-to-Patent asks everyone to turn up prior art on their choice of patent applications.