The United States Patent and Trademark Office (USPTO) is in a tight spot. The entire office is buckling under the weight of more than 600,000 backlogged applications. Within the office's software unit, the time from application to resolution is typically four years – with the first replies from an examiner taking almost two years. For the technology industry, where product cycles routinely last only a few months, that's stultifying.
Faced with this overload, the USPTO announced this week that it’s exploring forward-looking partnerships with technology companies, such as IBM, Red Hat, Novell, and Google, to create three evaluation systems, being worked on concurrently, to both increase the quality of software patents and shorten the time it takes the office to either issue or decline a patent.
One project would create a centralized, searchable repository of all open-source code and related documentation in existence. The second would create an indexing system to rank the viability of patent applications. The third would tap into the greater community’s intelligence when reviewing patent applications (something organizations such as Wikipedia have done for years, albeit toward a different goal).
All three of these projects would take advantage of two of the liveliest trends on the Web right now: metatagging and social networking. Metatagging allows individuals to add descriptive terms to online elements, such as photos or files, making them easier for others to find. Social networking sites such as Flickr and Delicious have built businesses, in part, around tagging, and make it easy for people to share their expertise and opinions among designated "friend" groups or the community at large.
Metatagging may come into play with the patent office's search tool and repository, although opening up the repository to even a limited number of people may prove troubling for some. In addition to creating a centralized repository for all open-source code and related materials (diagrams, documentation), the project group is also considering creating a taxonomy so that open-source developers can "label" their code to help patent examiners and other interested parties understand what it is. "The public could use it as well," says Mark Webbink, deputy general counsel for Red Hat, an open-source software vendor. Then, a partner such as Google or IBM could create a search tool that would combine all the data and allow examiners to hunt through the repositories for prior examples – as simply as someone might search for an online recipe.
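As a rough illustration of the idea – the record fields, project names, and taxonomy terms below are invented for this sketch, not the project's actual schema – a tagged repository entry and a naive keyword search over it might look like:

```python
# Hypothetical sketch: open-source code entries labeled with taxonomy
# tags, plus a naive keyword search an examiner might run for prior art.

repository = [
    {
        "project": "example-compression-lib",  # invented entry
        "tags": ["compression", "lossless", "dictionary-coding"],
        "docs": "Implements a dictionary-based lossless compressor.",
    },
    {
        "project": "example-image-viewer",  # invented entry
        "tags": ["graphics", "rendering"],
        "docs": "Displays raster images in a window.",
    },
]

def search(query_terms):
    """Return project names whose tags or docs mention any query term."""
    terms = [t.lower() for t in query_terms]
    hits = []
    for entry in repository:
        haystack = " ".join(entry["tags"]) + " " + entry["docs"].lower()
        if any(t in haystack for t in terms):
            hits.append(entry["project"])
    return hits

print(search(["compression"]))  # → ['example-compression-lib']
```

A production search tool would of course need full-text indexing across many code formats; the point here is only how taxonomy labels make disparate projects findable through one interface.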
“We need a tool that will enable sifting through the code in such a way that’s useful to the patent examiner,” says Manny Schecter, an associate general counsel for IBM. “We should be able to have it done this year.”
The problem, however, is that opening up the system to the public would make it difficult to create a standardized system that would be usable by the patent office. Today, patent examiners must hunt and peck through an almost infinite amount of information to determine prior art for software patents. ("Prior art" is the office's term for a previous example of the item up for patent consideration.) If such an item is discovered, a new patent can't be awarded. Organizations such as SourceForge and the Open Source Development Labs are invaluable resources, but still aren't exhaustive in their collections. "There's a lot within the open source community that's valid prior art, but because of how it's stored, it's not accessible to examiners," says John Doll, commissioner of patents with the USPTO. "We have a hope that if we have a standardized system, we can find it in the future."
By “standardized system,” Doll means a system in which any type of open-source code can be searched using a centralized search tool, regardless of what format the code exists in.
It might seem that creating such a centralized, standardized system would be relatively easy. However, the problem facing software patents, according to Red Hat's Webbink, is their relative newness.
“Unlike the chemical, biological, and mechanical arts, where there are hundreds of years of prior art available, the patent office has only been doing software patents for 20 years and doesn’t have a repository,” says Webbink. “The PTO feels it’s getting criticism for not doing thorough searches, but in fairness, there’s no easy place to go for some of this stuff. The industry needs to help them if they want the situation to improve.”
Earlier this year, Professor Beth Noveck, an intellectual-property lawyer at New York Law School, wrote a blog post about the need for the patent office to take advantage of the collective intelligence online to assist in vetting prior art. What was needed, she wrote, was something like a wiki, where people could contribute their expertise on various matters. “We’re at a critical moment,” said Noveck. “We have the social software available with collaborative filtering, social reputation systems, so that we can do online peer review. There is so much dissatisfaction with the patent process; this is a ripe opportunity to move to peer review.”
Not long after her posting in July, Noveck was contacted by IBM about the idea. The company had been considering something similar. Since then, Noveck has written a draft proposal for the plan and, fittingly, has launched a wiki for people to contribute thoughts on the proposal. The system could work as follows: vetted experts in various fields sign up for RSS feeds and receive alerts whenever a new patent application that falls within their expertise is posted online (such applications are available to the public). The experts could contribute their thoughts to the appropriate examiner on whether or not prior art existed, assisting in the patent process. "What might take an examiner 15 to 20 hours to research and determine might take an expert 15 minutes," says Noveck.
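The alert flow Noveck describes can be sketched in a few lines – the application numbers, titles, and expertise keywords here are invented for illustration, and a real system would pull items from an actual feed:

```python
# Rough sketch of the expert-alert flow: match newly published patent
# applications (represented as feed items) against an expert's stated
# fields of expertise. All data below is invented for illustration.

new_applications = [
    {"id": "US-2006-0000001", "title": "Method for compressing video streams"},
    {"id": "US-2006-0000002", "title": "Apparatus for crop irrigation"},
]

def alerts_for(expertise_keywords):
    """Return IDs of applications whose titles mention the expert's keywords."""
    keywords = [k.lower() for k in expertise_keywords]
    return [
        app["id"]
        for app in new_applications
        if any(k in app["title"].lower() for k in keywords)
    ]

# A video-compression expert would be alerted only to the first filing.
print(alerts_for(["video", "compression"]))  # → ['US-2006-0000001']
```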
Patent office commissioner Doll is excited by the collective intelligence idea, but is concerned that it might run afoul of some of the office’s legal frameworks. “It’s a great idea,” he says. “The problem we have is we have a statute and we’re restrained from opening the examination for anyone other than a patent office examiner.” However, Doll said there might be a window of time in which outside opinions could be heard – after the patent application has been made public, but before the grant is awarded. “Our legal staff is looking at that right now.”
Meanwhile, Noveck is launching a nationwide tour in the early spring to colleges and intellectual-property think tanks to help vet the idea, and the companies involved in the repository and indexing projects are already at work. Interested parties can attend a public meeting on the various proposals at the patent office on February 16.