First Generation Filtering: Milestones and Millstones

Progress Snapshot
Release 3.14 November 2007

by Solveig Singleton *

Google has now initiated a beta test of a filter designed to identify copyrighted videos posted to YouTube. Also, a group of content companies and distributors of user-generated content, including CBS Corp., Dailymotion, Fox Entertainment Group, Microsoft Corp., MySpace, NBC Universal, Veoh Networks Inc., Viacom Inc. and The Walt Disney Company, announced a set of "User Generated Content Principles" ("UGC Principles") to resolve problems with infringing material on user-generated content sites. The companies' hope is that negotiated business solutions can avoid clumsy or expensive litigated and legislated approaches. These business developments are law and policy landmarks of sorts, though they leave many questions unanswered.

The first question that naturally arises is how Google's implementation of filters will affect Viacom's suit against YouTube. Several cases in the Napster/Grokster line establish that Google can avoid liability by effective filtering. Most recently, the court hearing the suit against StreamCast and Morpheus emphasized that the filtering must be effective, even the "most effective means available" to prevent infringement, regardless of cost. Thus, if the filtering system works well enough, Google's action could in theory satisfy Viacom and relieve Google of liability going forward. This makes it more likely that this aspect of the suit will settle.

But copyright owners, including Viacom, are unlikely to be completely satisfied with the filters going forward. Other plaintiffs are unhappy now, reportedly including Bob Tur, the photojournalist who filed the first copyright suit against YouTube. The system works by comparing works being posted with copies of copyrighted content provided by copyright owners. The duty still lies with copyright owners to make copies of the content available for comparison.

Many copyright owners object strenuously. If no more is required of YouTube or similar sites as a legal matter, it amounts to asking copyright owners to effectively register their copyrights again and again, worldwide, with YouTube and every similar site. Even large entities with a worldwide presence might find this hard going, and smaller entities or individual owners would be in even bigger trouble. Fingerprinting technology that refers distribution sites to a single database of fingerprinted content would come closer to satisfying their needs, as sketched below. Ultimately, copyright owners are likely to expect sites to use more efficient and effective filters.
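To make the contrast concrete, here is a minimal sketch, in Python, of the single-registry approach, assuming a hypothetical fingerprint() function and FingerprintRegistry class; a real system would use perceptual fingerprints rather than exact hashes, and nothing here reflects Google's or anyone else's actual implementation. The point is only that owners register a reference copy once, and any number of sites can check uploads against the shared database.

```python
# Illustrative sketch only: a shared fingerprint registry that many
# distribution sites could query, instead of each site holding its own
# copies of every owner's reference material.
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Stand-in for a real perceptual fingerprint; here, just a content hash."""
    return hashlib.sha256(media_bytes).hexdigest()

class FingerprintRegistry:
    """Single database of reference fingerprints supplied by copyright owners."""
    def __init__(self) -> None:
        self._owners_by_fp: dict[str, str] = {}

    def register(self, owner: str, reference_media: bytes) -> None:
        # The owner registers a reference copy once, with one registry.
        self._owners_by_fp[fingerprint(reference_media)] = owner

    def match(self, uploaded_media: bytes) -> str | None:
        # A distribution site checks each upload against the registry.
        return self._owners_by_fp.get(fingerprint(uploaded_media))

registry = FingerprintRegistry()
registry.register("ExampleOwner", b"...reference copy of a protected clip...")
print(registry.match(b"...reference copy of a protected clip..."))  # -> "ExampleOwner"
print(registry.match(b"...unrelated home video..."))                # -> None
```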

Furthermore, Google's use of filters now does not settle the question of whether and what damages are owed Viacom and other plaintiffs for past infringement. If the latter aspect of the case goes to trial, "whether" is more important than "what." Google's argument has been that YouTube is entitled to the same safe harbor as an ISP under the DMCA so long as it complies with notice and takedown. Since YouTube has not filtered until now, Google must continue to rely on the safe harbor argument to avoid liability for past damages. Should Google bet the (server) farm on such a defense? Given the wording of Section 512(c), the application of the safe harbor to YouTube seems unlikely, as it suggests that the harbor is not available when infringing material is apparent:

 (c) Information Residing on Systems or Networks At Direction of Users.—

(1) In general.— A service provider shall not be liable for monetary relief, or, except as provided in subsection (j), for injunctive or other equitable relief, for infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider, if the service provider—

(A) (i) does not have actual knowledge that the material or an activity using the material on the system or network is infringing;

(ii) in the absence of such actual knowledge, is not aware of facts or circumstances from which infringing activity is apparent; or …

In the larger picture, the implementation of Google's filters makes it harder for any site that specializes in creating a platform for distribution of certain content to argue that large-scale filtering is too economically burdensome or technically difficult. And the going will be rougher for less specialized platforms making this argument as well, including ISPs. This is likely to have its greatest immediate impact on the Hill, in Congressional debates about the duty and power of universities to block students from downloading and uploading copyrighted content.

And it has repercussions for online service providers generally. In hearings in the spring of 2007, some Congressmen suggested revisiting the question of the duties and definition of ISPs and OSPs more generally in light of these and other business developments. Filtering will get cheaper and more effective at the same time as worldwide broadband usage ramps up. The sheer volume of posts already reduces the efficacy of notice-and-takedown, and will do so more obviously as time goes on; DMCA Section 512(i) seems to anticipate the deployment of more sophisticated methods, defining "standard technical measures" that an OSP must accommodate to retain its entitlement to the safe harbor defenses. Filtering seems like at least a partial answer to the question of which obligations are good candidates to replace notice-and-takedown.

It is certainly not, however, a final answer. The courts will continue to grapple with context-specific questions of who qualifies for the safe harbor, how effective filtering has to be, and the next question: how to approach methods of defeating the filters. Anxiety about privacy and free speech will emerge; some of it will be based on strange misconceptions about one's right to do wrong undisturbed by the other private interests affected, but not all of it.

Also, policymakers will continue to grapple with the question of whether and when provision must be made for fair use. The adherents to the UGC Principles promise to try to do so; several organizations, including EFF, have proposed further guidelines for accommodating fair use within filtering, such as the rule of thumb that files consisting of less than 80 percent of a protected work would be removed only after infringement is confirmed by human review. In the United States, there is no general legal obligation for a filtering system to recognize fair use; in any case, it is hard to imagine any filtering system sophisticated enough to distinguish a fair use from an infringing use automatically. Legislating such a requirement would thus pull the rug out from under filtering and take content markets back to unfiltered square one, leaving consumers with nothing to stand between them and the occasional lawsuit.
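As a rough illustration of how such a rule of thumb might operate, the sketch below assumes a hypothetical matched_fraction value already computed by a filter (the share of an upload that matches a protected work); the 80 percent threshold is the proposed guideline described above, not a standard any site has adopted, and the decision labels are invented for illustration.

```python
# Illustrative decision rule only, following the proposed 80 percent guideline:
# uploads consisting almost entirely of a protected work are removed
# automatically; partial matches are held for human review rather than
# removed by the filter alone; non-matches are allowed.

def filtering_decision(matched_fraction: float) -> str:
    """Decide how to treat an upload given the fraction of it that matches a protected work."""
    if matched_fraction >= 0.80:
        return "remove automatically"
    if matched_fraction > 0.0:
        return "hold for human review before removal"
    return "allow"

print(filtering_decision(0.95))  # remove automatically
print(filtering_decision(0.30))  # hold for human review before removal
print(filtering_decision(0.0))   # allow
```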

The most promising development would be for companies that employ filters to simultaneously roll out user-friendly licensing mechanisms that permit users to license uses that would otherwise be filtered. At the same time, dispute resolution procedures can be established so items wrongly caught in a filtering net can be freed (a piece of the puzzle notably missing from eBay's comparable infringement take-down process). The UGC Principles emphasize this two-pronged approach, currently the most likely solution to the fair use problem. It has the advantage of keeping unwary consumers out of the courts while helping them understand what legitimate uses their service provider does support; competition will help drive service providers to widen the scope of supported legal uses closer to the full breadth of what copyright allows, excepting uses for which there is negligible demand.

 A mature business consensus on this issue would both inform the courts and avoid Congressional tinkering. Filtering is a significant step towards a system for "enforcing" copyright that fits in a digital environment, unlike the slow, expensive see-you-in-court model.


*Solveig Singleton is a lawyer and senior adjunct fellow with The Progress and Freedom Foundation. The views expressed here are her own, and are not necessarily the views of the PFF board, fellows or staff.

 

 
