User-generated content is wonderful stuff. Massive fortunes have been built on its back, as the Facebooks and Twitters of the world demonstrate. With the good comes the annoying, which in this case means devising a method for handling the massive amount of content users can upload to a site. Many businesses rely on moderators as a solution. If you do, it may be time to reconsider, given a recent appellate ruling suggesting that the use of moderators may forfeit the safe harbor immunity provided under the DMCA.
“Direction Of A User”
To understand the case in question, we must first look at the actual language of the DMCA. The critical section reads as follows:
512(c) Information Residing on Systems or Networks At Direction of Users. –
(1) In general. A service provider shall not be liable for monetary relief, or, except as provided in subsection (j), for injunctive or other equitable relief, for infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider, if the service provider… [Emphasis Added.]
Historically, courts have interpreted “at the direction of a user” liberally. So long as the user initiated the action, it did not matter whether a site or app used moderators to screen the content. To the surprise of nearly everyone, a ruling out of the Ninth Circuit may change this standard.
The case in question is Mavrix Photographs, LLC v. LiveJournal, Inc. Mavrix sued LiveJournal for copyright infringement over 20 photographs a user uploaded to LiveJournal. LiveJournal argued it was protected by the safe harbor provisions of the DMCA since the user had initiated the upload of the images. The trial court judge agreed and dismissed the lawsuit against LiveJournal.
Mavrix appealed, arguing the DMCA safe harbor provisions did not apply because LiveJournal used moderators to review the content and decide which images would be posted to the site. On that view, it was the moderators, not the users, who were “directing” which material appeared on the site. Mavrix further argued that the moderators were agents of LiveJournal, which meant LiveJournal itself was directing the upload of content from a legal perspective.
A three-judge panel of the Ninth Circuit agreed.
In its ruling, the court did not find that the use of moderators made LiveJournal liable per se. Instead, the court held that where the screening process is not automated, the trial court must consider the extent to which the moderators had the ability to control what was and was not shown on the site. If Mavrix can show the moderators were heavily involved, the DMCA protections will not apply.
Interestingly, the Mavrix court is not the first to address the moderator issue recently. In BWP Media USA Inc. v. Clarity Digital Group, LLC, the Tenth Circuit held that “direction of a user” refers to the person who initiated the upload, and that the presence of moderators was not relevant to the analysis. This decision was presented to the appellate court in Mavrix, but the panel rejected it out of hand, leaving us with a split between the Ninth and Tenth Circuits.
Our Agent Service
Does the Mavrix case impact our service? In a word: no. We do not “direct” or moderate the content uploaded to your site or app. We only receive copyright complaints on your behalf, so our service does not place your safe harbor immunity at risk under Mavrix Photographs, LLC v. LiveJournal, Inc.
Do you use moderators, employees, or third parties to screen the content users upload to your site or app? Make sure to consult with legal counsel to determine the impact on the immunity granted to you under the DMCA.