Sony BMG v. Tenenbaum: Judge Provides Outline of Possible Fair Use Defense for Peer-to-Peer File Sharing

Participants in the P2P world have long hoped that courts would recognize that at least some forms of file sharing constitute fair use. In a recent opinion in the Tenenbaum file-sharing case, Judge Gertner of the District of Massachusetts enumerated several unauthorized uses of copyrighted music that she thinks might constitute fair use — including file sharing under certain limited circumstances. While Judge Gertner’s opinions have limited precedential value, they may point to some pathways to legitimacy for the oft-maligned P2P industry.

Joel Tenenbaum was a sophomore at a small college in Baltimore. Like many other students, he used a number of music file-sharing services, including Kazaa, through which he shared songs with other users. In 2007, he was sued by Sony BMG and other recording companies for copyright infringement for sharing 30 songs. One of the defenses that Tenenbaum raised was fair use — a defense that Judge Gertner rejected in July 2009, shortly before the commencement of trial. As is well known, the trial did not turn out well for Tenenbaum, who was found liable for infringement and assessed statutory damages of $675,000.

On December 7, 2009, Judge Gertner issued a lengthy opinion that provided the full justification for her earlier rejection of Tenenbaum’s fair use defense. See Sony BMG Music Entertainment v. Tenenbaum, D. Mass., Memorandum and Order (Dec. 7, 2009). Judge Gertner began by agreeing with Tenenbaum that the Copyright Act does provide for a fair use defense that applies to all forms of copyrighted works. See 17 U.S.C. § 107. Judge Gertner noted that the fair use defense was developed by judges “who recognized that the monopoly rights protected by copyright were not absolute.” Where a use did not injure the market for the original work, and advanced a public purpose, such as education or artistic innovation, it could be considered “fair” and not infringing.

When Congress codified the fair use doctrine in Section 107, it set out a non-exhaustive list of four factors that a court is required to consider in determining whether a use is fair. These include: (1) the purpose and character of the use, including whether the use is commercial or for nonprofit educational purposes; (2) the nature of the copyrighted work; (3) the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and (4) the effect of the use on the potential market for or value of the copyrighted work. Courts have added several additional factors.

Purpose and character: Judge Gertner found that the key issue was whether the defendant’s use of a work was “accompanied by any public benefit or transformative purpose.” Tenenbaum argued that file-sharing provides a public benefit by increasing access to copyrighted works. Judge Gertner found that this public benefit was not sufficient, since “nearly every unauthorized reproduction or distribution increases access.” For file-sharing to constitute a transformative use, it would need an effect similar to that held fair in the Betamax case — permitting the use of a work in a new way, rather than merely providing users with free copies of works that they could otherwise purchase.

Harvard Study Finds Significant Limits in the Ability of Current Technology Used by Social Networking Sites to Reduce Online Risks to Minors

In a report released on January 14, 2009, the Internet Safety Technical Task Force concluded that the technologies currently being used by digital media companies to address youth safety are “helpful in mitigating some risks to minors online, but none is fail-safe.” The study, which was conducted at the Berkman Center for Internet and Society at Harvard University for the 52 State Attorneys General, reviewed technologies such as age verification and identity authentication, filtering and auditing, text analysis, and biometrics. (fn1) However, it found that these technologies do not even address the most common online threats faced by minors — harassment and bullying. Moreover, while these technologies can be of use against other threats, such as preventing minors’ access to adult content, each can be circumvented.

The Task Force report identified three major categories of threats faced by minors online: (1) sexual solicitation, (2) online harassment and cyber-bullying, and (3) exposure to problematic content. Of these, the Task Force found that bullying and harassment, most often by peers, are the most frequent threats that minors face online. Bullying and harassment include acts designed to embarrass, humiliate or threaten a minor.

While sexual solicitation is a risk, the study found that “the image presented by the media of an older male deceiving and preying on a young child does not paint an accurate picture of the nature of the majority of sexual solicitations.” Rather, most solicitation is between minors, and even in most off-line encounters arranged through the Internet, the minor knows that he is being solicited by an adult. While there is a risk of exposure to unwanted harmful material, “those most likely to be exposed are those seeking it out, such as older male minors.”

Circuits Shift Away from Finding that the Communications Decency Act Provides Broad “Immunity” from Liability for Third-Party Content to Digital Media Providers

Sometimes a shift in label can signal a shift in policy. A recent shift by the Ninth Circuit away from the use of the term “immunity” when describing the effect of the Communications Decency Act appears to signal such a change.

In earlier cases, the Ninth Circuit frequently referred to the Communications Decency Act as providing “immunity” to internet service providers who publish third-party material. See, e.g., Batzel v. Smith, 333 F.3d 1018, 1029 (9th Cir. 2003). Many other circuits followed this characterization. The exception was the Seventh Circuit, which pointed out that the operative language, in 47 U.S.C. § 230(c)(1), did not use the word “immunity,” but merely provided an exclusion from liability by means of a definition — by defining an internet service provider as not a “publisher or speaker” in certain contexts. Doe v. GTE Corp., 347 F.3d 655, 660 (7th Cir. 2003).

The Seventh Circuit’s approach is not to assume that an internet service provider (ISP) receives blanket immunity for third-party content, but to ask whether the suit in question is seeking to treat the ISP as a publisher or speaker. See Chicago Lawyers’ Comm. for Civil Rights Under Law v. Craigslist, Inc., 519 F.3d 666, 670-71 (7th Cir. 2008). If the theory of liability is something other than that the ISP is publishing or speaking the words in question, liability may be imposed. For example, the Seventh Circuit stated that Section 230(c)(1) would not “help people steal music or other material in copyright.” Id. at 670. (Fn1) The Communications Decency Act would not protect such activities as aiding, abetting, inducing or encouraging, or conspiracy with, a third party to place illegal content on a site. Id. at 671-72.

In earlier decisions, the Ninth Circuit was not averse to denying Communications Decency Act immunity to internet service providers. However, it generally did so by finding that the ISP was itself a co-provider of the illegal content. This was the approach in Batzel v. Smith and Fair Housing Council v. Roommates.com. (Fn2)

While the Ninth Circuit probably has not abandoned this approach, in Barnes v. Yahoo it has now also adopted the Seventh Circuit’s “definitional” method for analyzing the scope of the Communications Decency Act. Barnes v. Yahoo, 2009 WL 1232367, at *3-4. This enabled the Ninth Circuit, in Barnes, to deny CDA protection for third-party content, because it was able to characterize the cause of action as something other than holding an ISP liable for speaking or publishing third-party content — in this case, breaking a promise regarding third-party content.

Looking at typical cases in which the Communications Decency Act has been applied (defamation, fraud, obscenity, assault/harassment), the “definitional” approach to the scope of the CDA would seem to move the debate to determining the kinds of acts by an ISP that rise to the level of encouraging illegal behavior. Given the fact-intensive nature of this determination, the outcome of many cases currently in the works should be interesting.

Communications Decency Act Update: A CDA Defense Can Be Raised in a Rule 12(b)(6) Motion to Dismiss

Two recent decisions have eliminated questions about a defendant’s ability to use the Communications Decency Act (CDA) to obtain a quick dismissal of a lawsuit. The federal rules permit a defendant, under certain circumstances, to obtain an immediate dismissal of a lawsuit without ever being required to file an “answer” to the complaint, make any disclosures, or engage in any discovery. Winning such a “motion to dismiss” cuts a lawsuit off at the knees, immediately eliminating the costs and risks associated with the suit.
One of the bases on which a motion to dismiss can be brought is “failure to state a claim upon which relief can be granted” — a motion under Rule 12(b)(6) of the Federal Rules of Civil Procedure. In general, a Rule 12(b)(6) motion can succeed only if the complaint is so defective that the plaintiff’s allegations against the defendant, even if true, would not qualify for any form of relief from the court. For example, a complaint for common-law fraud would be dismissed on a Rule 12(b)(6) motion if it failed to allege that the defendant made a false statement that the plaintiff actually relied on — because to recover damages for a false statement made by a defendant, the plaintiff must have actually relied on that false statement.
Internet service providers have often used Rule 12(b)(6) to obtain dismissal of suits brought against them for their publication of third-party material by successfully asserting that the Communications Decency Act (47 U.S.C. § 230) barred the claim. However, a recent ruling from the Ninth Circuit threatened to overturn this practice. In a May 7, 2009 opinion in Barnes v. Yahoo!, Inc., __ F.3d ___, 2009 WL 1232367 (9th Cir. 2009), the Ninth Circuit stated that “section 230(c) provides an affirmative defense” and that “[t]he assertion of an affirmative defense does not mean that the plaintiff has failed to state a claim, and therefore does not by itself justify dismissal under Rule 12(b)(6).” The proper procedure, according to the opinion, was for the defendant Yahoo to have filed an answer asserting its CDA defense, and then to have filed a motion under Federal Rule of Civil Procedure 12(c) — a motion for judgment on the pleadings.
A Rule 12(c) motion can’t be filed until all the pleadings are “settled” — i.e., after the complaint and all answers have been filed, and all Rule 12(b) motions resolved. This might not occur until many months after a suit is filed. Following the procedure suggested by the Ninth Circuit would have forced Yahoo to start making unwanted disclosures in its answer and possibly under federal automatic disclosure and discovery rules, and to have continued to burn through cash defending the suit.
When I first read this portion of the Ninth Circuit opinion in Barnes v. Yahoo, it struck me as a little odd. Every litigator knows that courts don’t like to waste time on obviously meritless suits and that courts often will grant a motion to dismiss if the plaintiff’s allegations reveal the presence of an affirmative defense that would bar the case from proceeding. The most common example is where the allegations in the complaint show that the claim is barred by the statute of limitations. I have participated in successfully bringing several such motions.