Making Elsevier look good

For many years, Dutch publishing giant Elsevier has been a kind of bête noire for academic librarians, serving as the principal whipping post for the exorbitant price increases that have been strangling the scholarly communications system for over 20 years. But the ground has shifted somewhat, and we have recently observed some academic presses and scholarly societies – agencies whose mission is, putatively, to serve research and scholarship – adopt policies that make Elsevier look almost scholar-friendly.

We have recently witnessed the unseemly spectacle of two at least nominally university-related presses suing a university to try to narrow the scope of fair use for academics, calling out by name some of the very authors upon whom they depend for the content that fills the pages of their publications. Now another organization that is supposed to represent scholars, the American Psychological Association, has turned to bite the hand that feeds it.

First there were the threats to sue a major American university library for allegedly using too many examples from the “APA Manual of Style” in the teaching materials it creates to help students learn how to use that citation format. Since continued sales of the Manual depend on students being trained to use it and faculty assigning it, and since there are other nearly identical and completely substitutable style formats available, it is hard to see what these threats could hope to accomplish. Shutting down one’s principal market is a radical and unproductive way to protect one’s copyright.

Now comes the news that the APA is announcing that authors publishing articles in its journals that are based on NIH-funded research “should NOT” deposit their own works in PubMed Central as is now required by law. Rather, they will be required to pay APA $2500 so that the articles can be deposited by the publisher. Since there is virtually no cost associated with the mechanics of deposit itself, and the NIH policy allows an embargo on public availability of articles of up to one year in order to protect the traditional subscription market, it is hard to see what this policy is intended to accomplish other than to force an additional income stream out of the faculty authors who already provide the APA with free content. And there is heavy irony in the APA’s assertion that they can do this “as the copyright holder.”

APA is trying to put its own authors between the proverbial rock and a hard place, and it is behaving as if theirs is a non-competitive market. This is not, in fact, the case – only two of the top ten psychology journals in 2007, based on impact factor, were published by the APA, and one non-APA journal editor expressed pleased surprise at the new policy because it was sure to benefit those other journals. But for years our faculties have behaved as if they were, indeed, captive to specific journals. As scholarly societies are driven, apparently by fear and anger more than a realistic business strategy, to treat the authors on whom they depend with such contempt, one can only hope that this misperception will begin to change.

Two simple and specific messages need to be delivered over and over to our faculty authors if this dysfunctional and abusive system is to change.

First, they need to be reminded that they do have choices about where they publish their work; there is no logic in remaining loyal to a particular journal when the publisher of that title has clearly decided to place profit and self-interest above the well-being of the academy, the discipline, or its scholarly authors.

Second, regardless of where they publish their research, scholars should resist transferring copyright to journal publishers. APA can only tell scholarly authors what they can and cannot do with their work after it has received a transfer of copyright; up to that point the APA must negotiate, not dictate. Academic presses can only sue universities over e-reserves because they have been given the copyright in those scholarly works in the first place. To cut the Gordian knot that is plaguing our scholarly communications system, we need to make an exclusive right to publish for a limited time (with reservation of some negotiable authors’ rights within that period) the standard for scholarly publishing agreements. As the original owners of copyright, faculty authors have the power to force that change.

NOTE — Half an hour after this post was published, the APA web page referenced above no longer carries the policy announcement and says simply that the page is under review. We shall have to wait and see what APA comes up with, but the two cardinal points mentioned herein remain valid and urgent.

What can best practices do for us?

As promised, I want to look at a different kind of “new tool” to help users of copyright-protected content figure out what they can and cannot do as they work on new creations.

Best practices are a relatively new phenomenon in the copyright environment. The Center for Social Media at American University, a joint project of the School of Communication and the Washington College of Law, has really led the way in creating statements of best practices around fair use in video production. The first one, produced in cooperation with several documentary film groups, is a Documentary Filmmakers’ Statement on Best Practices in Fair Use. That statement has proved very successful in gaining recognition both amongst filmmakers and from ancillary organizations like the insurance companies that support and underwrite documentary film projects.

Next there was a report on user-generated video called Recut, Reframe, Recycle that spelled out six creative practices that, the report’s authors felt, were potentially legal but were in danger of being curtailed by the draconian measures being sought by many in the content industry to combat online sharing of video and music files. Even though creative remixing is a very different activity, both legally and in its value to society as a whole, much of the “anti-piracy” rhetoric seems unable to make even the grossest distinctions. Thus the stakeholders in that conversation felt the need to articulate another set of best practices, released last week.

The Code of Best Practices in Fair Use for Online Video is intended to provide support for the activities of filmmakers who create works like “Dramatic Chipmunk,” which is used to illustrate the report’s cover. Such works are new creations built from the building blocks of other people’s work. This, of course, was the original purpose for the “copyright bargain” Congress was empowered to make by the Constitution (although the Framers probably did not foresee some of the results of that bargain!). This new code of best practices describes itself this way: “This is a guide to current acceptable practices, drawing on the actual activities of creators, as discussed among other places in the study Recut, Reframe, Recycle: Quoting Copyrighted Material in User-Generated Video and backed by the judgment of a national panel of experts. It also draws, by way of analogy, upon the professional judgment and experience of documentary filmmakers, whose own code of best practices has been recognized throughout the film and television businesses.”

For me, an immediate question is how these statements of best practices differ from the various attempts to articulate guidelines to define fair use, attempts that have caused great anxiety and a notable “chilling effect” on fair use despite the best intentions of those who promulgated them. The quickest answer is that best practices are usually generated from within an industry or an industry segment, whereas guidelines have traditionally been negotiated between users and rights holders. “Best practices” are not an attempt to define a “safe harbor” that will necessarily protect one from a lawsuit, especially since many such attempts have proved illusory in the past. Rather, their aim is to accurately describe a consensus within a particular user group about what is and is not acceptable. Such a consensus can serve a couple of purposes.

First, it can help prevent the kind of “self-censorship,” or chilling effect, that is all too familiar among users: the decision by a filmmaker to forgo the best shot or abandon good footage because a copyrighted work was accidentally captured in some of the frames, for example. Best practices can provide reassurance to that filmmaker that what she hopes to do is well within standard practice throughout her industry.

Second, best practices could provide courts with exactly the kind of “industry standard” that is useful in determining when to find infringement or to protect a particular use as fair use. These documents can provide courts with a synoptic view of what kinds of practices are necessary for professional filmmakers and amateur videographers alike to create new works. By spelling out what kinds of practice are needed, as far as fair use is concerned, for creativity to flourish, statements of best practice can show courts that the particular industry is acting in good faith and can provide a broader perspective on the specific issue that has come before that court.

Best practices will not solve all the problems in the highly contested world of copyright and user rights, but they can serve a useful purpose. It is important to distinguish that purpose from the more grandiose and unrealistic claims made for copyright guidelines. Best practices may not stave off lawsuits, but they can help courts judge those lawsuits fairly and they can help users avoid letting the fear of a lawsuit overwhelm their urge to create.

Note — after completing the above post I discovered this contribution to a debate about best practices, which I now call to the attention of interested readers.

New tools for recording copyrights

Several new tools have recently become available to make copyright record keeping and searching somewhat easier, although it still is not what could be called simple. Perhaps more importantly, another set of “best practices” in fair use has been issued by The Center for Social Media at American University, which offers the opportunity to comment on what these statements of best practices are and what they hope to accomplish.

The first new tool worth noting is from the Copyright Office itself — a new ingestion system that permits, for the first time, online registration of copyrights. It is hard to believe that this is the first time the Copyright Office has stepped away from paper forms, but that is the case. Starting July 1 it is possible to submit an online registration form and pay a fee that is $10 lower than the standard $45 cost of registration. The deposit requirement, which mandates that copies of a registered work be sent to the Library of Congress, will still have to be fulfilled by U.S. mail. It is also possible to track the status of a registration process that is done online. In addition to the online system, there is also a new paper form which uses barcodes to speed processing; the applicant fills out the form online, prints it off and mails it with the regular fee, but it does not take as long, in theory, for the Copyright Office to process. Since registration is still necessary before a copyright holder can file suit for infringement, a quicker registration system should help speed the judicial process a little. It will also make it easier to find copyright owners for works that are relatively new or newly registered.

Searching for copyright owners will become much more urgent if any version of the Orphan Works legislation pending before Congress is actually enacted, so copyright renewal records are as important as, if not more important than, initial registrations. For new works, there is no doubt that copyright protection is in force unless there is some form of waiver like a Creative Commons license. But for those works most likely to be orphaned — works published between 1923 and 1963 — it will be vital to know whether a copyright was renewed and, if so, by whom. Stanford University has offered a database of copyright renewal records for some time, and now there is a single XML file of both renewal records and original registration records from 1978 onwards available from Google. The digitization of these records required the efforts of several dedicated organizations, including Carnegie Mellon’s Universal Library Project and Project Gutenberg.
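For the curious, a release like this is exactly the kind of raw material a “copyright geek” can work with directly. The sketch below shows one way a simple keyword search over such an XML file might look; note that the element and attribute names (`renewal`, `regnum`, `title`, `claimant`) are invented here for illustration and are not the actual schema of the released records.

```python
# A minimal sketch of filtering copyright renewal records distributed as XML.
# The schema shown here is hypothetical; a real script would use the element
# names actually found in the released file.
import xml.etree.ElementTree as ET

SAMPLE = """<records>
  <renewal regnum="A123456" rendate="1951-04-02">
    <title>The Example Novel</title>
    <claimant>Jane Doe</claimant>
  </renewal>
  <renewal regnum="A654321" rendate="1953-09-15">
    <title>Another Book Entirely</title>
    <claimant>John Roe</claimant>
  </renewal>
</records>"""

def find_renewals(xml_text, title_keyword):
    """Return (registration number, title, claimant) tuples for every
    renewal record whose title contains the keyword, case-insensitively."""
    root = ET.fromstring(xml_text)
    hits = []
    for rec in root.iter("renewal"):
        title = rec.findtext("title", default="")
        if title_keyword.lower() in title.lower():
            hits.append((rec.get("regnum"), title, rec.findtext("claimant")))
    return hits

if __name__ == "__main__":
    for regnum, title, claimant in find_renewals(SAMPLE, "example"):
        print(f"{regnum}: {title} (renewed by {claimant})")
```

Even a toy search like this illustrates the crucial caveat discussed below: a match tells you a renewal was filed, but the absence of a match can never prove that no renewal exists.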

Once this XML file became available, it did not take long for some copyright geeks (no offense intended; I am one myself) to design a simple interface to search these records. This site, designed by a law student at Tulane University under the direction of Professor Elizabeth Townsend Gard, should make it much easier to examine the Copyright Office records, and they are promising a more sophisticated tool by Fall. Whether or not we actually get orphan works legislation, it remains very difficult to find rights holders for lots of different kinds of works, and we must be grateful to all of the folks who have created these tools to make that important task a little bit easier. All of the sites, however, come with the warning that a search of these records can never establish with certainty that a copyright was NOT registered or renewed; while they will tell us who did file for registration or renewal, it will remain something of a risk to use a work for which one does not find a record in these databases. That is why orphan works legislation is needed: so that a user who makes the effort to search these records and cannot, in good faith, find a rights holder faces a much lower risk than one who uses a work without any attempt to find out whether copyright persists and who holds it.

See tomorrow’s post for discussion of a different kind of new tool — a statement of best practices for fair use in online video.

How “real” is intellectual property?

Toward the end of a session on copyright at the American Library Association’s annual conference last week, Carrie Russell, who is the Director of the ALA’s Program on Public Access to Information, exhorted the audience never to speak about copyright “ownership.” “Rights holders,” she said, do not own anything at all; holding IP rights is not the same as owning “real” property.

Based on the common understanding of ownership, it is easy to see what Carrie is getting at here, and to agree with it. There are fundamental differences between real property and intangible intellectual property. The most obvious is that borrowing or sharing intellectual property does not diminish the supply of it. And a rights holder loses his or her rights after a set period of time, the period set by the statutory grant of those rights. This makes it very clear that intellectual property rights are indeed a creature of law, created by legislative action and not by natural right.

But in truth, all property ownership, at least in the 500-year-old Anglo-American tradition, is similarly limited. It is a truism of property courses in law school that owning a piece of land means holding a bundle of rights, most importantly the right to exclude others from the property. But once real property ownership (as well as ownership of “chattel”) is seen as a bundle of rights (just as copyright is), the distinction between real and intellectual property seems less clear and telling. In a recent blog post about the Israeli copyright scholar Orit Fischman Afori, William Patry has occasion to quote the British philosopher Jeremy Bentham on this topic: “there is no such thing as natural property; it is entirely a creature of the law. … Property and law were born together, and would die together. Before the laws property did not exist; take away the laws, and property will be no more.” If real property is subject to the same limitations as intellectual property — each is a limited set of rights granted by statute rather than a permanent and uncompromisable outgrowth of natural law — it is interesting to ask what the real consequences of the analogy between owning IP and owning a car or a piece of land might be.

This analogy, of course, is a favorite of copyright “maximalists” who frequently complain, for example, that car thieves get thrown in jail while “pirates” of copyrighted music must be sued individually and at great cost to the rights holder. Many would like to view ownership of IP as a kind of “allodial,” or absolute, ownership, and would be surprised to learn that no ownership under our system of law derived from feudalism is actually so absolute. All ownership is subject to limitations imposed by law to achieve a fair balance between exclusive possession and socially beneficial use. If maximalists got their way and IP ownership was really treated just like owning real estate, they might regret what they wished for.

Real property ownership is, after all, subject to lots of limitations. Zoning laws, for example, place strict limits on the use of particular parcels of land; I cannot open a law office in my garage in the neighborhood in which I live because it is zoned for residential use only. Not really very different from all those restrictions on the exercise of copyright found in sections 107-122 of Title 17. And in the world of both real and personal property, the “doctrine of first sale” is virtually absolute; the law looks very suspiciously on any attempt to restrict the “free alienability” of land and often will not enforce such restrictions. IP owners who have recently tried to attack first sale in several court cases would not benefit much if the analogy with real property were strictly applied. Finally, property rights in the bundle that landowners get can be lost if they are not exercised. Under the doctrine of adverse possession, if I occupy a piece of land for a set period of time — 15 years in many states — and the owner makes no attempt to eject me, I will become the new owner of that land. Imagine how our orphan works problem would diminish if we applied that same principle to copyrights. On this score, copyright owners, whose rights persist for life plus 70 years whether they exercise them or not, are much better off than those who own land. A copyright holder can choose to enforce his or her exclusive rights in one case, then ignore other infringements for many years before electing to enforce them again; a landowner does not have that luxury.

The relationship between real property ownership and the same concept regarding intellectual property is complex, but both are bundles of rights that are subject to many limitations and exceptions in statute and in common law. Neither copyright maximalists nor those who advocate for more limited IP rights have the argument all their own way when the analogy with land is invoked, but especially for the copyright owner who asserts that his or her rights should be treated just the way real property ownership is treated, the message is “be careful what you wish for.”

Shaking the money tree

In a talk given at Cornell University last week, Steve Worona of EDUCAUSE said about business models for distributing intellectual property that “every few years the entertainment industry has to be dragged kicking and screaming to the money tree and have it shaken for them.” His point that the first reaction of entertainment company executives is to “tamp down” new technologies in order to protect outdated business models is certainly borne out by recent history. Back in the 1980s, of course, the industry fought hard against the growing use of home video recorders, both in the Supreme Court and in Congress, even as a new business model that would eventually make billions for the studios was being developed in spite of their opposition. No less an advocate for the old ways than Jack Valenti eventually realized that the movie industry lost that battle because it was perceived as anti-consumer. Nevertheless, the recording industry continues to make the same mistake, even going so far as to sue the very consumers on whom it relies.

Are there alternatives? Worona’s talk is very persuasive in its discussion of why old models (based on counting copies) do not work for new technologies (which replicate bits) and how it is possible to develop new models that really can “compete with free.” I have written about such models before, and also noted in a post last week this article by Tim Lee about an alternative path for copyright law that could support such new ways of profiting from intellectual property without crippling technological innovation. Some of those alternatives deserve further discussion (and a lively discussion is continuing on the Cato Unbound site).

First, it is worth noting the survey, reported by Ars Technica, that suggests that young people are willing to pay for music if it is offered on terms that seem reasonable to them. Although I can imagine the skepticism this will generate within the content industries, it at least suggests that innovations, rather than lawsuits, are worth a try; both may be risky, but the rewards will be greater from the former.

Lee’s article briefly catalogs a variety of business models, in several different content industries, that rely on new ways to make a profit. One that caught my eye was the Web service called Imeem, which combines a legal music downloading service with social networking opportunities. Revenue is generated through advertising, and the music is licensed using revenue-sharing agreements with the four major record labels. Users can create and share playlists and download music from those shared lists for free. As Lee says, “It is only a slight exaggeration to say the Imeem deal amounted to a de facto legalization of online file sharing, provided that the labels get a cut of any associated revenues.” Is this the future of the music business? I don’t know for sure, but I do know that I, as a music lover who has never obtained a music file from any online source other than iTunes, will now be looking to Imeem first; legal, ad-supported free music certainly works for me.

In his talk at Cornell, Worona suggested that, when a business learns that it will have to compete with free – with someone offering the same or substitutable product at no cost – the appropriate response is not to call the FBI, as the recording industry has done, but to call its own marketing departments. That is what Imeem has done, and they are giving the money tree yet another shake; let’s hope the music industry is paying attention this time.

Bad strategy and poor reporting

It is hardly surprising that the recent effort by the Associated Press to stop bloggers from quoting news articles, even when they link to the source on AP’s own web site, has generated lots of comment in the blogging world. AP recently sent takedown notices, using the procedures outlined in the Digital Millennium Copyright Act, to try to have blog posts that quoted as little as 35 words of an AP story removed from the Internet. There has been enough coverage that it seems unnecessary to rehearse all the commentary; there is a story at Ars Technica here, and one from the Electronic Frontier Foundation here. Basically, most of the coverage makes the same two, fairly obvious, points: this is a terrible strategy from a public relations point of view (as even AP now admits), and it represents an interpretation of fair use that would entirely eviscerate that vital exception if accepted by the courts.

What does deserve extended comment, however, is one of the news stories, which repeats a couple of common misconceptions that need to be dispelled. This report on the E-Commerce Times site offers the opportunity to clarify and correct two important errors about the DMCA and fair use.

First, the E-Commerce story quotes a source who refers repeatedly, and definitively, to “this ruling.” This is probably just careless language, but it also reinforces the mistaken notion that receipt of a DMCA takedown notice means that infringement has definitely taken place. In fact, a rights holder sends a takedown notice, using very specific provisions that the DMCA added to chapter 5 of the copyright act (17 U.S.C. 512), merely because it believes that its copyright is being infringed. There is no required quantum of evidence beyond a “good faith belief that use of the material… is not authorized,” nor must a rights holder consider possible defenses to the claimed infringement. These provisions were never intended to substitute for a judicial determination on the question of infringement; they are intended, instead, to help the ISP avoid liability for any possible infringement by users of the service. The ISP does have to remove the material or block the user upon receipt of a takedown notice, but it also must notify the user of the action and restore the material if the user sends a counter-notice stating his or her own good faith belief that the removal was wrongful. Thus the notice-and-takedown process helps establish whether there really is a conflict and gives the ISP a protected role when there is, but it leaves the resolution of the issue of infringement up to a court. The mere fact that the AP sent these initial notices is in no way any sort of “ruling” or definitive decision.

The second error in the E-Commerce story is its reference to “the fair use provisions of the Digital Millennium Copyright Act,” which, we are told, the AP hopes to clarify. There is, of course, no fair use provision in the DMCA; fair use is much older than that piece of relatively recent legislation. Indeed, fair use is a doctrine initially created by judges in the early 19th century (in the US) to mitigate the harmful effects of the copyright monopoly. The DMCA, enacted in 1998, does not add anything to the fair use analysis, nor does it, in theory, narrow its scope; where fair use is mentioned in the DMCA, it is only to emphasize that Congress did not intend the provisions of the DMCA, which attempt to deal with some of the new issues arising in a digital environment, to alter the applicability of fair use.

This last point is important, because it reminds us that we are not dealing with any new provision about what uses are acceptable in the digital realm. Instead, the same old provision about fair use (17 U.S.C. 107), which emphasizes the privileged status of news reporting and has traditionally been held to protect short quotations, would be applied in deciding whether or not these passages from AP news stories were used by bloggers in a manner authorized by law. The assertions by AP that these uses are not fair use seem difficult to credit, but the point is that a court would have to decide the issue (if the AP decided to push that far; it is a much more costly and serious step than merely sending a takedown notice), and the standard used to make that decision would be the familiar four factors of fair use, just as they were outlined by Justice Story in 1841.

Everything old is new again?

Some intellectual property issues are hardy perennials; they bloom anew with great regularity. One such issue is the doctrine of first sale, which in other countries and other contexts is sometimes called the doctrine of exhaustion. However it is named, it refers to the nearly universal practice of holding that the “first sale” of a particular embodiment of intellectual property – a copy of a book or a CD – “exhausts” the exclusive right of the copyright holder to control further distribution of that embodiment. It is the right of first sale that allows used book stores, video rentals and lending libraries to flourish.

First sale has never been popular with the content industry; both licensing arrangements and DRM can be seen as modern attempts to exercise control over the downstream use and distribution of IP beyond what is allowed by copyright law. Back at the beginning of the 20th century, in fact, the Supreme Court had to deal with a case involving what I like to call the first “end user licensing agreement.” In Bobbs-Merrill Co. v. Straus (1908), the Court found that an attempt by a publisher to mandate the retail price at which stores could sell the book “The Castaway” by Hallie Rives failed because of the doctrine we now call first sale. The publisher of this obscure novel inserted a “requirement” underneath the copyright notice that the retail price of the book must not be less than one dollar, and sued the store owned by Isidor Straus – Macy’s – when it sold copies for less.

In the past few weeks, two cases have been decided, one in the copyright arena and one dealing with patents, that again remind us of the continuing importance of first sale/exhaustion in a balanced system of IP protection.

In Universal Music v. Augusto the facts sounded strangely similar to Bobbs-Merrill: a music company distributed free promotional CDs of its music and tried to prevent their resale simply by placing a notice on the face of each disc. In granting summary judgment for the eBay vendor who resold some of these CDs, Judge Otero of the Central District of California noted that this kind of restraint on subsequent transfer had been rejected over 100 years ago. Also implicitly rejected in this decision is the attempt to create a license transaction merely by a one-sided statement that that was what was occurring. The court rightly found that the CDs were transferred to the recipients (by gift, in this case) and were therefore subject to the exhaustion of the distribution right.

The patent case, Quanta Computer v. LG Electronics, also involved an attempt to control subsequent uses of a product embodying a patented process after the initial sale of that product. LG sued to collect a licensing fee from Quanta because Quanta used chips containing a process patented by LG, even though those chips were manufactured by an intermediary company (Intel) that had itself licensed the process from LG. In essence, LG wanted a cut on every downstream product that contained the already-authorized chips, but the Supreme Court said no: “The authorized sale of an article that substantially embodies a patent exhausts the patent holder’s rights and prevents the patent holder from invoking patent law to control postsale use of the article.”

As sturdy as these recurring issues are, however, we should not conclude that copyright law is ticking along without difficulty, adequately resolving conflicts in the 21st century with its arsenal of 20th-century doctrines. The current issue of “Cato Unbound,” on “the future of copyright,” does a superb job of alerting us, if we didn’t already see it, that copyright law is struggling to keep up in the digital age. The lead essay, by Rasmus Fleischer, begins with the fascinating point that in the 21st century we have moved to trying to regulate tools with our copyright law rather than content. In a digital age, he points out, many of the distinctions our law relies upon, like the difference between copying and distribution, no longer make any sense. As Fleischer says, “the distinction is ultimately artificial, since the same data transfer takes place in each.” This point undermines the comforting thought, expressed above, that first sale, for example, is still doing its job in copyright law, since the move to a digital environment makes application of an exception to the distribution right, but not to the right of reproduction, highly problematic.

Fleischer’s article goes on to paint a fairly gloomy picture about a “copyright utopia” being advocated by the content industries, especially big entertainment companies, that could seriously undermine both technological innovation and civil liberties. He ends with the “urgent question regard[ing] what price we will have to pay for upholding the phantasm of universal copyright.”

In a reply essay, “Two Paths for Copyright Law,” Timothy B. Lee suggests that things may not be as bleak as Fleischer suggests. He reminds us that it is only a very recent development that anyone has even considered questioning the legality of private, non-commercial copying for home use, and he opines that the effort now to assert control over such copying has already proved a failure. The alternative — the second path for copyright — is, as has been suggested before in this space, the development of new business models, largely funded by advertising, to meet the non-commercial demand for content. The role of copyright law, in this scenario, is to protect content creators from unfair and unauthorized commercial exploitation of their works by competitors. It is commercial competition that copyright is intended to regulate, he suggests, not use by consumers. And he catalogs a wide variety of business models already being adopted by the major content industries, even as they pursue lawsuits against customers and strict laws from Congress, that seem to recognize the inevitable move toward a market solution, rather than a legal one, to the challenges posed by new technologies.

Some copyright doctrine has remained unchanged for over a hundred years, yet we have to adapt to rapid innovation even as we preserve what works in our law. The essays by Fleischer and Lee paint two different pictures of the future of copyright; the attraction of Lee’s vision, for me, is that it looks at what copyright has traditionally been designed to accomplish — the control of commercial competition — and offers hope that if we stay focused on that role for the law, the market will adjust to the technological innovations for users that currently so frighten the content industries.

A “twitter” about contracts

Although I had heard of Twitter for a while, I did not really know what it was until prompted to learn more by two recent articles. One is this piece in the Chronicle of Higher Education about potential library uses for the “microblogging” or social messaging service. It recalls the discussions I heard recently about the different levels of involvement folks from my institution felt at an academic conference when the audience for various talks was using Twitter during the programs to share comments, examples and the like. Rather than being distracting, as I suspected it would be, the reports were that this added a welcome dimension to the conference experience.

What caught my professional attention, however, was this report of an ongoing controversy between Twitter and some of its customers about the terms of service to which every user agrees when they sign up for the service. The specific argument concerns the degree to which Twitter was obligated to pursue complaints of harassment directed against another user. On that issue, Twitter seems to be caught between a rock and a hard place — if they do not take steps to stop harassment they seem to condone a clear violation of a condition of use that they imposed, but if they do take action they may put in jeopardy the “safe harbor” protection from liability based on user postings that they gain under section 230 of the Communications Decency Act.

The broader issue, in my opinion, is the role of these terms of use statements in governing the relationship between users and the providers of Internet services. For one thing, it seems that such contractual agreements can be changed at the will of the provider. As the article cited above tells it, rather than address the harassment issue, Twitter indicated that it would wash its hands of the issues and simply “update” its terms of service. More amazing yet is the statement that Twitter borrowed its TOS from Flickr, apparently without much attention to what they contained. A Twitter executive is quoted as saying that, as a start-up, Twitter just “threw something up early on and didn’t give it a lot of thought.”

Who knew that these Internet companies had such a cavalier attitude toward the non-negotiable contracts they are imposing on Internet users? Actually, the terms Twitter uses, and says it borrowed from Flickr, are much less lengthy and burdensome than those now used by Flickr itself; since the acquisition by Yahoo!, the terms of use that a new Flickr user agrees to (standard Yahoo! terms) print out to seven type-filled pages, where the Twitter TOS amounts to only two pages. These click-through terms are being enforced by courts as binding contracts, even when the Internet service provider doesn’t “give them a lot of thought.” In the case about the plagiarism detection site Turnitin, high-school student users were held to the terms of service they clicked through even though they made valiant efforts to modify those terms.

As more and more communication on campus happens over these kinds of proprietary sites and networks, and as commercial Internet tools become more common for student and faculty work, these contracts will increasingly control what we can do. Often they give the owner of the site or tools an exploitable interest in the work created or stored there. Yet very few people even realize that they are binding themselves to detailed and enforceable terms whenever they click “I agree.” It is therefore becoming ever more important that courts find ways to introduce some nuance into their enforcement of these click-through agreements, rather than simply enforcing them blindly as the Virginia court did in Turnitin. At least one proposal for such a nuanced approach, which considers when a contract, especially a non-negotiable online contract, should be preempted by federal copyright law and the policy that law is aimed at enacting, is found in this complex but compelling article on “Copyright Preemption of Contracts” by Christina Bohannan. We can but hope that courts will develop a more sophisticated approach to these contracts, whether they use Bohannan’s proposal or some other, as they become more aware that such contracts may undermine both the policy behind copyright law and the traditional rules of contract formation — and may do so, if left unchecked, based on very little thought or reflection by the party imposing the terms.

Use case on NIH Public Access

Another question that is becoming common is about how to comply with the National Institutes of Health Public Access Policy. The answer presented here was to an inquiry about an article accepted for publication in the journal “Nature,” whose policy about compliance is fairly well-publicized and easy to find. The specific steps that an author must take to be sure they have the rights necessary to authorize deposit (or to be sure the journal will deposit for them) will vary with each publisher; where there is uncertainty about the policy or negotiations are required, the answer will be much longer than this one.

Dear Professor _____________,

Congratulations on the paper! The first step in complying with the NIH public access policy is to be sure you retained the right to deposit the article when you signed a publication agreement. If you signed Nature’s usual author’s license, a copy of which is available here — http://www.nature.com/nature/authors/submissions/final/authorlicense.pdf — there will not be any problem. That license allows the author(s) to retain copyright, although it gives Nature an exclusive right to publish, and it specifies that the author can place the article in a funder’s open access database subject to a six-month embargo.

Assuming that this is the license you signed, your next step is to actually deposit the article in PubMed Central. You do this using the NIHMS system; there are instructions and links here — http://publicaccess.nih.gov/submit_process.htm . We are being told by those who have used it that the submission process is fairly easy and straightforward. Nevertheless, if you have any difficulties, just let me know and I or one of the librarians will be glad to come to your office and help you with it.

Once you have submitted the article, along with any supplemental material, all you have to do is wait. NIH will send you, or the principal investigator named on the appropriate grant if that is someone other than you, a final copy of the article as it will appear in PubMed Central for verification. It is important to review the article at that time to be sure everything is correct, just as you would do with the page proofs for the journal, and to respond to that e-mail.

At some point in the process you will be asked to verify that you have the right to authorize PMC availability and to tell PMC about any embargo. As I said, if you signed the usual Nature license you do have the right to authorize availability and you should indicate a six month embargo. Even though you should submit your article immediately, it will not appear in the PMC database until six months after publication in Nature, in accordance with your license obligation.

For future reference in any paperwork submitted to the NIH, you will need to obtain the PMC ID number for your article. This helps NIH track compliance with the policy and is now required on renewal applications, progress reports and the like. Again, library staff can help you find this number if you have any difficulty.

The timeless folly of DRM

There is a good deal of value in reading older works, even in a field that changes as rapidly as copyright. It is a fascinating exercise, for example, to read attempts in the late 1960’s and early 1970’s to influence the direction of the “new” copyright law being considered (which was passed in 1976). L. Ray Patterson’s “Copyright in Historical Perspective” (Vanderbilt University Press, 1968), for example, or now-Justice Stephen Breyer’s 1970 Harvard Law Review article on “The Uneasy Case for Copyright,” offer an all-too-contemporary sounding warning about the doleful consequences of writing a copyright law that does not pay enough attention to users’ rights or assumes that the concerns of industry as expressed at a particular moment should be enshrined in a statute meant to function for decades.

James Lardner’s 1987 book about the development of video recording devices and the subsequent copyright consequences, “Fast Forward: Hollywood, the Japanese and the Onslaught of the VCR” (Norton), is another example of an older work from which there is still a lot to learn (my principal embarrassment in discussing the book lies in revealing yet again how often my own reading follows suggestions made by Bill Patry). As I read the book this weekend, I was struck especially by a small remark that, to me, reflected on a mistake the content industry cannot seem to stop making.

During the district court trial over the issue of whether Sony’s Betamax device created liability for its maker due to copyright infringement, the trial judge, Warren Ferguson of the Central District of California, refused to allow the attorneys for Universal and Disney to put on a rebuttal witness who would argue that the court could reasonably force Sony to adopt a technological measure that would permit the non-infringing purposes Sony (with the help of Mr. Rogers, among others) had demonstrated for the VCR while preventing unauthorized recordings of broadcast TV. A “jamming device” was suggested that could, the witness would have asserted, be incorporated into all VCRs at a (relatively) minimal cost and would block recording of programs unless the broadcaster chose to permit those recordings. Sounds a lot like the “broadcast flag” argument and the recent flap over Microsoft Vista preventing the download of some NBC TV programs, doesn’t it?

We are still wedded to the idea of technological solutions to the problem of unauthorized uses, and we have now gone so far overboard as to give legal protection to such technological systems, even when they have the intent and/or the effect of preventing perfectly legal uses or of reducing access to works no longer protected by copyright. And we continue to pursue a DRM “arms race” in which each new system is seen as a challenge by the user community, and few last more than a couple of weeks before keys and hacks are discovered. The wisdom of Judge Ferguson’s words in refusing to entertain this burdensome and unwise “solution” in Sony is, as yet, unheeded: “As sure as you or I are sitting in this courtroom today, some bright young entrepreneur, unconnected with Sony, is going to come up with a device to unjam the jam. And then we have a device to jam the unjamming of the jam, and we all end up like jelly.” Now that this headlong plunge into chaos has been enshrined in section 1201 of the Copyright Act, Judge Ferguson (whose original decision in the case was ultimately affirmed by the Supreme Court) seems more and more like a prophet to whom we should have listened.
