Last week Federal Communications Commissioner Robert McDowell spoke with a small group of Duke administrators about a wide range of topics. In response to one question (which was, I have reason to know, deliberately provocative), Commissioner McDowell, who is a Duke alum, gave a pretty ringing endorsement of the unregulated Internet. He referred approvingly to the Internet as an open environment that has, throughout its history, been free of government regulation.
McDowell chose to ignore, in these comments, the pre-history of the Internet as the ARPANET, a creation of the Defense Department’s Advanced Research Projects Agency. But really his position is the one from which I want a government regulator to start; a stance of healthy skepticism toward regulation is the best way to ensure that careful thought precedes regulatory enactment. While suspicion of regulation is almost always a justified foundation, however, it is not necessarily the final word on the matter.
The context of the question Commissioner McDowell was answering was ‘Net neutrality, and in that context it is particularly easy for the FCC to oppose regulation, since that is the position favored by the major telecoms. But it is far too simple to say that as long as the government keeps its hands off, the Internet will stay unfettered and equally accessible to all. Commissioner McDowell clearly knows this, and his assertion was that competition is the best way to prevent private entities from closing off the Internet pipes to certain types of traffic. But he also noted that the economic downturn has delayed the building of additional pipes, and it is still true today that the backbone of the Internet is in the hands of only a few major corporations.
The fear here is that these companies may find it desirable to implement differential pricing — charging more for certain kinds of traffic — and that regulation might be necessary to preserve the openness that has, so far, been a hallmark of the Internet. ISPs might, for example, decide that voice-over-Internet phone services compete with another part of the business of their parent telecoms and introduce higher prices for VoIP to choke off such services. UPDATE — As this report indicates, this is a very real concern that the FCC continues to monitor.
A similar decision to charge more for high-bandwidth uses could be implemented in a misguided attempt to prevent video piracy. Illegal video downloads, of course, use a lot of bandwidth, but so do perfectly legal file transfers. The danger with these kinds of “solutions,” whether they are differential pricing, filtering or agreements between ISPs and content companies, is that they are likely to exclude too much content and too many users. When this happens, the free speech goals which copyright is meant to serve are undermined, often in the name of copyright protection.
The recent announcement of a new anti-piracy strategy from the RIAA, and the continued behavior of YouTube toward repeat notices of copyright infringement, illustrate this danger. The RIAA has agreements with some ISPs (it is not saying which) to cut off Internet access for those accused of repeated illegal downloading. But we know that the RIAA has not been very careful about its accusations in the past, so there is a real concern that users will lose access based on inaccurate information and poorly substantiated charges. And even before the RIAA’s new strategy is put in place, we know this kind of abuse is happening. Here is a report from the Ars Technica blog about a case in which what is quite likely fair use — the posting of film clips on YouTube to augment an online critical essay — has led to the author having his YouTube account shut down. The DMCA notices behind the shutdown claim infringement but do not have to prove it, or to take into consideration any of the myriad ways the uses on YouTube might be justified. By disconnecting users after “three strikes” based on mere accusations, YouTube is already implementing the practice the RIAA is negotiating with ISPs. And we can see that that process is ripe for abuse.
The moral here is that regulation of the Internet is a complex topic. Reliance on the market alone will not always guarantee that the ‘Net will remain open and accessible on an equal basis for all. As more and more basic and vital information and services become Web-based, such access must be preserved. The trick will be to figure out the right moment and the right way to preserve access, but the time will come when those decisions must be faced, since we have already seen that reliance on market forces and good will alone will not suffice.