American Enterprise Institute

Copyrights are more than just federal “privileges.”

[Commentary] On July 1st, the American Enterprise Institute’s panel discussion, Copyrights and Innovation: Understanding the Debate, provided a good overview of the differing perspectives of copyright skeptics and supporters.

However, one set of closing remarks made a critical mistake worth correcting. It was argued that US copyrights are mere “privileges” because they are protected only by a federal statute -- not by the state common-law claims (and statutes) that protect other private property rights -- and thus that if Congress repealed the federal copyright act, the “privilege” of copyright would cease to be legally protected by US law.

But this copyright-is-just-a-federal-privilege argument suffers from at least two fatal defects.

First, it simply gets US copyright law wrong. Some copyright skeptics make this error because the Supreme Court, in Wheaton v. Peters, rejected the idea of federal common-law copyrights. But that decision addressed only federal common law; state common law long protected authors’ rights in their unpublished works.

Second, the idea of abolishing the private copyrights of authors is not really some innovative, liberating thought experiment recently concocted by copyright-skeptical academics.

Developed representative democracies like the United States now usually rely on four cultural systems to produce expressive works: (1) an academic system; (2) a philanthropic system; (3) direct government funding of particular works; and (4) a commercial-production system. Copyright laws can interact with all four systems, but they are indispensable only to the fourth -- the commercial-production system.

[Sydnor is a visiting fellow with AEI’s Center for Internet, Communications, and Technology Policy]

US court case may shed light on ICANN’s legal status

[Commentary] By a strange and convoluted process, a pending legal case about payments to terror victims may end up clarifying several open questions about Internet law.

The US Congress has authorized lawsuits against sovereign governments for terrorism that has harmed Americans. Plaintiffs in several cases have successfully convinced US courts that the government of Iran is responsible for funding terrorism against American citizens.

Getting the money is not so easy, since Iran has been almost totally cut off from the US banking system. However, Iran has not been cut off from the Internet, and it turns out that by virtue of being on the Internet, Iran has financial ties to American entities after all.

On June 24, in several ongoing cases, the US District Court for the District of Columbia authorized preliminary steps toward seizing Iranian payments to ICANN. The court authorized subpoenas against ICANN for information on its negotiations with Iran, and issued writs of attachment against any Iranian payments.

[Rabkin is currently a postdoctoral researcher at Princeton University]

Disclosing interconnection agreements creates anticompetitive risks

[Commentary] The Federal Communications Commission announced that it had begun reviewing the recent interconnection agreements that Netflix signed with Internet service providers Comcast and Verizon.

The Commission’s growing interest in the heretofore unregulated interconnection market has prompted some commentators to renew their calls for all interconnection agreements to be filed with the Commission and made publicly available. Greater transparency, they argue, would give consumers a better understanding of the economics of the Internet ecosystem beyond last-mile broadband networks, and would help the public police potential anticompetitive conduct.

While transparency is often a laudable goal, a mandatory public disclosure requirement in this case may ultimately harm the very competition that proponents seek to protect. Even absent any actual anticompetitive effects, the increased antitrust scrutiny invited by price transparency will impose additional costs on the industry, which will ultimately be passed along to consumers in the form of higher prices.

[Lyons is an associate professor at Boston College Law School]

When it comes to net neutrality, the Nordic model is the best approach

[Commentary] Network neutrality is a global debate. A number of countries have implemented laws or are in the process of doing so. Each country defines the issue differently and thus creates laws with different provisions.

This creates a problem of international harmonization for the Internet, which is inherently global. Net neutrality rules are a difficult compromise between consumer protection and increased governmental control of the Internet, but the multi-stakeholder model strikes a balance.

Norway’s model for net neutrality, established in February 2009, is the longest running regime of that type in the world. No violations of net neutrality have been documented under the model. Swedish regulators observed at a recent event that the model is working, and ISPs are actually becoming more transparent.

The Nordic model preserves a role for the regulator to frame the discussion while at the same time encouraging participation by operators, content/application providers, and consumers. In this way, the regulator is less of a warden and more of a mediator.

Nordic regulators have agreed to cooperate on net neutrality. Should an EU law come to pass, it would supersede the enlightened approach taken by the Nordic countries. The better outcome would be to build on the efforts of the Nordic regulators, and make their model the global standard.

[Layton studies Internet economics at the Center for Communication, Media, and Information Technologies (CMI) at Aalborg University in Copenhagen]

Protecting the Internet from political agendas: Takeaways from the ICANN meeting

[Commentary] The recent Internet Corporation for Assigned Names and Numbers (ICANN) meeting in London highlighted two issues that appear unrelated on the surface but are both tied to current concerns about how to manage Internet Governance going forward.

One is the transition of the Internet Assigned Numbers Authority (IANA) stewardship away from the US Government, and the second is the Governmental Advisory Committee’s drawn-out, multi-meeting discussion of the .wine/.vin new Top Level Domains (TLDs).

We need to know that the European Commission and its government colleagues understand that ICANN is a technical coordinating organization. If governments are willing to bring an issue from outside the technical functions of the Internet into ICANN, how do we trust them to keep the IANA function purely technical and to stay away from the temptation to use it as a political lever?

If France wants to protect its geographical indications on wine labels, for instance, it needs to continue negotiations with its global partners in the world of trade agreements, not technical Internet functions. The temptation to settle a commercial trade issue through Internet Governance will be just the beginning of mass political misappropriation of the largest economic tool that we have seen in this century -- the Internet.

[Tews is Chief Policy Officer at 463 Communications]

What the FCC’s broadband tests really measure

[Commentary] The data in the “Measuring Broadband America” report released by the FCC on June 18th shows that Americans get the broadband speeds they pay for.

The report plainly says (page 14), “This Report finds that ISPs now provide 101 percent of advertised speeds.” This couldn’t be any clearer. The FCC even places this finding in context by contrasting it with the results from its preceding report: “The February 2013 Report showed that the ISPs included in the report were, on average, delivering 97 percent of advertised download speeds during the peak usage hours.”

The FCC report also engages in a rather peculiar exercise of measuring web page loading speeds and attributing them to ISPs. The FCC’s web loading time test actually says more about the web server than it does the network. This is nice data for researchers to have, but it tells us very little about ISP networks. For this to be a meaningful measurement, the FCC would also need to account for CDNs, web server location, and web server response time.
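To see why, consider a rough, hypothetical sketch (plain Python against a placeholder host, not the FCC’s actual test suite) that splits a single page fetch into DNS lookup, TCP connect, time-to-first-byte, and transfer time. Only the connect and transfer phases say much about the access network; time-to-first-byte is dominated by the web server or CDN at the other end, which is exactly the component a naive page-load test folds into its headline number.

# A minimal sketch, not the FCC's methodology: it decomposes one page fetch
# to show how much of "page load time" is server-side. example.com is a
# placeholder host, not an endpoint used in Measuring Broadband America.
import socket
import time
from http.client import HTTPConnection

HOST = "example.com"
PATH = "/"

t0 = time.monotonic()
addr = socket.gethostbyname(HOST)        # DNS lookup
t_dns = time.monotonic()

conn = HTTPConnection(addr, 80, timeout=10)
conn.connect()                           # TCP handshake: one network round trip
t_connect = time.monotonic()

conn.request("GET", PATH, headers={"Host": HOST})
resp = conn.getresponse()                # time-to-first-byte: mostly server/CDN response time
t_ttfb = time.monotonic()

body = resp.read()                       # transfer: network path plus server send rate
t_done = time.monotonic()
conn.close()

print(f"DNS lookup:     {t_dns - t0:.3f}s")
print(f"TCP connect:    {t_connect - t_dns:.3f}s  (network round trip)")
print(f"First byte:     {t_ttfb - t_connect:.3f}s  (largely the server, not the ISP)")
print(f"Body transfer:  {t_done - t_ttfb:.3f}s")
print(f"Total:          {t_done - t0:.3f}s  <- what a naive page-load test reports")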

It would be good for the FCC to clearly separate ADSL from VDSL and to measure speeds up to 1 Gbps. Once it’s done with that, it’s fine for it to try to get a handle on web servers and interconnections, but it appears that the FCC has a long way to go before it really understands what it’s measuring.

Aereo: Too clever by half gets you nowhere, fast

[Commentary] Since 10:17 a.m. on June 25, 2014, hundreds, if not thousands, of articles and blog posts have been written explaining, dissecting, and analyzing the Supreme Court’s decision in ABC v. Aereo.

The basic point I want to make is that Aereo is a broadcast policy and communications law opinion; it is not a copyright opinion. I am generally not a proponent of outcome-oriented decision-making. But in this case, Aereo’s technology and business model so clearly sought to bypass (or, at best, simply ignored) so many important policy issues that what amounts to a summary dismissal of the company’s theory of operation seems entirely warranted.

[Hurwitz is an assistant professor at the University of Nebraska College of Law, where he teaches telecommunications law, cyber law, law and economics, and other regulation-related subjects]

If it ain’t broke… FCC’s ‘Measuring Broadband America’ report shows a healthy Internet sector

[Commentary] The Federal Communications Commission published its fourth annual report on measuring fixed broadband, and the results yet again speak to the healthy state of the sector and the benefits of light-touch regulation.

The last thing the country needs right now is for the FCC to regulate Internet Service Providers under Title II of the Communications Act, as if they were public utilities. Significantly, the FCC researchers found in the report that DSL, an older technology deployed in a more heavily regulated environment, gives consumers less than what they are expecting -- while cable, fiber, and satellite go above and beyond.

That consumers are trading up to higher speeds is further evidence that they are enthusiastic about having high-quality Internet service -- and trust the providers to give them exactly that. That is hardly a marketplace that’s crying out for a regulatory fix.

[Glassman is a visiting fellow at the American Enterprise Institute]

Antitrust vs. net neutrality: Consumer welfare in focus

[Commentary] The House Judiciary Committee heard testimony on “Net Neutrality: Is Antitrust More Effective than Regulation in Protecting Consumers and Innovation?” I, however, would characterize the question a la a Seinfeld episode (paraphrasing): “Are you just saying you want to have Internet freedom for the little guy, or do you really want to have Internet freedom for the little guy?”

If you want to protect the little guy, then this committee hearing was for you. The answer to the question posed -- given the empirical evidence -- is yes, antitrust is more effective than regulation at protecting consumers and innovation -- at least when the proposed regulation is “net neutrality.” Why is antitrust so much better than regulation at ensuring innovation in the Internet ecosystem? The reason is simple: unlike regulation, which specializes in control by the few -- a few commissioners, the few well-connected lobbyists, a few politically favored corporations -- antitrust is the law of the people.

For all their talk about wanting to preserve the Wild West of the Internet, net neutrality proponents seem to think consumers don’t know what they want until they’re told -- and free music is not what they really want. This convoluted and patronizing consumer welfare analysis is antithetical to antitrust.

Antitrust does not judge the legitimacy of consumer desires; it merely ensures that corporations do not censor your demands through anticompetitive practices.

[Boliek is an associate professor of law at Pepperdine University School of Law]

Slow Netflix? It’s not always your ISP’s fault

[Commentary] In the last week of May, a large number of New Zealand customers started experiencing difficulties accessing Netflix. In the context of the current net neutrality debates in the US, it was inevitable that some immediately jumped to the (erroneous) conclusion that their Internet service providers (ISPs) were deliberately slowing down or blocking Netflix traffic for some strategic reason.

However, a little local journalistic sleuthing elicited only denials from local ISPs that they were ‘shaping’ Netflix traffic. Furthermore, the problem appeared to be affecting customers of a number of ISPs. A day later, all became clear: Netflix itself was primarily responsible for the interruptions to its New Zealand customers, but it wasn’t intentional.

The New Zealand ISPs were also contributing to the problem, but this too was accidental. It turns out that Netflix had made some technical changes to its content distribution that had caught the ISPs ‘on the hop’. The change meant that the ISPs were no longer detecting and caching the popular content, so everyone had to endure the long wait as every copy requested was streamed live from the US.
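As a simplified, hypothetical illustration (not the ISPs’ or Netflix’s actual systems), a transparent cache keyed on request URLs only helps when repeated requests look identical to the ISP. The sketch below assumes a per-session token as the kind of addressing change that defeats such a cache -- every request then appears new and falls through to the distant origin.

# A minimal sketch of URL-keyed transparent caching; names and URLs are
# invented, and the per-session token is only an assumed example of a
# content-distribution change that makes requests look unique.
from typing import Dict

class TransparentCache:
    def __init__(self) -> None:
        self.store: Dict[str, bytes] = {}
        self.hits = 0
        self.misses = 0

    def fetch(self, url: str, origin_fetch) -> bytes:
        if url in self.store:            # hit: served from the local cache
            self.hits += 1
            return self.store[url]
        self.misses += 1                 # miss: long haul to the US origin
        data = origin_fetch(url)
        self.store[url] = data
        return data

def origin(url: str) -> bytes:
    return b"video-chunk"                # stand-in for the distant origin server

cache = TransparentCache()

# Old scheme: stable URLs, so the second viewer is served locally.
for _ in range(2):
    cache.fetch("http://cdn.example/title42/chunk1", origin)

# New scheme: identical content, but each request carries a unique token,
# so the cache never recognizes it and every viewer waits on the origin.
for session in ("a1", "b2"):
    cache.fetch(f"http://cdn.example/title42/chunk1?token={session}", origin)

print(cache.hits, cache.misses)          # 1 hit, 3 misses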

[Howell is general manager for the New Zealand Institute for the Study of Competition and Regulation and a faculty member of Victoria Business School, Victoria University of Wellington, New Zealand]