American Enterprise Institute

What Heartbleed tells us about software liability

[Commentary] The technology press has been awash in stories recently about the so-called Heartbleed bug, which can leak sensitive user data from any service running a vulnerable version of the OpenSSL encryption library.
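
How does one bug expose so much? In essence, OpenSSL’s TLS “heartbeat” handler echoed back as many bytes as the sender claimed to have sent, without checking that claim against the bytes actually received, letting an attacker read adjacent server memory. The C fragment below is a minimal sketch of that pattern, not OpenSSL’s actual code; the struct layout and the names build_reply and claimed_len are illustrative assumptions.

    #include <stdlib.h>
    #include <string.h>

    /* Illustrative stand-in for a TLS heartbeat record (simplified,
     * not OpenSSL's real structure). */
    struct heartbeat {
        unsigned short claimed_len; /* length field set by the sender */
        unsigned char  payload[];   /* bytes actually received may be fewer */
    };

    unsigned char *build_reply(const struct heartbeat *hb, size_t received)
    {
        unsigned char *reply = malloc(hb->claimed_len);
        if (reply == NULL)
            return NULL;

        /* BUG: trusts claimed_len. If it exceeds the bytes actually
         * received, the copy runs past the request into adjacent heap
         * memory (private keys, passwords, session cookies), and that
         * memory is echoed back to the sender. */
        memcpy(reply, hb->payload, hb->claimed_len);

        /* The fix is a bounds check before the copy, e.g.:
         *   if (received < sizeof *hb ||
         *       hb->claimed_len > received - sizeof *hb)
         *       return NULL;                                         */
        return reply;
    }

The fix OpenSSL shipped amounts to exactly such a bounds check: discard any heartbeat whose claimed payload length does not fit inside the record that actually arrived.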

To hold companies accountable for Heartbleed, we would need not product liability or restrictions on contractual liability waivers, but rather some sort of tort liability for service operators. Internet services can grow very popular very quickly, so a successful service’s potential liability exposure could balloon just as fast. Increased liability could therefore result in a structural shift in the Internet ecosystem.

Large, established companies such as Facebook or Google would likely become more averse to security risks, making them more cautious and shy about innovation. Small startups, by contrast, which are constantly at risk of failing anyway, would have neither the resources nor the incentive to invest in security, and would innovate without doing so. Smaller companies would therefore gain an increasing advantage over their slower-moving, larger rivals, who delayed innovation in order to minimize security risks. Such a market, in which low-security startups represent a growing share of the computing industry, would be inherently more hazardous.

Ideally, large companies would voluntarily collaborate to improve the security of common shared infrastructure like OpenSSL or Linux. However, no intervention along these lines is likely to yield more than a moderate benefit. We don’t have robust ways to measure security improvements, or to assess how security-critical any given piece of code is. As a result, we aren’t going to be able to construct robust incentives here. Ultimately, the right lesson to draw from the Heartbleed bug is that we do not yet know the right technical or social mechanisms for building large software systems securely and economically.

[Rabkin is a researcher interested in techniques for building and debugging complex software systems]

Outcome document arrives before the doors open for the NetMundial conference

[Commentary] Wikileaks has posted a draft outcome document created by the Executive Stakeholder Committee of the NetMundial Internet Governance conference to be held in Brazil April 23-24.

Before the first stakeholder invited to attend had taken a seat, the organizers had already decided what principles the participants were going to agree to and what the roadmap for implementation would be. The main challenge of the document is this: while the principles may seem reasonable, even laudable, the roadmap for implementation has many challenges and potential hidden agendas. The Internet as a “Human Right”, access to information, free flow of data, freedom of association, expression, privacy, accessibility, etc., all sound like principles we want to embrace.

And a governance process encouraged to be “open, participatory, multi-stakeholder, technology-neutral, sensitive to human rights and based on principles of transparency, accountability and inclusiveness” sounds like a good idea we should be able to support. But who will have the capacity to implement all this pro-freedom rhetoric?

Most likely, that job would fall to governments around the world. It would be up to them to let us know when we have hit our Internet governance metrics. Many of the items in the leaked outcome document are good aspirational goals; it’s deciding on the path to achieve them that will be the major challenge for the stakeholders engaged in this process. Continuing to seek balance among governments, industry partners, content providers, and end users will be an ongoing task.

[Tews is the Chief Policy Officer at 463 Communications]

Hey Comcast -- let’s talk sports (regional sports networks, that is)

[Commentary] I am usually not surprised at America’s fascination with sports telecasts (I share it; that’s why I write on it!). But surprised I was when not just one, but at least three Senators took a great deal of time to pose questions about sports programming during the Comcast/Time Warner hearing.

As any good fielder would do, I called out to the television “I got this,” but the Senators apparently didn’t hear the play. So let me try to respond to their concerns in this format. Roughly paraphrased, the Senators are uneasy with the proposed merger because Time Warner owns certain regional sports networks (RSNs) and Comcast owns interests in other RSNs. They fear that together the two companies would create a sports network juggernaut that could leverage these networks into an indomitable market share -- crushing all competition.

The perceived power of the RSN is almost hypnotic for the regulator -- and apparently also for Senators and their constituents. They just can’t let it go. There are procompetitive rationales for not making such programming contracts compulsory, but for now I’ll leave it at this: Senators, RSNs will not affect the market power of a merged Comcast/Time Warner. Just ask the Federal Communications Commission -- “they got this.”

[Boliek is an associate professor of law at Pepperdine University School of Law]

Comcast, Netflix, and the unregulated interconnection market

[Commentary] The Federal Communications Commission confirmed that it will not expand the scope of the ongoing network neutrality proceeding to encompass peering and transit services.

This was a blow to Netflix CEO Reed Hastings, who launched a blistering attack designed to focus the agency’s attention on this market. But given the robust state of competition in the interconnection market, the FCC’s response was good news for the future of the Internet ecosystem.

Despite Hastings’ hand-wringing, the Comcast-Netflix agreement illustrates the robust competitiveness of the interconnection market. The fact that Netflix pays Cogent transit fees is uncontroversial. The Netflix-Comcast agreement should be no more controversial merely because the identity of the transit provider has changed, or because the transit provider also happens to manage a last-mile broadband network.

The FCC is absolutely right that the market for interconnection is robust and competitive, meaning there is nothing to justify regulatory intervention. Its regulatory humility is both refreshing and promising for the future of Internet policy.

[Lyons is associate professor at Boston College Law School]

The EU’s roaming and net neutrality vote puts it on the path to a digital crisis

[Commentary] The EU Parliament voted on a telecommunications package that includes free roaming and net neutrality.

Future historians will likely mark the date as a key event in the onset of the EU’s digital crisis. EU politicians have made free roaming the centerpiece of their digital market effort. Essentially, it is an effort to harmonize mobile roaming prices across the 28 member nations, regardless of the underlying costs. It is nothing more than “feel good” politics. Plus, it’s a cheap win for politicians, as operators foot the bill. No matter where their customers go, operators are forced to eat the costs of their customers’ traffic, even if those costs exceed their revenues.

There are two nasty unintended consequences of the free roaming regime. First, artificial price ceilings create a perverse market for mobile arbitrage. People and speculators can game the trade in SIM cards, buying them in countries where traffic costs are low (e.g. Lithuania) and bringing them to countries where traffic costs are high (e.g. the UK or Germany, where spectrum costs and taxes are considerably higher). Second, in order to police this illicit activity, the EU will have to start a surveillance regime. Europe, which is still smarting from the financial crisis, is on course for another, digital one, in which users will not be able to get the network service they need because operators are too poor to deliver it.

Wrong-headed and overzealous regulation has stifled broadband investment. Network neutrality is a mess, not least for content/applications providers. Creating a world with a patchwork of net neutrality regulation will itself provide arbitrage opportunities.

[Layton studies Internet economics at the Center for Communication, Media, and Information Technologies (CMI) at Aalborg University in Copenhagen, Denmark]

Aereo: Pulling the wrong question from the quiver

[Commentary] Aereo goes to the Supreme Court in what is one of the most important broadcasting and copyright cases in recent history.

Unfortunately, the question that the Court is considering -- whether Aereo violates broadcasters’ public performance right under copyright law -- is the wrong question to ask. However the case comes out, the result will be wrong as a matter of policy. Either Aereo wins, which is the right outcome under copyright law but the wrong one for broadcast policy, or Aereo loses, which is the right outcome for broadcast policy but the wrong one for copyright law.

These copyright issues, however, miss the more challenging -- and important -- question in this case. The central issue in the Aereo case is not copyright law -- it is whether, and how much, we continue to value over-the-air broadcast television as we transition to primarily wireline-based distribution of video content. On this front, Aereo deserves to lose. Aereo’s business model is problematic not because it pushes the boundaries of copyright law, but because it undermines the economics of the over-the-air model. In other words, Aereo’s business model threatens the two sources of revenue that underlie the broadcast business: advertising and retransmission fees.

[Hurwitz is an assistant professor at the University of Nebraska College of Law]

Wake up, FCC: The Internet Protocol transition is now

[Commentary] Some 45 years after design work started on the cellular network and the Internet, the Federal Communications Commission (FCC) issued an Internet Protocol (IP) Technology Transitions Order amounting to a reluctant invitation for trials on the decommissioning of the legacy telephone network.

While the telephone network is no longer the centerpiece of telecommunications in the United States or around the world, the FCC is clearly anxious about turning it off, probably because the FCC and the telephone network grew up together. This reluctance is apparent in the many obstacles the FCC’s order places in the way of the decommissioning trials. It is worthwhile to ensure that no essential capabilities are abandoned in the move from the telephone network to its replacement (pervasive broadband networks running IP), but it is just as important for the FCC to approach the transition sensibly rather than reluctantly.

Cyberspace, self-defense, and the law of the sea

[Commentary] If you happened to turn your eyes towards Capitol Hill, you might have witnessed the latest round of senatorial efforts to discover an effective cybersecurity strategy.

In a pair of hearings, Senators and administration officials focused on stricter data breach reporting regulations and expanded liability protection. Sen John McCain (R-AZ) again proposed a separate committee on cybersecurity. Federal Trade Commission Chairwoman Edith Ramirez stressed that the government needs stricter rules for data breach reporting and greater freedom to prosecute offending institutions.

Our congressmen and women would do well to read “Navigating Conflicts in Cyberspace: Lessons from the History of War at Sea” by Professor Jeremy Rabkin of the George Mason School of Law (and Adjunct Scholar here at AEI) and his son, Ariel Rabkin. In their article, published in the Chicago Journal of International Law in the summer of 2013, the Rabkins redirect the reader’s gaze from small-scale modifications to a broader, more ambitious proposal: “ground American policy on cyberattacks in…[the body of law] dealing with armed conflict on the high seas.”

Currently, the law of armed conflict governs American and international cyber conflict. These rules, the Rabkins argue, unduly limit national and private cyber defense options. Cyberattacks do not solely target military facilities, personnel, or capabilities, nor do they solely originate from them. The law of armed conflict, by designating military objectives as the only acceptable targets of armed responses -- cyber retaliation included -- fails to account for the civilian involvement that is an unavoidable reality of conflict in cyberspace.

Underlying the Rabkins’ proposal is a basic and natural argument for adopting the laws governing conflict at sea: the right of self-defense. Just as merchant vessels have long been entitled to defend themselves against pirates on the high seas, the same right should apply to companies operating in cyberspace.

[Cunningham is a Research Assistant at the Center for Internet, Communications, and Technology Policy (AEI)]

Crawford’s faulty arguments -- in all their naked glory

[Commentary] Susan Crawford is talking again, this time in a long interview with Ezra Klein of Vox.

Klein pushes her more than most of her interlocutors, and, as a result, we see her arguments in all their naked glory. Crawford’s is not just a statist view of communications, but also a deeply pessimistic and nostalgic one. It must be. Otherwise, she would have to concede the success of a “private market… left to its own devices.”

Crawford selectively cites statistics that show US consumers getting poor service compared with Swedes, Slovenians, Singaporeans and the like. My colleagues Roslyn Layton and Bret Swanson dispose of such deceptive comparisons easily. For example, the current-generation standard, 4G/LTE, is available to 97% of Americans but just 26% of Europeans.

But where Crawford truly fails is in confusing these selective inputs with America’s spectacular outputs. What the cable companies, the telecoms, and the satellite companies provide is an infrastructure, a platform. On top of that platform has been built a magnificent, rapidly changing edifice. Further, the Klein interview reveals Crawford’s fixation on the delivery of Internet service by wire.

But the growth and the future of broadband are mobile. Just ask Facebook. The mobile broadband market has ignited major capital spending by US companies -- six times more per subscriber than their global counterparts spend.

[Glassman is a visiting fellow at the American Enterprise Institute (AEI), where he works on Internet and communications policy in the new AEI Center for Internet, Communications, and Technology Policy]

Gigabit boondoggle unwinds in Chicago

[Commentary] The state of Illinois gave $2 million to Gigabit Squared to wire Chicago’s South Side for ultra-high-speed broadband, and now it wants its money back. This story is more common than you might think, and Gigabit Squared isn’t the only offender.

From Burlington (VT) to Provo (UT), it doesn’t matter whether a city is large or small, or whether it goes it alone or bands together with its neighbors: designing, building, and operating a broadband network is harder than running a water system. Very few municipalities have the expertise to do these things successfully, and those who leap before they look are likely to end up wishing they’d examined all possible outcomes before committing their cash. Mooresville (NC) had to lay off police officers, firefighters, and teachers to cover its broadband debt. Such outcomes are more likely than not.

The main reason we can expect municipal networks to fail is revealed in their goal-setting exercises. There is a perfectly reasonable case for some form of subsidized broadband program in an unserved or severely underserved area: people can gain great benefits from using a smartphone, an iPad, or a laptop to access the Internet, explore educational opportunities, or read the news. But you don’t need the world’s fastest network to do these things; all you really need is a garden-variety DSL, cable, or mobile network. And given those choices, most people will adopt mobile first. As history shows, the speed and capacity of broadband networks generally improve to meet and exceed customer demand.

[Bennett is a visiting fellow at AEI]