American Enterprise Institute
Time to treat net neutrality as more than a hashtag
[Commentary] Believe it or not, Jimmy Fallon’s April 24 monologue on NBC’s The Tonight Show, which mentioned the Federal Communications Commission’s network neutrality proposal for the Internet, was a rare guest appearance by the topic on broadcast and cable television.
A recent study from Pew Research Center revealed that television news programs covered net neutrality just 25 times between January 1 and May 12, which comes to less than 1% of all programs reviewed. And while the top 23 newspapers provided slightly more coverage than television, six of them carried nearly 70% of the 203 newspaper stories.
The consensus on Twitter doesn’t reflect reality. Many have grave concerns about the FCC overstepping in trying to enforce net neutrality. Reclassifying the Internet under Title II could end innovation in the Internet ecosystem, as it has in the telecommunications sphere. And some point out that net neutrality, supported by so many large companies, is a fine example of crony capitalism.
It’s possible that Washington is once again taking itself too seriously and making an inside-the-beltway mountain out of something the watchers of Jimmy Fallon consider a molehill. But there’s a more troubling possibility: that a critical decision affecting the way Americans access and pay for the Internet is being ignored by the media and left to 140-character explanations from self-appointed experts on #netneutrality. This issue is serious. It’s time to treat it as more than a hashtag.
Protecting privacy and property rights in the cloud
[Commentary] In the wake of the Snowden leaks, much attention has been given to the extent to which it is possible for unauthorized individuals -- including governments -- to gain access to electronic information.
It is becoming increasingly clear that statutory privacy laws and website codes pay only lip service to their promises to protect individuals’ and firms’ information. From the perspective of an economic contract, they are very difficult to enforce, because identifying when breaches have occurred is extremely difficult -- or prohibitively costly.
It is relatively straightforward to protect one’s physical property by putting boundaries around it to keep others out -- the economic characteristic known as exclusivity. We can physically isolate the disks on which information is stored and invest sufficient resources to exclude others, up to the value we expect to gain by controlling the information. If the disk is illegally appropriated by another, this is obvious, because it is a ‘rival’ good: either the legitimate owner has it or it is illegitimately in the possession of another.
The problem with digital goods – such as the information on the disk – is that they are neither rival nor easily excludable, especially when they become ‘unbundled’ from the ‘carrier medium,’ for example when transported from place to place over the Internet.
This makes them particularly problematic. But in the race to provide a raft of new means of preventing unauthorized access to electronic data, is enough attention being paid to impediments to authorized users’ legitimate access to cloud-based data?
[Howell is general manager for the New Zealand Institute for the Study of Competition and Regulation and a faculty member of Victoria Business School, Victoria University of Wellington, New Zealand]
Title II communications IS the ‘slow lane’
[Commentary] The substance of Title II common carrier regulation is very real, and it could deal a huge blow to the Internet economy.
Title II means price regulation. It means asking Washington and the state utility commissions for permission to launch new products, change existing ones, or deploy new technology, and to approve marketing and advertising programs. It means hundreds of other rules that were written for the monopoly telephone network 80 years ago but that would now apply to the vastly different Internet environment.
Title II would threaten the healthy system of Internet interconnection and peering that evolved without government oversight. Title II would bring back tariffs, intercarrier compensation, and a host of other bureaucratic do’s and don’ts. Meanwhile, because heavily regulated companies tend to be experts at operating in such a confusing environment, competition from new entrants would falter.
Quarantining the Internet from Title II was one of the best economic policies of the last generation. Unleashing Title II on the Internet could spread an epidemic of confusion and litigation across an Internet environment that over decades has developed millions of fruitful technical and commercial connections outside (and often oblivious to) the old Title II regime. In short, Title II would threaten Internet innovation at its very foundation.
[Swanson is president of Entropy Economics]
The #CommActUpdate is facilitating much-needed improvement to spectrum policy
[Commentary] While many technology policy debates are characterized by a lack of reason, at least one area of vital national interest proceeds in a rational and transparent fashion: the process to update America’s Communications Act. Reps Fred Upton (R-MI) and Greg Walden (R-OR) lead the process with a series of opportunities for public comment.
While sharing has a role in spectrum policy, the US should certainly not give up the valuable efforts to auction relinquished spectrum for licensed use.
Indeed, the UK trades 84 percent of its spectrum, and where necessary, the government has seized spectrum from uncooperative government agencies. The Base Realignment and Closure (BRAC) process facilitated the difficult task of closing military bases in phases following the Cold War. The US needs to take the same approach with spectrum -- an approach known as “BRAC the spectrum.”
It is no small goal for which auction revenues are being raised: the First Responder Network (FirstNet), a national communications network for public safety. As 9/11 and Hurricane Katrina revealed, the current patchwork of emergency communications in the US needs to be upgraded to a national state-of-the-art network, and the cost is in the tens of billions of dollars. Spectrum license revenue could directly contribute to that effort and help fortify public safety.
Communications regulation needs to be transitioned from the current silo-based, sector-specific paradigm to a modern, technology-neutral, competition-oriented approach. Most of the functions of the Federal Communications Commission are duplicative of functions performed by other agencies. Functions and resources should be rationalized and redeployed to the appropriate agencies, or bundled into a specific agency for the management of spectrum.
[Layton studies Internet economics at the Center for Communication, Media, and Information Technologies (CMI) at Aalborg University in Copenhagen, Denmark]
Heartbleed -- the fallout Part 2
[Commentary] The revelation of the Heartbleed flaw has “prompted a full roar in the world of Internet security,” in the words of Washington Post media blogger Erik Wemple.
Whatever the potential technical perils for Internet security, the Heartbleed episode has already produced significant policy repercussions.
Importantly, it has forced the Obama Administration to reveal details of its internal cybersecurity decision making hitherto kept out of sight. It has highlighted – though certainly not resolved – the difficult dilemma of balancing the intelligence imperative of keeping America safe against the commitment to protect the openness and security of the Internet. And finally, the Obama Administration’s decision to pull many final Heartbleed-like judgments into the White House raises serious questions about its actual ability to control the vast sweep and scope of such operations.
How to improve federal spectrum systems
[Commentary] I’m developing the idea of creating a Federal Spectrum Service, a government-chartered for-profit corporation, to serve as the owner of all federal spectrum.
The FSS would control all federal spectrum use and manage it according to a ten-year plan for reducing the federal spectrum footprint in two stages. In the first stage, the FSS would be required to reduce the federal spectrum footprint by 50%, and in the second stage it would be required to reduce it by 50% once again. The spectrum thus liberated would be auctioned for public use.
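The two-stage mandate compounds: a 50% cut followed by another 50% cut leaves one quarter of the original federal footprint, freeing 75% for auction. A minimal sketch of the arithmetic (the starting figure is hypothetical; the commentary does not give one):

```python
# Illustrative arithmetic for the proposed two-stage reduction.
# The initial footprint below is an assumed, hypothetical figure.
federal_mhz = 1000.0                 # assumed initial federal footprint, MHz

after_stage_1 = federal_mhz * 0.5    # first mandated 50% cut
after_stage_2 = after_stage_1 * 0.5  # second mandated 50% cut

liberated = federal_mhz - after_stage_2
print(f"Remaining federal footprint: {after_stage_2:.0f} MHz")
print(f"Liberated for auction:       {liberated:.0f} MHz (75% of the original)")
```

Whatever the starting number, the two stages together liberate three quarters of the federal footprint, not half.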
Once this mission is accomplished, the FSS would cease to exist unless Congress explicitly re-authorized it to continue in some form. The FSS would have the power to meet this mandate, as it would assume immediate ownership and control of all federal systems that use spectrum directly, either as transmitters or receivers. Therefore, the FSS would be able to replace current systems with new ones that would use spectrum more efficiently and to auction the spectrum it frees up for public uses.
As a single entity with control of federal spectrum use, the FSS would not be affected by agency infighting and the fragmentation of spectrum expertise across the panoply of agencies. If the FSS finds the PCAST Report’s sharing recommendations sensible, it would be able to test them by having agencies share spectrum with each other.
While all of the liberated spectrum would be auctioned, it wouldn’t all necessarily go to the highest bidder. The proceeds from auctioning federal spectrum would easily pay for the equipment upgrades that would make even more spectrum available.
Lessons from the White House big data report: Learn to protect yourself
[Commentary] Recently, the Administration released a report on big data that highlighted both the positive opportunities of big data collection and how it dangerously compromises privacy, but failed to address the challenge of consumers willfully forfeiting their personal information.
The report emphasized big data’s potential to enhance lives through large scale information gathering. In particular, it noted the ability to create more efficient economic outcomes by identifying ways that industry and individuals can better use their time and materials. But the report also acknowledged that the mass collection of personal information creates significant privacy and security concerns.
Basically, big data can be good; we just need to reconsider how and why we use it and find ways to maximize the benefits while managing the risks. The White House Big Data report failed to pursue one crucial question: if all this personal information is so valuable, why do people give it away so readily and freely? An individual’s personal information is one of his or her most valuable assets. Before blindly signing away their information to the multi-billion dollar industry of consumer data analytics, consumers ought to consider the full cost of their actions.
[Tews is Chief Policy Officer at 463 Communications]
GON, baby, GON? Or new life for muni broadband?
[Commentary] Federal Communications Commission Chairman Tom Wheeler spoke forcefully at the annual NCTA Cable Show in Los Angeles. Most observers focused on his Open Internet remarks, but Chairman Wheeler also made waves by asserting an FCC power to overrule state laws that limit the ability of cities and towns to build their own networks.
In recent years, at least 20 states, after witnessing a number of municipal network failures, enacted limits on such government-owned network (GON) ventures. The list of failures and aborted launches is long, and growing -- Chicago, Seattle, Tacoma, Utah’s UTOPIA, Minnesota’s FiberNet, the Northern Florida Broadband Authority, Burlington (Vermont), and Philadelphia’s and Orlando’s abandoned Wi-Fi networks, to name a few. So the financial case for GONs is not a strong one.
The bottom line is that until the FCC, states, and localities do a lot more to remove the remaining barriers to broadband investment, we shouldn’t be using scarce tax dollars to build duplicative networks.
[Swanson is president of Entropy Economics]
Net neutrality advocates need to get their facts straight
[Commentary] The Federal Communications Commission’s net neutrality rules are based on the false premise that American broadband services are sub-standard compared to those in other countries.
Advocates who buy this notion believe that network price and quality can only be improved by regulatory action that forces providers to make uneconomic investments. Before we can have a rational discussion about network policy, we need to get the facts straight.
Average broadband speeds in five of the top 10 countries are actually declining, while those in the US are improving. Chairman Wheeler’s Open Internet rules aim to preserve the goose that has laid these golden eggs while protecting America’s innovators and ordinary citizens from the hypothetical harms that can arise in markets with minimal competition.
In short, the proposed regulations permit a degree of experimentation with the pricing of technical services on the Internet provided that the common, baseline service continues to be adequate for the common, baseline set of applications.
The most common complaint emanating from the fainting couches occupied by (the mainly far left) net neutrality advocates is that the proposed regulations don’t go far enough to preserve the Internet as it has always been. This is an odd standard to apply to a technical system notable for its disruption of traditional industries such as music, journalism, travel, and retail.
Net neutrality advocates also worry that Internet Service Providers have incentives to exploit customers and harm innovation, fears inspired by every profit-maximizing business. But these incentives are counter-balanced by conflicting incentives to sign up more subscribers and to provide richer services.
The net neutrality debate: Why price discrimination can be a good thing
[Commentary] Amongst all of the brouhaha circulating following the Federal Communications Commission’s network neutrality announcement, some of the most puzzling comments concern the purported ‘evils of price discrimination’ that will inevitably emerge if -- heaven forbid -- a network operator dares to charge one person a different price than another to move traffic over the Internet.
The mere fact that discrimination could occur is deemed sufficient cause by many to justify its legislative prohibition. The ‘evils of price discrimination’ are almost always voiced by individuals fervently advocating for the necessity of universal and uncapped Internet access tariffs – often to the extent that metered Internet access should be legislated out of existence, so that the digital world can flourish unbounded and ‘free’, just as its instigators intended.
If one digs a little deeper, one would probably find that the vast majority of these ardent advocates currently purchase their (uncapped) fixed Internet connection in a ‘triple play bundle’ alongside their cable or IPTV subscription and some form of voice telephony service.
Do these advocates realize the double standard they exhibit when calling for the prohibition of one form of price discrimination while at the same time benefiting from the price discrimination that underpins the entire business case of their digital experiences? After all, ‘flat rate’ Internet access and triple-play bundles are simply other forms of price discrimination. If price discrimination is made illegal, then surely these, too, must be banned?
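The point that a flat-rate tariff is itself price discrimination can be seen with a toy example (the prices and usage figures below are invented purely for illustration): under a single flat monthly price, a light user pays far more per gigabyte than a heavy user, even though the price tag on the bill is identical.

```python
# Toy illustration (all numbers invented): a flat monthly tariff
# discriminates on the effective per-GB price, even though every
# subscriber sees the same sticker price.
flat_price = 50.0  # assumed flat monthly tariff, dollars

usage_gb = {"light user": 10, "heavy user": 500}

for user, gb in usage_gb.items():
    per_gb = flat_price / gb
    print(f"{user}: ${per_gb:.2f} per GB")
```

In this sketch the light user effectively pays $5.00 per gigabyte and the heavy user $0.10 -- a fifty-fold difference hidden inside a single ‘non-discriminatory’ price.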
[Howell is general manager for the New Zealand Institute for the Study of Competition and Regulation]