This is the text of a talk I recently gave at the monthly meeting of Twin Cities Catholic I.T. Professionals, Inc. It is aimed at computer professionals who want to get a deeper understanding of net neutrality, and goes into much more technical detail than a general audience would want. Also, there are no helpful pictures or links in this one. For a less technical overview, aimed at my fellow political conservatives, see my original blog post, Why Free Marketeers Want To Regulate The Internet. Otherwise, please enjoy!
Thanks everyone for coming. I am James Heaney, and my talk is on network neutrality. I can’t claim any particular credentials on this topic, the way our past speakers have been able to. I did write a blog post about the economics of net neutrality that got picked up by TechDirt and retweeted by Vint Cerf, which was maybe the coolest thing that ever happened to me, but my interest in it is amateur: net neutrality sits at the crossroads between technology, economics, law, and public policy, which rings pretty much all my chimes. My presentation will start with tech, where you’ll probably know most of what I’m talking about, and move toward policy, which hopefully is a little more educational. Net neutrality is a hugely complicated issue, and – while I do have an opinion – I think this is one of the few policy issues where there is no single right answer.
That said, let’s see how the room shakes out. Based on whatever it is you know – no matter how vague – do you think the FCC’s proposed regulations on network neutrality go too far, don’t go far enough, or are just right? And, yes, you have to decide, no matter how irresponsible your opinion. Don’t worry: I won’t tell the FCC.
Cool. And, just out of curiosity, do you think your opinion is fairly well-informed, or not?
So, let’s start with the basics. Net neutrality is about how data traffic is handled on the internet. What’s the internet?
(TED STEVENS IMPRESSION) “It’s… it’s… it’s a series of tubes!”
Heh heh. I love that one.
But, seriously, Senator Ted Stevens was basically right. The Internet is a bunch of computers stuck together with tubes. All of them want to send data over the tubes to everybody else.
When a home user connects to the Internet, he typically connects to an Internet Service Provider, or ISP. This ISP – let’s say Comcast – owns what is called a Tier-2 network. You give them money, they let you connect to every other computer in their network. Right now, Comcast will sell you “unlimited” access, which is actually 250 gigabytes per month, with 20-megabit-per-second service, for around $80.
But Comcast isn’t connected to everyone on the Internet. In fact, it’s not really connected to very many people at all besides other Comcast customers, which is just a subset of other people in the United States. And they call it the Internet, not ComcastAmericaNet, so they must be doing something to get their users connected to the rest of the world.
Some of what they do is called peering. In a peering arrangement, Comcast calls up another network – say, Vodafone – and asks to send traffic to their network. In exchange, Comcast will let Vodafone send traffic back to Comcast. Once they’ve agreed, at a convenient location, they build a physical connection between their two networks, large enough to handle the agreed-upon data loads. Typically, peers don’t charge each other, because the arrangement is mutually beneficial: it widens both networks, improves performance on both networks, and increases the prestige of both networks. But, sometimes, especially when the data loads are unequal, one of the peering partners charges the other partner money for the privilege of peering. There are also public peering locations (Internet Exchange Points) where dozens or hundreds of different networks build access points and are allowed to peer with each other… normally for a fee, paid to the administrator of the IXP.
Now say Comcast wants to get access to British Telecom’s network. BT is willing, Comcast is willing, the prices are fair, just one problem: BT is in Britain. Peering requires a physical connection, and there’s no way for Comcast to build a connection to BT’s network without building a cable across the ocean. Comcast may be richer than Croesus, but even it can’t afford its very own transatlantic fiber line. So Comcast calls up somebody who has one – Level 3 Communications – and asks to pay them for what’s called “transit.”
In an Internet transit agreement, a Tier-2 network reaches another network by paying a third network – a middleman – for the privilege of using their tubes. So if Comcast pays Level 3 enough, Level 3 becomes the connection between Comcast and British Telecom, and now they can access each other’s networks.
One way or another, Comcast has to be able to connect to all the other networks on the Internet; otherwise they’re not really connecting to the Internet, but just a subset of it. This means setting up interconnection agreements (transit and peering) with everyone. This is very complicated, and involves a tremendous amount of private negotiation and difficult contracts. That it works at all is a testament to the miracle of free and relatively unregulated markets.
Oh, and I mentioned tier-2 networks a few times there, so you may be wondering what a tier-1 network is. Tier-1 networks are just like Tier-2 networks, except they are so big that they can connect to every other network without ever purchasing transit from a third party. Tier-1 networks do peer with other Tier-1 networks, and they are usually the ones selling transit to smaller outfits, but they do not themselves ever purchase transit in order to reach another network. There are currently seven tier-1 networks in the world – Level 3 Communications is an example of a tier-1 network.
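If it helps to see the peering-and-transit picture concretely, here’s a toy sketch: the Internet as a graph of networks, where two networks can exchange traffic only if some chain of interconnection agreements links them. The network names are just the examples from this talk, and this is a deliberate simplification – real inter-network routing is done with BGP, not breadth-first search.

```python
from collections import deque

# Toy model: each network lists the networks it has an
# interconnection agreement with (peering or transit).
links = {
    "Comcast": ["Vodafone", "Level 3"],        # peers with Vodafone, buys transit from Level 3
    "Vodafone": ["Comcast", "Level 3"],
    "Level 3": ["Comcast", "Vodafone", "BT"],  # tier-1: reaches everyone without buying transit
    "BT": ["Level 3"],
}

def reachable(src, dst):
    """Breadth-first search: can traffic get from src to dst
    through some chain of interconnection agreements?"""
    seen, queue = {src}, deque([src])
    while queue:
        net = queue.popleft()
        if net == dst:
            return True
        for neighbor in links.get(net, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return False

# Comcast reaches BT only because Level 3 sells it transatlantic transit.
print(reachable("Comcast", "BT"))  # True
```

Delete Level 3 from the dictionary and `reachable("Comcast", "BT")` goes false – which is exactly why a tier-2 network has to buy transit to be “on the Internet” rather than on a subset of it.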
Content can live in any of these zones: it can live on home computers and small-business servers that connect to the rest of the internet through an ISP. Bigger content providers often connect directly to tier-1 networks. For example, my website, starshipexcelsior.com, lives on a Hostmonster server, which connects directly to the Cogent Communications network, a tier-1 network. This makes sense; it ensures that the average distance between my content and any computer connected to the internet is relatively short.
And providers with a particular need for high performance or high data loads (or both) often use a content distribution network. In a CDN, content is distributed from a central server to various endpoints around the world – points of presence (POPs) that are close to end users. This ensures that content reaches end users swiftly, and it often reduces the total cost of data transmission for the content provider, since not every piece of data has to come from central every time, but can be cached and reused at the points of presence.
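The cost saving from a CDN’s points of presence can be sketched with a toy cache: only the first request for each item triggers an expensive fetch from the central origin server; repeats are served locally. The item names and the single-POP setup are invented for illustration.

```python
origin_fetches = 0

def serve(pop_cache, item):
    """Serve an item from the POP's local cache,
    falling back to the central origin on a miss."""
    global origin_fetches
    if item not in pop_cache:
        origin_fetches += 1               # expensive long-haul transfer
        pop_cache[item] = f"bytes of {item}"
    return pop_cache[item]

london_pop = {}                           # a hypothetical point of presence
for request in ["video1", "video1", "video2", "video1"]:
    serve(london_pop, request)

print(origin_fetches)  # 2: one origin fetch per unique item, not per request
```

Four requests, two origin fetches – and in a real CDN the ratio is far more lopsided, which is where both the speed and the cost savings come from.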
So, yes, the Internet is just a series of tubes. But it’s a series of tubes governed by thousands of different companies, individuals, and governments, each with their own turf, each with their own carefully negotiated deals with dozens of other companies, each with carefully maintained interconnection points in thousands of locations around the world. That a fully interconnected “network of networks” could exist at all is humbling, when you consider its scope; that all these fiercely competitive networks can, under contract, cooperate so reliably that we often don’t even think of them as separate networks is perhaps, without exaggeration, free enterprise’s crowning achievement.
So why on Earth would anyone want to the United States government to step in and regulate it?
Let’s talk economics now.
The Internet works because it is, at every level, a free and competitive market, where all network managers are ultimately accountable to their customers. If a content owner doesn’t like the price that AT&T is charging for first-mile network access, she can easily call Cogent instead. If Comcast doesn’t like what Level 3 is charging for IP transit over the Atlantic so Comcast can reach British Telecom, Comcast can easily take its business to TaTa Communications instead. And if an end user thinks that TimeWarner Cable’s service is crappy or slow or expensive, he can switch to a different ISP.
This means that everyone is always competing for each other’s business, and that means that every company involved is always trying to deliver the most service to everyone else at the lowest total price. In a market where everyone has a strong incentive to be efficient and make their customers happy, the heavy hand of government regulation, no matter how well-intentioned, can only get in the way. Adam Smith’s invisible hand is already doing everything possible to maximize customer happiness, plus it turns a healthy profit for providers, keeping them interested in doing business.
At least, that’s how the economics of Internet connectivity work in theory. In practice, the Internet access market looks less like Adam Smith’s ideal free market every day. In one sense, it never resembled a competitive market in the first place.
Back in the late 19th and early 20th century, the United States noticed something odd about railroads: while there were several railroad companies, over time they competed less and less with each other. Instead they settled down to form a few regional monopolies. Within those regions, they acted like all economic monopolists do: they stopped responding to the needs of customers. Instead, the railroads set about extracting as much money from consumers as they could. Wealthier towns saw higher rail fares, for absolutely no reason except that the railroads thought they could get away with it. Service and safety became badly degraded, because why would a monopolist do more than the bare minimum to keep their trains running? What were the consumers going to do, not use trains? (Some people did stop riding trains; given the price, some had no choice. But not enough quit to make “good service and fair prices” worth it to the railroads’ bottom lines.)
Normally, the free-market response to terrible service and high prices is simple: start your own company and beat the tar out of the incumbents, taking away their market share fair and square. If there’s only one Chinese restaurant in town, and it’s terrible, you start another Chinese restaurant across the street, and may the best man win. But, in the railroad market, every startup failed, despite the fact that consumers wanted them to succeed. The free market simply broke down, the invisible hand stopped pushing prices down or lifting up consumers, and everyone ended up under the tyranny of the monopolies.
When consumers have no power in a market, and no firms can break in to make the market competitive, there’s only one entity that can break up the logjam: the government. In the 1880s and 1890s, Congress passed a series of bills that broke up some of the biggest monopolies, and which tightly regulated the railroad industry for over a century.
Later on, our grandparents saw the exact same thing happen in other industries, like electricity and phones. Slowly, in the aftermath of the Marginal Revolution, economists figured out why competition just didn’t work in some markets, and they coined a term for it: natural monopoly.
In most businesses – generally speaking – the more you sell, the more it costs you. If you run a hot dog stand, and you want to sell ten hot dogs in an hour, you have to buy enough meat, bread, and condiments for ten hot dogs. But if you want to sell a million hot dogs in an hour, then you have to buy a hundred thousand times as much meat, bread, and condiments, not to mention all the other costs of scaling up, from human resources to fuel for your grills to inventory tracking. The more you sell, the more it costs. This is almost always true – so much so that the standard supply and demand graph simply assumes it.
However, there are some markets where it is not true. Consider a power company at the dawn of the Electric Age. They build a power plant and power lines to carry electricity throughout town to their wealthy customers. One day, Bob the Barrister decides he wants electricity, too, so he calls the power company, which drives out, connects Bob’s house to the grid, and begins charging Bob for the electricity. Here’s the magic: by selling more product, the power company’s costs actually go down. See, it was already producing the electricity that Bob just purchased, because that’s mostly how generators work – they produce a certain amount of electricity, whether it gets used or not. It’s just that, until Bob signed up, that electricity was going to waste. Now Bob is paying for it. The power company also already had most of the infrastructure to move Bob’s electricity the five miles from the power plant to Bob’s house. Now Bob is helping pay for that infrastructure, too. The only added cost from Bob signing up with the power company was a single short cable and an hour or so of labor – which, from the company’s perspective, is a very low cost indeed, and is more than offset by the savings Bob’s membership brings. In fact, the more product the power company sells, the lower their average cost goes. In theory, they’d be able to return those savings to the consumer, lowering the price of electricity for all their customers every time they add one.
That makes it almost impossible – indeed, economically inefficient – for competition to survive in a market like this. All companies in the market fight bitterly to get the most customers (this is good). But, as soon as one company gets a small lead over the others, that company is able to cut prices, leading more customers to sign up, allowing the company to cut prices more, leading more customers to sign up… while the other companies are losing customers and are forced to raise prices, causing them to lose more customers, until they eventually go out of business. It’s a domino effect, where the invisible hand herds consumers into signing up with the same company faster and faster until it’s the only company left standing. Potential new competitors face daunting startup costs and the impossibility of beating the market leader on price. As a result, the single company that survives the initial round of combat becomes a permanent monopoly… and, as soon as its last competitor is dead, it begins raising prices to take advantage of monopoly profits. Because market forces alone brought this about, it’s called a natural monopoly.
As it turns out, most utilities work this way. Electric power is a classic example, water another. The government deals with these problematic markets in various ways. U.S. water systems are regulated very simply: they are owned and operated directly by the government, with no private competition allowed (not that it would be feasible anyway). The government then aims to deliver clean water to customers at the lowest possible price (with varying success). The electric system is a little different: while people buy their electricity from a company rather than the government, that company is, in most states, regulated closely by the government, which sets a legally mandated price that all electric companies must use.
There is a great deal of suspicion of regulation, which is not without cause. Government-run or government-regulated monopolies don’t rely on the market to set prices; they rely on the best guesses of well-intentioned bureaucrats. They are complacent and often fail to innovate, because they have little or no incentive to do so. Their service is usually not as good as you’d expect for the price you’re paying, although you retain some leverage simply because you can vote out city officials who don’t do their jobs. For that reason, many states do as much as possible to deregulate their utility markets.
However, even those programs can only go so far. For example, in Texas, where deregulation was embraced more fiercely than perhaps anywhere else, power generation has been completely deregulated. But the delivery network, the grid, remains under the control of the incumbent natural monopolies, and Texas is forced to regulate them very tightly in order to prevent them from abusing their market power. This is because even Texas free-marketeer Republicans broadly agree that an unregulated natural monopoly is far worse than even a government takeover. Rather than relying on well-intentioned bureaucrats to set a fair price, the monopolist sets prices as high as possible – far higher than a free market would allow. Monopolists, too, are complacent, and don’t just fail to innovate, but often fight innovations, because innovation could disrupt their control. Their service is abysmal, because they have absolutely no reason to care about you. After all, what are you going to do? Disconnect from the electric grid? Move to another state? In economic terms, your personal demand curve is inelastic. In practical terms, they don’t care whether you’re satisfied with their service, and they don’t care whether they provide you with fair service at a fair price. You need their service and will pay nearly any price, tolerate nearly any indignity, to get it. While much of the monopolist’s effort remains focused on adding customers early on, that gradually peters out as they approach saturation, and instead they begin to work on ways to gouge more money out of existing customers.
If your customer experience with Comcast has been anything like mine, you’re beginning to see where this is going.
But first, a short aside: the “utility model” of natural monopolies is the main thing we’re concerned with today, but it’s not the only way a monopoly can arise naturally. There’s a closely related but distinct phenomenon called the “network effect,” where adding a new customer doesn’t lower corporate costs, but does increase the value of the service for everyone else using it. Social networks are a wonderful example of this kind of natural monopoly: within each region of the world, a single service has taken absolute control of the market for social profiles. In most of the world, it’s Facebook. In China, it’s Qzone. In Russia, everyone has a V Kontakte (KOHN-tact) profile. Many of us Americans are unhappy with Facebook, and would leave for a viable competitor if we felt we could – but the network effect has made it impossible for any competitor just starting out to give us the same value Facebook does, because Facebook already has everyone we want to connect with. So, for most of us, our practical options are to have a Facebook or to have no online social profile. This isn’t exactly the same way a railroad monopoly works, but it’s close, and I think we’re all probably more familiar with Facebook’s triumph over Google Plus than we are with Great Northern Rail’s defeat of the CB&Q railroad in 1901. Keep Facebook in mind as we start talking about the ISP monopolies.
By this point, even if you didn’t know a word about net neutrality or the Internet coming into this, it won’t surprise you to hear me refer to the ISP market as a natural monopoly situation. Their business model is identical to that of the power companies: they build huge networks of cables and, when you pay them, they connect the cables to your house. The only difference between an electric grid and a tier-2 network is what the cables are carrying. Likewise, natural monopolies gradually took over the telephone market eight decades ago… and, for just that reason, Bell Telephone and the Baby Bells have been tightly regulated by the Federal Communications Commission since the New Deal era.
Twenty years ago, it was hard to imagine the ISP landscape we have today. Back then, when the world wide web was a newborn, ISPs were a free-for-all, with thousands of competitive options in every region of the country. My family was a Sprynet house. That’s just what we expect to see in a young market, even one that naturally tends toward monopoly, because no single firm has had time to become dominant yet.
But, sure enough, starting around 1998, the market entered a long consolidation. Most of the early ISPs either failed, were bought up, or faded so that, today, they serve only some particularly arcane submarket, and not the average consumer. Today, the average consumer has very few choices. Speaking personally, there are only two serious competitors for my broadband internet coverage: CenturyLink and Comcast. That’s actually up from the past several years, during which CenturyLink told me it couldn’t reach my home with anything faster than dial-up. According to the FCC’s December 2013 report on broadband penetration, I’m one of the lucky ones: one out of every three Americans has access to just one broadband provider (where broadband is defined as at least 6 mbps downstream). These Americans have zero choices: it’s their ISP or the highway. Another one in three Americans are in my boat, with two options – although, if the FCC redefines broadband to 10 mbps downstream, as it is expected to do next year, I and many others will be back down to one option. [EDITOR’S NOTE: not only did this happen shortly after I gave the talk, but the FCC went much further than expected and defined broadband as 25 mbps downstream.] 5% of Americans have no broadband access at all, and the remaining quarter have three or more choices. No matter how you slice it, this is a far cry from the heyday when any hacker could run a commercially viable ISP out of his bedroom, and every consumer had his pick of the litter.
And they’re not done consolidating yet! When the Comcast-Time Warner merger is complete, bringing two of the biggest players together under one roof, the monopoly effects will be even stronger. We would expect prices to rise and service to degrade accordingly. According to the American Customer Satisfaction Index, ISPs are already the least popular industry in America – less popular than life insurance salesmen and the cigarette industry. Given how terrible they are already, it wouldn’t surprise anyone to see them get even worse.
One of the easiest ways the ISPs could make things worse is by attacking the principle of net neutrality. And that brings us to our point.
Net neutrality is a relatively simple principle. Indeed, it never needs to be defined, much less legislated, in a competitive free market, because customer demand virtually guarantees that every successful company will provide net neutrality. Only in a collapsed or collapsing market, where a few regional duopolies or monopolies control a sufficiently large slice of the global pie, can net neutrality start to break down. That’s when people start paying attention to it, and trying to pin down precisely what it means. (The internet has been net neutral since USENET days, but the phrase wasn’t coined until 2003, when Professor Tim Wu first suggested that it might be in trouble.)
According to Wu’s original paper, a network is neutral if it “does not favor one application… over another.” There are other, similar definitions. Sir Tim Berners-Lee, the Web’s inventor, gave this definition: “If I pay to connect to the Net with a certain quality of service, and you pay to connect with that or greater quality of service, then we can communicate at that level.” I’m personally fond of Wikipedia’s formula: a neutral network does not “discrimin[ate] or charg[e] differentially by user, content, site, platform, application, type of attached equipment, or mode of communication.”
But let’s stick with Wu’s. It’s short, it’s original, and it’s narrow, and if you start using the broader definitions, it gets even messier than it already is. Net neutrality works like this:
If Comcast is net neutral, and if I’m paying the monthly fee for 250 gigabytes of data at a 20 megabit-per-second download rate, then they’ll give me that level of performance, or the closest approximation they can manage under network congestion, regardless of the data I’m trying to access. If I want to spend my entire 250 GB allotment on ASCII art of the Sacred Heart of Jesus, I’ll get it, and I’ll get it at 20 megabits per second. If I want to spend all my data downloading and seeding a (legal) torrent of Weird Al Yankovic’s hit song “Don’t Download This Song”, I won’t be discriminated against based purely on the fact that the content is on the BitTorrent protocol, nor based on the fact that it’s a Weird Al song.
Likewise, when networks interconnect, if they are neutral networks, they will allow any compatible data, and won’t prioritize one kind of data over another kind of data. They will charge you for how much data you want to move and how fast you want to move it – but nothing else.
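Wu’s rule – charge for volume and speed, nothing else – can be stated as a toy billing function. The rates are invented; the point is simply that a neutral network’s price function never looks at what the bits are.

```python
def neutral_price(gigabytes, mbps, content_type=None):
    """A neutral network prices only volume and speed.
    content_type is accepted but deliberately ignored:
    that's what makes the pricing neutral."""
    return gigabytes * 0.05 + mbps * 1.50  # made-up per-GB and per-Mbps rates

# Same bits, same speed, same price -- whether it's ASCII art or BitTorrent:
assert neutral_price(250, 20, "ascii_art") == neutral_price(250, 20, "bittorrent")
```

A discriminatory network is one whose price (or throughput) function takes `content_type` seriously – a toll on cat pictures, a throttle on torrents.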
The incentive for network neutrality is consumer demand. When monopolies take over, this incentive breaks down, and neutrality crumbles.
Returning to an earlier example: let’s say CatsWearingTopHats.com (not a real website – yet) hosts its content on British servers (because of course it would), which are connected to the internet through British Telecom. The CWTH.com admins want to send cat pictures to a Comcast subscriber named John, who is trying to access their site. Since BT and Comcast don’t directly peer, BT buys transit on Level 3’s network to reach Comcast, which passes the cat pix on to John.
Now, suppose, one day, one of the three networks in that chain decides it doesn’t like cat pictures, or that people who download pictures of classy cats probably have extra money laying around and can afford to pay a little more. So this network informs the other participants that, henceforth, they will have to pay a substantial toll if they want to get any more pictures from CatsWearingTopHats.com over to John in the United States.
In a free and competitive market, the answer is easy: change networks. If the discriminator is BT, CatsWearingTopHats.com changes to a different first-mile network. If the discriminator is Comcast, John cancels with Comcast and is connected to a different last-mile network, at the same price, ten minutes later. If the discriminator is Level 3, it’s a little more complicated – John and CatsWearingTopHats have to tell BT and Comcast to raise the issue with Level 3. BT and Comcast can renegotiate, or they can change to a different Tier-1 network for transit, or they can tell John and the Cats to take their business elsewhere – which they promptly will. Bottom line, discrimination is harshly and immediately punished by competition.
But what if Level 3 owned the only transatlantic fiber cable in the world? Then there would be nothing anyone could do if Level 3 chose to discriminate. Either CatsWearingTopHats and BT and Comcast would have to pay the arbitrary classy-cat toll (with John footing the bill in the end, in the form of higher subscription costs), or CatsWearingTopHats would have to simply stop sending John any cat pictures, effectively cutting itself off from John’s network – no longer the World Wide Web, but some subgraph thereof.
Unfortunately, this is what we’re starting to see in some corners of the Internet, mainly among last-mile service providers, where competition is – as we’ve discussed – an endangered species. A few years ago, AT&T blocked Apple’s video chat app (FaceTime) for customers who weren’t also paying AT&T for unlimited voice and text messages… even though FaceTime used only data, not voice and text, and even if the customer was already paying for unlimited data. AT&T simply refused to allow competition to exist on their network. An FCC investigation under now-defunct net neutrality rules persuaded AT&T to back down.
In 2007, Comcast throttled all traffic using the BitTorrent protocol, slowing it to a dead crawl. Comcast’s justification was that some BitTorrent users are heavy network users, which was causing network congestion. BitTorrent users responded by pointing out that their connections were advertised as having “unlimited” bandwidth, and, besides, you can’t attack an entire protocol for the activity of a few bad apples. Comcast replied that it could do as it damn well pleased, and its customers could go elsewhere if they were upset – knowing full well that many of their customers had no other broadband options. An FCC investigation under now-defunct net neutrality rules persuaded Comcast to back down.
Most recently – certainly more importantly than other past suspensions of net neutrality – last-mile ISPs have started raising access costs for major content providers. The main target and major headline grabber so far has been Netflix, which hosts its content (mostly) through Level 3 Communications (a tier-1 network). In one example, Verizon, throughout the first half of 2014, publicly demanded that Netflix stop using Level 3 to get to Verizon’s customers. Instead, Verizon demanded that Netflix pay Verizon to host the content on Verizon’s servers, as part of a content delivery network. In the meantime, wherever Level 3’s network interconnected with Verizon’s, Verizon refused to upgrade their routers to absorb the large amount of traffic coming from Netflix… even after Level 3 offered to pay for the (inexpensive) upgrades themselves. As a result, Netflix traffic – plus anything else being transmitted by Level 3 – became very slow on Verizon’s networks, and actually became unusable for some home users, despite the fact that said home users were paying for unlimited data at 75 megabits per second. Verizon’s customers didn’t have a lot of alternative choices in the decreasingly-free ISP market, though, and, in the end, it turned out that Netflix needed access to Verizon’s customers more than Verizon needed Netflix on its network. Netflix gave in and started paying Verizon for a CDN. Within a few months, every other major ISP did the same thing to Netflix.
For advocates, this was a fairly clear-cut case of discrimination against a single application. While Verizon insisted that this was a simple case of Netflix trying to take a free ride on Verizon’s network using the net neutrality buzzword for political cover, this interpretation is difficult to sustain, given Level 3’s public offer to pay for Verizon’s network upgrades themselves.
So what do we do?
Option one is we leave things as they are, unregulated. We’ll let the market take us wherever it wants to go – even if that takes us right into the arms of a natural monopoly. This would avoid the many costs inherent in regulation. But it could also impose a natural monopoly regime on us. Perhaps that wouldn’t be a bad thing. A couple weeks ago, when my blog post on net neutrality got picked up, a conservative think-tanker tweeted me a 1968 article by the University of Chicago’s Harold Demsetz, entitled “Why Regulate Utilities?” which argued that doctrines about the danger of natural monopoly, though widely agreed-upon by economists, are wrong, and that there is no need to regulate. The libertarians at the Mises Institute agree, though their arguments are as much moral as economic. I confess I haven’t been able to finish the Demsetz article yet – largely because I was writing this talk!
Option two is to do what we did with Microsoft in the ‘90s: just threaten ISPs with regulation and sanctions, so that eventually they either back down or market forces take over before anyone manages to exercise monopoly powers. Unfortunately, that ship has sailed; the option is no longer in the policy toolbox. In the mid-2000s, the FCC issued regulations that mostly enshrined net neutrality, but the regulations were on legally very shaky ground. They worked as long as the ISPs didn’t fight back. In 2007, that’s exactly what Comcast did. In 2014, Verizon won a final court case, and the FCC’s net neutrality regulations were thrown out.
That being said, although most of its regulatory power over broadband was gutted by the courts, the FCC is still trying to impose a very limited form of net neutrality with the authority it has left. This proposal would prevent networks from arbitrarily blocking traffic, but would still allow them to charge content-based tolls on (or accelerate) the data traversing their networks, rather than charging each bit the same price for the same quality of service. We might call the FCC’s halfway-neutral proposal “option two point five.” [EDITOR’S NOTE: Since I gave this presentation, the FCC has abandoned this approach.]
Option three is to ask Congress to do… something or other. There is, of course, the problem of figuring out what to ask them for in the first place. But, beyond that, the current Congress, for reasons well beyond the scope of this talk, is incapable of doing much of anything, particularly when lobbyists oppose taking action. That is especially true in the tech sector, where most Congressfolk are out of their depth – as we’ve seen from Congress’s continuing failure to do anything about software patent trolls, despite the flagrant abuse and obvious damage current patent law is doing to the economy. Moreover, after Barack Obama endorsed net neutrality legislation on the campaign trail in 2008, the issue became polarized along party lines. With divided control of government guaranteed through 2017, Congress is an option that isn’t really an option.
Option four: we could use government, especially municipal governments, to create more competition in the market. Of course, this would only be pseudo-competition: a private monopoly versus an unaccountable public bureaucracy bailed out by taxpayers is not exactly the free market we envisioned when we started out. However, it’s a moot point: thanks to brazen rent-seeking by major ISPs, in nearly half the states, local governments are barred by law from providing municipal internet as a public utility.
Option five: some people suggest breaking up any ISP that gets too big, the way Reagan’s Justice Department broke up Ma Bell in the ‘80s, restoring competition by taking an axe to the monopolies and near-monopolies. However, there is no obvious legal way to do that. The Bell breakup resulted from a lot of special circumstances, some plain-as-day antitrust violations, and an 8-year court battle. Moreover, breakup would probably not solve the problem: the “wee ISPs” would still have local monopolies in many areas, and economics 101 would force them to immediately begin reconsolidating into new national monopolies (as the Baby Bells are doing today). And even the Baby Bells remain tightly regulated post-breakup. In the long run, the consolidation and price gouging of natural monopolies are probably inevitable.
This brings us to option six: the Federal Communications Commission. I’ll dwell on this option at length, not because it is necessarily the right option, but simply because it is the main option people are talking about today. (Neither net neutrality advocates nor anti-regulation telecoms are happy with the FCC’s proposed regulations under “option two point five.”) As I mentioned earlier, the FCC was created in the 1930s to regulate the natural monopolies in the telephone market. The technical term-of-art used here is “common carrier:” any company that sells bandwidth (such as a 1200-baud connection to the phone network, or a cubic foot of space on a freight train) to the public at large counts as a common carrier, and most are susceptible to natural monopoly. Because of their unique, key position in the national transportation infrastructure, they are also required to actually serve the public at large. If you have the money to pay for a ticket on an American Airlines flight, there’s a seat available, and there’s no other justification for denying you a ticket, then American Airlines must sell you that ticket. Everyone must be given equal access to the nation’s transportation networks – as long as they can pay the price. Under Title II of the Communications Act of 1934, the “common carrier” appellation applies whether a service carries physical goods and persons (transmitted by rail and sea) or data (transmitted by phone and telegraph). The FCC’s mandate was to prevent the common carriers of data from arbitrarily denying service to lawful users, or from freezing into a monopoly or cartel. It could even require phone companies to make interconnections between different phone networks, in order to ensure that everyone with a phone could reach everyone else with a phone – if the Level 3/Netflix/Verizon fight we talked about earlier had taken place over phone lines instead of cable, the FCC would have been squarely in the middle of the dispute.
Early ISPs were classified as common carriers under U.S. law. Of course they were. ISPs literally sold bandwidth to the public, and, according to the World Wide Web’s designer, Sir Tim Berners-Lee, the World Wide Web depended on the public being given equal access as long as they were willing to pay. Naturally, the FCC would regulate ISPs the same way it had regulated the phone companies for 60 years. And, throughout the dial-up era, it did. In 1996, Congress passed the Telecommunications Act, which updated the FCC for the Internet Age. The “Republican Revolution” Congress under Speaker Gingrich made sure that the updated framework did as much as possible to promote competition in the market – without allowing monopolies to overtake that competition. A few years later, DSL came out. The FCC examined DSL and ruled that it fell under the common carrier provisions. It obviously met the definition, so how could it not?
A little after that, cable broadband internet began rolling out to consumers. The FCC examined it… and a remarkable thing happened. In 2002, the FCC ruled that cable broadband was neither a “telecommunications service” nor a “cable service” subject to common carrier regulation. Instead, cable broadband was solely an “information service,” with no telecommunications or cable element included. Since information services cannot be regulated as common carriers under Title II, this freed cable broadband providers from all those regulations.
Of course, this was a ludicrous ruling. The Telecommunications Act of 1996 leaves no wiggle room for cable modem operators: they are clearly telecommunications services. The “information service” classification, by the FCC’s own precedents, was for services like Google, or your library catalog system, or dialing 411, not an ISP; indeed, information services were unregulated precisely because they involved little to no infrastructure and few, if any, barriers to entry. So, you know, the exact opposite of ISPs.
The FCC spent thirty pages producing a rationale for this ruling that was, if I may, rather convoluted. Their basic argument was that broadband internet was a telecommunications service which also carried information services on it. They went on to say that the broadband “information service” is not distinguishable from the underlying telecommunications service – the two are one and the same, so to speak – and, since the information service is mainly what the consumer sees and understands himself to be paying for, the telecommunications side of cable broadband fades out of regulatory view. In paragraphs 38 through 40, the FCC argues, in effect, that because data travels over the ISP’s pipes, the pipes themselves are legally the same thing as the data. I am trying to be fair here, but there’s not much to work with. I don’t know of anyone who takes this ruling seriously on its own merits.
The FCC’s press release focused on something quite different from the merits, and created the narrative that has, for both sides, defined the ruling ever since: the FCC claimed it was trying an experiment in telecom deregulation, hoping that, by deregulating further, competition (which had not materialized in the wake of the Telecommunications Act) would finally emerge in the ISP market and stop the slide toward monopoly. It had worked reasonably well in the 1980s deregulation of the airline industry, so maybe it would work in telecom, too. To accomplish this, the FCC didn’t technically need to follow their precedents or the Telecom Act; they just needed to find a justification that could survive bare-minimum judicial scrutiny – which is not a high bar to clear, because the courts must give overwhelming deference to the FCC and other regulatory agencies. Sure enough, the FCC’s strange ruling survived review in a 6-3 Supreme Court ruling (Justice Scalia’s blistering dissent, where he slams the FCC for unilaterally deregulating the ISP market, is a fun read, as always). A few years later, the FCC extended the same deregulation offer to DSL and phone services. Those providers eagerly took it, escaping the Title II regulation regime. Suddenly, ISPs in America were no longer considered “common carriers” under law (even though they obviously were common carriers in actual fact). Internet regulation, which had been part of the Web’s DNA since its invention, was gone.
In short, the only reason the Internet isn’t protected from monopolies today is that, in 2002, the FCC decided to experiment with not regulating the Internet. Shortly thereafter, the earliest warnings about net neutrality began to appear in the academic literature, and those warnings have only grown in the years since, as ISPs have knocked down every regulation meant to hold them to a net-neutral regime. The Wall Street Journal regularly argues that the Internet has thrived because ISPs have never been regulated like phone companies. This is false, and the Journal should know better. Indeed, the years of the Web’s most explosive growth and development took place under strict common carrier regulation, identical to that imposed on phone companies. (Heck, even today, limited portions of Verizon’s high-speed fiber network, FiOS, fall under Title II – at Verizon’s request!)
If the FCC decided to fully restore net neutrality, the fix would be very easy. Indeed, several courts have pointed to it over the past several years: simply revisit the strange ruling of 2002. Overturn it, and (correctly) rule this time that Internet Service Providers are “telecommunications services.” Instantly, every ISP in America would return to common carrier status, and net neutrality regulation wouldn’t just become easy; in many ways, neutrality is baked into Title II.
[EDITOR’S NOTE: A few hours before I posted this, the FCC came out in favor of Title II.]
One issue I should mention that opponents sometimes bring up is “forbearance.” The long and short is, Title II comes with an enormous number of tools and obligations. It would become a factor in all interconnection agreements, it would impose price controls, and it would give the FCC veto power over all sorts of network management. Even many net neutrality advocates don’t want to impose all that. They argue that the FCC can simply “forbear” from applying any parts of Title II it doesn’t want to impose on network access providers. Opponents argue that forbearance works only in limited cases where good reasons exist, and that the ISPs don’t meet those standards. This is a complicated, in-the-weeds legal argument that I can’t summarize here, but, for my two cents’ worth, I tend to think the opponents are right: invoking Title II would mean imposing most or all of Title II – not just the parts that protect net neutrality.
The U.S. Conference of Catholic Bishops has repeatedly asked the FCC to do something to protect net neutrality. Being a conference of bishops, not a room full of nerds, they have not gone into technical detail, but their call to action is clear. Their most recent piece, this one by Bishop John Wester, chair of the USCCB Committee on Communications (and bishop of Salt Lake City), appeared on September 16th. “Instead of adopting rules that permit the wealthiest companies to purchase the best service,” wrote the bishop, “the FCC should insist on fair treatment for everyone no matter our income. Community-serving organizations – such as the church – should not be treated as secondary ‘customers’ in this digital environment. The content and connections we provide to people are more important than entertainment content — such as movies and television shows — even though we don’t have the resources to compete with entertainment companies to pay more to the Internet providers… Allowing some Internet content to be favored because of its greater ability to pay could result in an even greater divide between the powerful and the rest of a community. Under that scenario, decisions regarding access to public information… would be determined based only on the bottom line of corporations, not to promote the common good.”
The USCCB’s position hardly ends the discussion – I’m sure we can all name at least one policy where we disagree with a USCCB opinion – but it introduces a moral dimension to what has otherwise been a very horizontal issue of economics, law, and technology.
Here is where my blog post roared into a blazing conclusion, making a ringing endorsement of one particular option and scorning all others. But we’re here for a discussion, and so I’ve tried to give this talk at least the sheen of objectivity.
So, instead of ending with a rhetorical flourish, I’d like to end with a few of the questions I hope we’ll examine when we come back from break:
How is our day-to-day work, as computer people, impacted by net neutrality, and how would it be changed if neutrality changed?
What Catholic principles of social justice help guide our action in the realm of net neutrality?
And finally, the big one: what should be done about net neutrality, if anything, and who should do it?
But those are big discussion questions. Right now, in the remaining fifteen minutes or whatever, I’d like to take your questions about the meat of the presentation you just heard. Was I clear? Should I expand on anything? Did I get anything completely wrong?
Thanks for your close attention to that rarest of 21st-century unicorns: an hour-long presentation without an accompanying PowerPoint.
EDITOR’S NOTE: The author didn’t want to come out and say it in this talk, but he supports Title II reclassification, and is very excited by today’s announcement.
CORRECTION 6 Feb 2015: The original article, in one paragraph, conflated two court cases, Comcast v. FCC (2010) and Verizon v. FCC (2014). I regret the error, now corrected.
CORRECTION 1 Jun 2016: Some readers thought that Justice Scalia’s “blistering dissent” in NCTA v. Brand X was in favor of deregulation, apparently because deregulation is generally seen as a right-wing idea and Justice Scalia is generally seen as a right-wing judge. But Scalia was actually arguing against deregulation in his dissent, arguing that, whatever the merits of deregulation, the FCC had no authority to deregulate ISPs without Congress’s express approval. This has since been clarified in the text.