Information versus Telecom

Overview

In 1897 the British Copyright Commission issued a report. One of the observations is as salient today as it was then:

A limitation of supply by artificial causes, creates scarcity in order to create property. To limit that which is in its nature unlimited, and thereby to confer an exchangeable value on that which, without such interference, would be the gratuitous possession of mankind, is to create an artificial monopoly which has no warrant in the nature of things, which serves to produce scarcity where there ought to be abundance, and to confine to the few gifts which were intended for all.

There is no limited supply of the letters of the alphabet or of the bits we use to encode information. Yet we have created scarcity by adopting a property model in the form of spectrum allocation and by confining our ability to communicate to narrow pipes, as if the very thoughts we communicate were freight to be tariffed by a government commission.

The telecommunications industry is based on the idea that there is a business in transporting meaning, be it via the runners of ancient Greece, the telegraph of Napoleon’s era, or today’s telecommunications providers with their scarce supply of “minutes”.

The big idea behind the Internet is that we can decouple the exchange of meaning, that is, what we communicate, from the representation or alphabet of bits, ones and zeros.

Thomas Kuhn has written about paradigm shifts – how changes in our understanding of the familiar change the world. We saw this happen in the 16th century. Copernicus looked at the same skies that mankind had observed for millennia, but instead of seeing a solar system in which planetary motion was described by complex epicycles he saw planets orbiting the sun. Nothing changed but our understanding, and it is that understanding which gave Newton and others the insights behind today’s world. Copernicus’ insight and Newton’s calculus gave us the tools for a dispassionate understanding of the solar system.

As Gleick explains in his book, The Information, Claude Shannon’s information science has given us a vital tool for understanding how we exchange information. The idea of measuring information in bits seems simple and sensible, but understanding how the ideas apply to the real world turns out to be fraught with pitfalls.

James Gleick is a great writer who can translate arcane technical stories into exciting tales for a relatively wide audience. And his story of the rise of the concept of “Information” is exciting in itself.

We can use the science of information and our pragmatic experience with today’s Internet to formulate a policy that relies on markets rather than regulators. We can exchange bits over a common infrastructure just as we use common roads and sidewalks. Just as the road system emerges out of our local streets our networks emerge from our local efforts at networking.

Without the burden of the overhead of maintaining an infrastructure for each service we are free to innovate, taking advantage of the opportunities afforded by this new commons.

Let’s not forget that the United States was founded on the idea of creating opportunity bolstered by the guarantee of freedom of speech. We must not cede our future to the misguided idea that we may run out of words.

Information and Telecommunications

Information

As Gleick tells it, the story goes back to the beginning of written language and its impact on how we use language and communicate among ourselves.

The book focuses on Claude Shannon’s development of "bits" as a measure of information. Just as Newton repurposed words like “work” for use as a formal term in physics, Shannon repurposed the word “information”. Information as a science gives us new tools for seeing the world around us.

Gleick uses the example of African drums to explore how we exchange information by using a nontraditional “spoken” language that is well-adapted to the medium. Those who understand the language can hear the message. Yet the colonial Europeans didn’t even realize information was being exchanged and were surprised when villagers along the river already knew the visitors were on the way.

Gleick goes on to discuss the challenge of retaining meaning in language over time and distance. He looks at oral traditions through the lens of information theory and provides us insight into the importance of flowery language.

The drummers can’t reproduce the richness of voice so they use longer phrases to compensate. This is the same technique used in the oral tradition as with Homer’s “rosy-fingered dawn”. If you’ve ever played the game in which you whisper words from person to person you will remember how words changed with each repetition. A longer phrase, especially one with cadence, is resilient. If someone says “rosy-fingered dusk” by mistake it won’t sound right and you’ll know what it is supposed to be. In information science we call this error correction.
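
To make the mechanism concrete, here is a minimal sketch (my own illustration, not from the book) of the simplest error-correcting code: repeat each bit three times and let the receiver take a majority vote, much as a cadenced phrase lets a listener correct a misheard word:

```python
# A minimal sketch of redundancy as error correction: a 3x repetition code.
# Repeating each bit is the crude analog of a long, cadenced phrase --
# the extra redundancy lets the receiver recover from a corrupted symbol.

def encode(bits):
    """Repeat each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three received bits."""
    out = []
    for i in range(0, len(received), 3):
        group = received[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)
    return out

message = [1, 0, 1, 1]
sent = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] = 1                     # noise flips one bit in transit
assert decode(sent) == message  # the error is corrected
```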

The book makes clear why it’s essential to realize that the significance of information is far broader than just exchanging messages; we see this in how the ideas transformed biology and anticipated our discovery of DNA. Information can even be viewed as a different way of understanding physics.

While Gleick gives the nontechnical reader a sense of information science, he focuses on the story of the people involved and the societal implications of the idea of “information”. Shannon’s accomplishments become clear, as do the subtle difficulties that arise when applying his concepts to the real world.

Even at 450 pages the book only skims the surface. Gleick doesn’t even mention his own involvement as an online service provider using a protocol called “Gopher”, which can be considered a prototype for the Web.

The full story is far richer. For example, Gleick mentions JCR Licklider (Lick), an acoustic-psychologist with an electrical engineering background who understood Shannon’s ideas. Lick went on to help found Project MAC at MIT and it was during his time at ARPA that he provided the initial funding for the research which led to today’s Internet. I happened across his 1949 paper on understanding voice in a noisy environment when I was taking a class in psycholinguistics.

Applied Information

One if by land, and two if by sea

American schoolchildren know the story of Paul Revere’s ride. It provides a perfect illustration of Shannon’s ideas.

We now recognize that the Minutemen sent one bit of information to distinguish between two choices. The actual meaning of the message was known only to those at the end points who shared a common understanding. That information did not exist along the path – others would see only one or two lamps, whereas the aware observer would see “by land” or “by sea”. To use modern terminology, there was a shared dictionary with the two meanings.

Claude Shannon provided us a way of understanding how we exchange information. What’s interesting about this example is that it seems simple when we measure the amount of information in bits, yet it’s not simple at all; like the earlier drum example, it depends on the recipient’s knowledge, AKA context.

Even if I’m unaware of the code of the lights (one if by land, two if by sea), there’s still information in seeing them; at the least, I now know someone is in the tower. Alternatively, if someone had already told me which route the soldiers would take, or I could see them on the march, then the lamps would carry no new information about the British.

In sum, measuring information is always a measurement in context. In this case there are two choices; in general, the measure is an abstract count of the distinctions being made.
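
Shannon’s formula makes “information in context” quantifiable. A minimal sketch (my own illustration, not from the book): two equally likely routes resolve exactly one bit, while the same lanterns carry far less if the observer already mostly knows the answer:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two equally likely routes: the lanterns resolve exactly one bit.
print(entropy([0.5, 0.5]))    # 1.0

# If the observer already expects "by sea" with 95% certainty,
# the very same signal resolves much less uncertainty.
print(entropy([0.95, 0.05]))  # about 0.29 bits
```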

That’s why the amount of information measured in bits is quite different from the amount, or the value, of information that’s meaningful to people. This confusion is endemic to technology because there are so many numbers and many are taken out of context. The CD that used to store 10 songs in 600 megabytes can now store ten or more times as many songs if they’re compressed. Just how many fit depends on the degree of compression used and on whether the user will accept some difference from the original audio, and if so, how much. The complexity grows when you realize that any translation of music from analog to digital is itself a form of sampling or compression.
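
The arithmetic behind such claims is simple even when the judgments aren’t. A sketch using illustrative figures of my own (the bitrates and song length are assumptions, not from the text):

```python
# Illustrative arithmetic with assumed figures (not from the text):
# how many 4-minute songs fit on a 600 MB disc at a given bitrate?

DISC_MB = 600
SONG_SECONDS = 4 * 60

def songs_per_disc(bitrate_kbps):
    """Convert a bitrate in kilobits/second into whole songs per disc."""
    song_mb = bitrate_kbps * SONG_SECONDS / 8 / 1000  # kilobits -> megabytes
    return int(DISC_MB // song_mb)

print(songs_per_disc(1411))  # uncompressed CD audio: 14 songs
print(songs_per_disc(128))   # typical MP3 compression: 156 songs
print(songs_per_disc(64))    # more aggressive compression: 312 songs
```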

Ambiguities and confusions like these become a serious problem when we make policy, because engineering choices get confused with Shannon’s theoretical limits. For example, bandwidth and frequencies are not limited by Shannon but by policy and cleverness. When a carrier claims that a DSL line can carry 5 megabits for one mile, that’s based on the equipment the carrier chose and on technologies developed to support a business model. That very same line was probably once limited to 300 bits per second by the modems used to send data over it.
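
Shannon does give us a real limit: the channel capacity theorem, C = B·log2(1 + S/N). But both the bandwidth B and the signal-to-noise ratio reflect engineering and policy choices, not properties of the copper itself. A sketch with figures I’ve assumed for illustration:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# The same copper pair at two assumed operating points (illustrative only):
voiceband = shannon_capacity(3_000, 30)      # ~30 kbps in a 3 kHz voice channel
wideband = shannon_capacity(1_000_000, 30)   # ~10 Mbps if we exploit 1 MHz
print(f"{voiceband:,.0f} bps vs {wideband:,.0f} bps on the same wire")
```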

The same is true for the capacity of a wireless link. As a parallel, think about the visible portion of the spectrum: how many colors are there? We could say six (red, orange, yellow, green, blue, violet), but we know there are many more shades, and as technology improves we can detect ever more subtle gradations. If we can handle more shades, we can deal with larger quantities of data and more subtle information.

Of course we don’t rely on just one color to communicate data or recognize people; we use all the colors together when we look at faces and whole selves, gaining context that allows us to exchange information and interpret it in meaningful ways. With wireless, we can do something similar by building so-called “cognitive radios” that use rich context to exchange information rather than relying on the antiquated idea of single-frequency signaling.

We need to be careful with this analysis. Though it demonstrates how we can use a channel, it only applies to a single hop. Once we’ve encoded information in bits we can relay the bits over any distance and are not confined to a single path or channel. For a more complete explanation you can read my comments on spectrum policy.

Tele-communications

Tele-communications means communicating at a distance and the implicit assumption is that communicating means exchanging meaningful messages. Claude Shannon developed his science while working for a phone company and the narrow focus was on how to preserve the message along the path being used.

Today we know that the term telecommunication is ambiguous. There’s a big difference between exchanging meaning in the human sense and exchanging the bits that represent a form of that meaning.

But in many ways we insist on following the old ways — the business model goes back to the days of the telegraph, and the basic concepts of telecommunications go back thousands of years to ancient Greece, where Pheidippides carried the meaningful message that Athens had triumphed at Marathon.

Despite the time that has passed, our entire approach to telecommunications policy is still built on models developed before digital systems could be imagined. Today we live in a different world, one in which telecommunications enjoys an abundance of capacity thanks to that digital technology.

In the 19th century the message was the telegram; the idea of using the message-unit for telephone billing came directly from that telegraphy model. But the message is not the telegram – that’s just a transport for the message.

The Telecommunications Regulatorium

The business of messages

We now understand the distinction between exchanging messages (communicating among ourselves) and exchanging bits.

Given this understanding of how we exchange messages, we have to wonder about today’s telecommunications industry.

We can go back to the days of Napoleon’s optical telegraph. At that time bloodletting was standard practice in medicine and there was no concept of germs, let alone penicillin. In the early 20th century radio, AKA “wireless”, was a new, untamed technology.

In the 1920’s the Federal Radio Commission (FRC) was formed to manage “wireless” communications. The idea of assigning a different frequency to each signal was not new. In fact Alexander Graham Bell was researching the harmonic telegraph when he was distracted by his side project – the talking telegraph, or telephone. While the technique didn’t work well for telegraphy, it was a match for radio with its new technologies and higher frequencies.

Using a single frequency for signaling has a major drawback: you need an authority to assign frequencies so that signals don’t step on each other. That is the order the FRC was formed to bring to “wireless”. It was recognized that having an authority assign frequencies raised First Amendment issues, but there seemed to be no choice.

The Interstate Commerce Commission (ICC) had been created to address the abuses of the railroad robber barons. The FCC, formed in 1934 as the successor to the FRC, was modeled after the ICC because messages were considered freight.

There was fierce competition among phone companies, and you might have needed a phone from each company in order to reach all subscribers. AT&T played this game very well and used the network effect to its advantage. Eventually it convinced the government that the only way to assure that everyone would be interconnected was to give AT&T control over what it claimed was a natural monopoly. In return AT&T agreed to be regulated.

The Federal Communications Commission

The FCC was formed during the Great Depression, when the markets had failed. In that climate the idea of managing the marketplace for telecommunications seemed very attractive in the United States. In other countries telephone service was considered a form of postal service.

When we have a system defined by regulations, what I’m calling a Regulatorium, we rely on economists rather than markets to determine how to charge for services. The actual cost of a single telephone is small compared with the cost of the rest of the system.

In the absence of competition the government created a system of incentives based on the business models and cost assumptions of the 1930’s. Policy has become more about gaming these rules than about revisiting the underlying reality.

This created stakeholders whose business models depend upon spectrum policy rather than spectrum physics. We’re forced into a prior-restraint regimen. Pushing the limits of physics and business exposes the limits of our understanding; pushing the limits of policy means breaking the law, so innovation is considered cheating.

This makes it hard to push the envelope.

Ma Bell

AT&T, as a regulated monopoly, was able to spend luxuriously on research and pass the costs on to subscribers. In a sense Bell Labs was the government’s way of funding first-rate research.

In fact, Claude Shannon did his work at Bell Labs. Understanding the digital encoding of messages was central to the phone company’s business. They applied it to that business but stopped short of embracing packet switching, recognizing it as a threat.

In the 1950’s Bell Labs produced a video saying that one day you would be able to go into a phone booth and turn on your sprinklers. The idea was that turning on sprinklers would be an explicit service provided by the phone company.

We can now turn on the sprinklers, but only because the phone company is not implementing the service. All the network is doing is carrying generic bits. We don’t have to depend on AT&T deciding that turning on sprinklers is a profitable service. In retrospect the idea of having to build each service into the network seems strange if not crazy.

In this model the network included the instrument, that is, the telephone. Until the 1970’s you wouldn’t own your own phone; you would lease it from the phone company as part of the service. This may seem strange, but it’s still the model we use today for the set-top box and the cellular phone!

This coupling makes it hard for markets to evolve. Whole-system engineering seems easier at first but creates dependencies that frustrate innovation. Even today Verizon’s FiOS service depends on Verizon’s routers, making it difficult for users to innovate faster than Verizon – a company whose engineering excellence is still stuck in the 1950’s.

The crux of the problem is that if AT&T was not providing the service then it was not adding value. But it can’t provide every service. The solutions (and hence the value) must be created outside the network. Yet this wouldn’t become obvious until we were able to send bits through the network very inexpensively.

Modems allowed people to repurpose the phone network for exchanging bits, but it wasn’t until packet switching gave access to the network’s native bit-carrying capabilities that we started to discover what we could do with direct access to the bit transport.

AT&T didn’t like the idea of packet switching, but one consequence of thinking of packets as freight was that common carriage laws prevented AT&T from blocking them.

But this is getting ahead of the story.

The Intelligent Network

The telephone network was redesigned as an intelligent network supporting services using generic bits. The current version is known as SS7, for Signaling System 7. A voice path (in the US) is exactly 56 Kbps (kilobits per second). Two voice paths could be combined to support AT&T’s Picturephone service.

Note that the Picturephone service was originally designed using analog techniques similar to modems, perhaps providing a rationale for developing a natively digital alternative.

While the SS7 approach was an improvement over the previous analog phone system, the digital system emulated the analog one by reserving capacity in order to support reliable services. It is still very hard for many network designers to grasp that reliability is not in the network but comes from how we interpret the bits outside the network.
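
To illustrate (a sketch of my own, with made-up failure rates): the channel below freely drops and corrupts packets, and a reliable exchange emerges only because the endpoints checksum the bits and retransmit:

```python
import hashlib
import random

def unreliable_send(packet):
    """A network that may drop or corrupt packets: no reliability inside it."""
    roll = random.random()
    if roll < 0.2:
        return None                                 # packet dropped
    if roll < 0.4:
        return bytes([packet[0] ^ 1]) + packet[1:]  # one bit flipped
    return packet

def reliable_exchange(message):
    """The endpoints create reliability: checksum the bits, retry until good."""
    digest = hashlib.sha256(message).digest()
    while True:
        received = unreliable_send(digest + message)
        if received and hashlib.sha256(received[32:]).digest() == received[:32]:
            return received[32:]                    # verified end to end

print(reliable_exchange(b"reliability lives at the edges"))
```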

In The Deal of the Century, Steve Coll tells the story of how MCI forced AT&T to reinvent itself. Digital technologies meant that MCI could connect circuits in two cities across its private network yet preserve the quality of the call.

The regulatory system had created classes of lines according to their purpose and priced them accordingly. Such fanciful constructs made no sense, and MCI “cheated” by ignoring the distinctions! To put it another way, MCI exposed the artificiality of the Regulatorium to the cold harsh light of reality, and AT&T collapsed. The real story isn’t quite that simple, because the regulatory regimen still serves its purpose of shielding the telecommunications industry from reality and market processes.

Despite the billions of dollars involved, we maintain the idea that the networks carry messages rather than bits. These assumptions are so implicit that we never revisited the technology.

I remember David Reed commenting to me in the 1990’s that as he researched spectrum policy he realized everything was based on the assumption of spectrum allocation. Yet better ways were already well understood: in the 1960’s the engineers designing the Voyager spacecraft (launched in 1977) spread the signal over a band of frequencies. During World War II Hedy Lamarr had urged the military to use the technique to get signals past the Nazis, but she was spurned by the Navy.

Divestiture

In 1984 AT&T divested itself of its local telephone operating companies, adopting a wholesale/retail model for the business of transporting messages. By 2011 this idea had unraveled, as AT&T reconstituted itself as a message-carrying company with the Internet positioned as just another service.

The carriers, with the aid of the FCC, have kept almost all the capacity for their own services; only a small fraction is available for generic Internet connectivity. This gives them the prerogatives of a monopoly. Collectively, by behaving alike, the carriers are in effect colluding to control the market, with the FCC driving the process.

Because the carriers control the paths, you typically buy all services from one provider. Even if there are other providers, they share the same business model, so you gain little by paying a very high price to buy the services separately.

You can think of the subscription model as just a way to finance infrastructure but it’s a loan you can never pay off.

Yet despite this, change is happening. Cellular telephony showed how we could get abundant capacity by reusing “spectrum”. The next logical step was to provide small cellular base stations, called femtocells, in each home. But just as they were about to take over, the cellular providers started running cellular protocols over the Internet using Wi-Fi, an approach called UMA (Unlicensed Mobile Access). It just didn’t make sense to build new hardware to do what could be done at no additional cost in software. The carriers understand that bits are bits but pretend we need an entirely separate (and expensive) system for “mobile”.

It does seem strange that calls between two phones using UMA are billed as minutes. It’s even stranger that the carriers are able to get away with charging us for using our own Internet connections.

The phone companies are competing with “free” and are able to do so because of the control the FCC has given them through policies that force us to exchange bits via billable (or subscription) paths. If you have two adjacent phones, they can communicate (using cellular protocols) only by generating billable events via distant towers or via UMA.

Understanding the Concepts

Information is not just about computers or networks; it is a far more fundamental concept. With that insight we can formulate new metaphors.

New Paradigms

Thomas Kuhn used the term “paradigm shift” to describe a change in understanding. We are still exchanging messages, but instead of treating messaging as a service we need to recognize that the bits we exchange are distinct from the message. In fact the message doesn’t even exist in the middle; it exists only outside the network.

It isn’t easy to embrace new paradigms. It took me years of trying to reconcile the Internet with telecommunications policy to understand how to think about connectivity.

Perhaps my most important skill in this regard is computer programming, because it gave me an understanding of effective algorithms and a grounding in operational philosophy. Instead of worrying about what it means to “understand” in some abstract sense, I can take an operational view. One “definition” is that I understand something when I can make use of the knowledge. But this is only one definition; there isn’t a single one, because it depends on what I am trying to do.

Accounting is closely related, because accounting is about finding measures appropriate to a purpose, and there isn’t just one purpose. We tend toward a naïve view of accounting because we presume a purpose such as “tax accounting”, but that isn’t the only one. This implicit assumption of context and purpose bedevils us when we approach new paradigms, because they don’t seem necessary until we realize there are new possibilities.

Learning from Experience

My own understanding of these connectivity concepts comes from experience with the Internet. It is only in hindsight that I reconsidered the idea of the Ethernet as a network. I realized it isn’t a network in the sense of a telecommunications service – it’s simply a wire we use to do networking.

When I was at Microsoft working on home networking I found myself also involved in home control. It made me think about the problem of turning on a light. How do you define the relationship of a light switch to a light fixture? You can’t use an IP address, for two reasons. One is that there aren’t enough IPv4 (32-bit) addresses. But even the new IPv6 addresses depend on the provider: if you take the fixture someplace else, the address changes!

Instead you need to use the DNS (Domain Name System) to get a stable name. One problem is that you don’t own those names – you only lease them. The more serious problem is that turning on a light can’t depend on having a live connection to the outside world.
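
A sketch of the alternative (the names and addresses are hypothetical, my own illustration): the switch-to-fixture relationship is a stable local name, resolved to whatever address the fixture currently has, with no dependence on an outside service:

```python
# A sketch of decoupling a stable relationship name from a changing address.
# The names ("porch-light") are hypothetical; the point is that the
# relationship survives even when the network address does not.

class LocalRegistry:
    """A registry kept inside the home -- no dependence on outside services."""

    def __init__(self):
        self._addresses = {}

    def announce(self, name, address):
        """A device announces its current address under its stable name."""
        self._addresses[name] = address

    def resolve(self, name):
        return self._addresses[name]

registry = LocalRegistry()
registry.announce("porch-light", "192.168.1.23")
print(registry.resolve("porch-light"))   # 192.168.1.23

# The provider renumbers or the fixture moves -- the relationship survives.
registry.announce("porch-light", "10.0.0.7")
print(registry.resolve("porch-light"))   # 10.0.0.7
```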

While still at Microsoft I was asked to write a chapter on the limits of Moore’s law. I realized that it was about how markets work and the economic implications of decoupling markets.

I came to realize that home networking had decoupled home networks from the carriers’ business model in which each device would have a monthly fee just like each cell phone does today. It would be hard to have network printers or cameras if you had to pay a monthly fee for each one.

This came together when a friend challenged me to take a constructive approach. I realized that we didn’t need network providers at all and could instead start from scratch, using the existing copper, fiber, and radios to connect our neighborhoods just as we connect the wires in our homes.

I explain this in more detail in my essay on Demystifying Networking.

I’m reminded of this quote from Anatol Holt: “A large number of installed systems work by fiat. That is, they work by being declared to work.” This has special significance to me because he taught a class at MIT that I just happened to take in which he tried to get us to understand the subtleties of “information science” vs. meaning.

Key Concepts

Using these concepts from first principles, we can rethink how we support our ability to communicate.

As difficult as this shift is for those with an understanding of technology and business, it is more difficult still for those most concerned with social policy.

I use the term “Interweb” for the confusion between the technology of the Internet and the social uses such as the web. It’s far too easy to continue in the long tradition of the phone company as the service provider.

If you look at a telephone you can’t determine whether the phone call is an application within the phone or a service built into the network.

The idea that we must separate the technology from the social considerations in order to allow people to be people is counterintuitive. Yet that is just what happens, for example, with the post office. You write a person’s name on the envelope and send it.

Or so it seems. You’re actually writing an address on the envelope. The post office only uses the address. The story is a little more nuanced when we take into account favors such as forwarding mail and looking at names when the apartment number is missing.

Language and Meaning and Ambiguity

When we seek to understand how we exchange information and work with information as a resource, we are in the realm of language – that is, how we express, exchange, and process concepts.

We usually use the term “language” narrowly, for spoken languages like English, but we can also use it for the more general mechanism of organizing and exchanging the concepts with which we conceptualize the world.

Gleick does touch upon this in comparing memes with genes, but we have to be very cautious about the analogy and very careful about applying the precision of mathematical models to the real world. I use the term “digital” as a measure of the degree to which mechanisms support sharp distinctions. Genes are well-defined, though there is enough variation for mutations, and their expression – the way the instructions are executed in forming our bodies – is quite complex. The term “meme” is far looser, more a metaphor for metaphors.

Gleick observes that phonetic languages have a common ancestor. Coding sounds phonetically as letters is a remarkable advance over pictographic languages. Or so it seems. In his book The Chinese Language: Fact and Fantasy, John DeFrancis makes a strong argument that the pictographic story is a myth and that Chinese writing is phonetic. This makes sense from a cognitive perspective: 中文 may seem to be a picture, but the symbols are quite abstract and the written language is a reflection of the spoken one.

We need to be explicit about the importance of context in communicating. The book And God Said exposes the hubris in assuming we can translate ancient texts from cultures we can’t comprehend. We see similar themes in stories about the future in which people come across artifacts from today and misinterpret them because they lack knowledge of today’s context.

In Moral Politics George Lakoff shows the deep rifts in our supposed shared understanding of the world. Perhaps one of the deepest is in the concept of ambiguity.

We can’t revisit our assumptions in every conversation, but we oughtn’t assume we have a common understanding until we find out where we differ. Yet today’s politics are polarized around ideologies without the inconvenience of understanding.

The kind of insights that Shannon brought to exchanging information also apply to understanding how systems work. If the meaning is not in the bits but in the context then ambiguity is fundamental. I hope to explore these issues in future essays.

Misattribution

It is difficult to shift paradigms when we don’t see any change – a phone call looks the same whether it’s over the classic phone network or over the Internet. We don’t distinguish between networks as service vs. networks as emergent properties.

We need to be very careful about false attribution, as when we credit “broadband” with the benefits of connectivity when we are in fact taking advantage of opportunity despite a business model that confines it to billable paths.

Broadband itself is the repurposing of an existing video delivery infrastructure built on broadband signaling over coaxial cable. The business started as a shared community antenna placed at a high point so it could retransmit a swath of signals from distant television transmitters. The technique of sharing bands of signals on the cable was called broadband.

This is how language works – we repurposed the word for a business model we associate with the technology used for that business. Today the business of delivering content is called “cable TV” even when we use other technologies.

Names and Identifiers

There is a fundamental problem with depending on the network to supply meaningful identifiers: the relationships exist outside the network. Very simply, relationship identifiers are names, while the identifiers we use for routing act like postal addresses.

Once we recognize that meaning comes from context and is necessarily ambiguous, we can see the limitations on any such approach and the damage done by forcing our relationships into the rigid framework of today’s Domain Naming System.

In practice we do get by with ambiguity, as in a name like “John Jones” or 李 (Li in Chinese).

Conversely simple uniqueness is not enough for people. Trademark law takes into account human understanding whereas the DNS is befuddled by the most obvious typos. Yet we naively confuse DNS names with Trademarks.

Active Information

We aren’t limited to copper, fiber and radios for exchanging information. We can exchange information using any medium available.

Text printed on paper is frustrating because it’s hard for me to use it as information. It’s sitting there encased in ink when it could be actionable if only it were represented as bits accessible to software – bits available independent of the path. We see examples in QR codes, which allow us to encode information in pictures that computers can easily read.

What makes QR codes and other encodings of information so interesting is that we don’t need to solve hard problems; we can work within our current practices. They allow us to tag physical objects with rich information, as in the example of bottles of wine.
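
As an illustration, a few lines of Python using the third-party qrcode package can turn a bottle’s catalog entry into a scannable tag (the package choice and the URL are my assumptions, not from the essay):

```python
# A minimal sketch using the third-party "qrcode" package
# (pip install qrcode[pil]); the URL is a made-up example.
import qrcode

img = qrcode.make("https://example.com/wine/2008-margaux")
img.save("bottle-tag.png")  # print the tag and stick it on the bottle
```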

Amazon’s eBooks are more than just replacements for paper-based publishing. I can start reading a book on one device and then continue on another. Many book aficionados lamented the loss of marginalia, but shared electronic notes have taken it a step further: Amazon tells me if others have highlighted the same portion of text I’m interested in.

This is the true face of cyberspace: not a server at a site, but information liberated from such constraints.

A Fresh Start

Once we understand the new structures we can apply them to creating the generative environment we associate with today’s Internet. We can think of infrastructure as a funding model that supports a commons.

A fresh start doesn’t require a clean slate. In fact we have abundant capacity all around us in existing copper, fiber, and radios. It’s just that we are restricted by a funding model that depends on profiting from controlling access to this abundance, and by an Internet architecture that is an alpha version. It’s an alpha version that has worked, perhaps, too well.

Relationships

We can start from first principles by focusing on relationships.

If I want to send a letter to John Smith or 李文, it’s up to me to find the address or another way to reach my friend.

This is a very different approach from traditional networking in that I don’t depend on the network providing unique identifiers. I don’t even depend on a particular transport provider. I could use the post office to send a letter or maybe post a notice on a bulletin board.

If I want to find yesterday’s baseball scores I can use any source. Not only don’t I depend on the path, I don’t even depend on a particular place. I can get the information from anywhere.

What we do need is a way to exchange bits between two points.

Exchanging Bits

How do we exchange bits?

If the end points are nearby we simply put the bits on a common wire (as with Ethernet) or use a simple radio (as with Wi-Fi). If they are further away we extend the range by providing devices to relay or route the bits over multiple segments using wires or radios.
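
Here’s a sketch of that relay idea (my own illustration, with a made-up neighborhood topology): each node knows only its direct neighbors, and reaching a distant endpoint is just a chain of handoffs along segments:

```python
# A sketch of relaying bits across segments: each node only hands bits
# to a directly connected neighbor; distance is just repeated handoffs.

links = {  # who is wired (or in radio range) to whom -- hypothetical topology
    "my-house": ["neighbor"],
    "neighbor": ["my-house", "corner"],
    "corner": ["neighbor", "library"],
    "library": ["corner"],
}

def find_path(source, destination, visited=None):
    """Depth-first search for any chain of segments to the destination."""
    visited = visited or set()
    if source == destination:
        return [source]
    visited.add(source)
    for neighbor in links[source]:
        if neighbor not in visited:
            rest = find_path(neighbor, destination, visited)
            if rest:
                return [source] + rest
    return None

print(find_path("my-house", "library"))
# ['my-house', 'neighbor', 'corner', 'library']
```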

We can extend this model to exchange bits with our neighbors by sharing common facilities. Just as we own the wires in our homes we share the wires and radios in our neighborhood as a commons.

By paying for the infrastructure as a whole we don’t have to worry about restricting the paths and can take advantage of the entire capacity without having to generate a billable event.

Few people would want to dig trenches and stay up all night solving problems, so we’d typically contract with a company to maintain the facilities. These people are not service providers; we pay them for the work they do, not for the services we create ourselves.

This approach is far more cost-effective than today’s system in which we have providers each with their exclusive facilities. Today the providers reserve most of the facilities for their own use. The actual portion allotted to “Internet” is on the order of one percent!

And it gets better because without the need to maintain “pipes” we can use any path and more easily route around failures. Without the need to generate billable events we are free to provide wireless connectivity anywhere.

Providers generate value to their shareholders by taking as much money as they can from the community in the form of billable events.

As owners we maximize the value by hiring companies that deliver the most price/performance. With a simple measure of performance in terms of the capacity to exchange bits we have the kind of transparency that we need for public infrastructure.

Opportunity

The benefit of starting with relationships is that we get the opportunity for innovation without being beholden to a service provider.

To use a very pragmatic example, today if you carry a medical monitor you’d need to have a billing relationship with a cellular phone provider. If the provider didn’t happen to service your locale you would be out of luck.

It’s not just about generating billable events each time your blood pressure is reported; there is also the complexity of implementing the accounting relationships in each device. If you have two devices, either each one has to have its own billing relationship or you have to route one device via another, as we do with cellular tethering.

Very quickly we find ourselves negotiating a maze of complex passages just to make such devices work at all, and if any of the paths fails we are at risk.

By taking an infrastructure approach your odds of success increase drastically, because if any path works you can exchange bits with the medical service. And if there are no paths, anyone can add capacity without being told they are stealing from a provider.

Furthermore you wouldn’t have to worry about whether such monitoring is worth the price the service provider charges.

We see the same problem with the “smart grid”. Why should a meter reading be a billable event when a webcam with millions of times as many bits being exchanged has no additional cost?

New ideas like content-centric connectivity don’t even fit within today’s billable path approach.

Achieving Infrastructure

The key is in understanding the new paradigms. With this understanding we can see that we have a way to use our existing infrastructure to create value and opportunity. But this value is entirely external. Almost by definition this means we have to fund it as infrastructure – that is, fund the whole.

The cost/benefit is compelling and there are no downsides other than to legacy stakeholders. Even the stakeholders know that it’s a question of when, not whether, they must face up to these changes.

In the US there is also the First Amendment, which has been compromised based on our 1920’s understanding of radio. We need to revisit that compromise.

The problem with policies such as “network neutrality” is that they address symptoms and not the root causes. What we have is a problem with the market structure and not a problem of morality.

We have a similar problem with efforts such as wholesale/retail models (often called structural separation) and with gaming approaches such as spectrum auctions. Here too we see attempts at incremental improvement that don’t address the decoupling of bit exchange from application relationships.

We don’t need a massive transition. Any community can work together: it could be neighbors, an apartment house, or a housing development doing their own local infrastructure. It will only take a few examples to demonstrate the power of infrastructure.

With such examples we can scale the approach up to city governments. After all, what is local government but a way for communities to act as a group?

In the transition stages there will at some point be a need to purchase transit from a telecommunications provider, but aggregating purchasing power yields major benefits. Local bit exchanges wouldn’t have such costs.

The real benefit is in shifting the way we think about communicating. I’ve used the term “Ambient Connectivity” for the ability to presume we can exchange bits wherever we are.

Looking Ahead

One can consider evolution as a process wherein information is interpreted in various contexts. Evolution is not just about biology; it applies to other systems and markets. Telecommunications is just one example of a market we can rearchitect using these insights.

The so-called smart-grid gives us another system or market that can be made more effective by recomposing it and separating the flow of information from what we do with it.

In 1897 the British copyright report warned us against taking our vast abundance and creating scarcity. It’s about time we got the message.

Related Essays

You can learn more about Content-Centric Networking at http://www.ccnx.org.