Who Ya Gonna Call?

August 31, 2009

The Christian Science Monitor recently had an interesting article about innovation and the Internet. Sitting here today, it is sometimes hard to remember life without the Internet, yet all of its development has happened in my adult lifetime. (Somehow, we did manage life before.) The Internet's growth in both size and scope is truly amazing.

Part of this, of course, is due to basic advances in technology. The laptop computer on which I am writing this is much faster, and has more than 500 times the memory and 10,000 times the disk storage of the first computer I ever used, an IBM 360/91. But better basic technology is not the whole story. The Internet has provided an unusual environment in which new ideas could flourish. The article touches on some of the reasons for this, although its main focus is the potential for problems down the road due to inadequate planning for the future.

The author, James Turner, points out that much of the infrastructure of the Internet has not changed very much since its early days:

Like a jazzy sports car that has never had its oil changed, the underlying protocols of the Internet have remained largely unchanged since it came into being in the mid-1980s. The Internet can be surprisingly fragile at times and is vulnerable to attack.

As a statement of fact, this is hard to challenge. Moreover, the basic protocols that the Internet uses were designed for a different world. I have, in one context or another, administered Internet E-mail lists practically since the beginning. People often complain that the mail protocols are insecure, not auditable, and so on. While many of these complaints are valid, the people making them often don’t realize that, back in the 1980s, machines from different manufacturers, using different operating systems, networking software, and even character sets, had to be linked together; getting mail delivered reliably at all was a considerable achievement.

One of the ironies of Internet history is that, although the original designers did not envision their work being used on anything like the scale it is today, they did their job so well that the structure has survived growing like Topsy:

The Internet grew too big too fast, says John Doyle, professor of electrical engineering at the California Institute of Technology in Pasadena, Calif.

“The original was just an experimental demo, not a finished product,” he says. “And ironically, [the originators] were just too good and too clever. They made something that was such a fantastic platform for innovation that it got adopted, proliferated, used, and expanded like crazy.”

The original design was flexible and robust enough to be used to build a vast array of services and facilities, all of which can work together because they adhere to certain standards. That’s the good news. The bad news is that, because there is no overall authority “in charge”, making a widespread change is difficult, even if essentially everyone agrees that it is needed.

The outstanding example of this is the change in Internet addressing from IPv4 (which gives us the familiar 32-bit IP address, usually written in “dotted decimal” notation as four octets, e.g., to IPv6, which uses 128-bit addresses. This obviously gives a much larger pool of potential addresses, and in fact the primary motivation for the change is the easily predictable exhaustion of the IPv4 address space within a few years. As of December 2008, the new IPv6 protocol had been on the “standards track” for ten years, but for the most part it has yet to be adopted. Instead, various technical workarounds (such as Network Address Translation) have been employed to stretch out the life of IPv4. Virtually everyone agrees that the adoption of IPv6 is necessary, but with respect to themselves and their organizations, they seem to be a bit like St. Augustine in his Confessions: “Grant me chastity and continence, but not yet” (da mihi castitatem et continentiam, sed noli modo).

Some folks, like Professor Doyle, quoted above, think that what is needed is a re-design of the Internet from the “bare metal” outwards, to take into account how its usage has developed:

“To the extent I’ve been working in this field for the last 10 years, I’ve been mostly working on band-aids. I’m really trying to get out of that business and try to help the people, the few people, who are really trying to think more fundamentally about what needs to be done.”

I can sympathize with the idea, although it would obviously be a mammoth undertaking, and one would be trying to re-design something that will not be obliging enough to stand still while the work is done.

More fundamentally, there is a basic issue here that I think may relate to other issues of our time.  The Internet is, arguably, the most complex artifact ever constructed.  It has flourished because of an odd combination of strict rules (the underlying, basic protocols) and anarchy (pretty much anything else).  The reality does not fit the picture of either a centrally controlled undertaking, or a complete free-for-all; there are elements of both.  It may be that there is a lesson here that is applicable to other areas.  It is not necessarily true that the only alternative to a completely laissez-faire economy is commissars and collective farms.  We need to think of ways in which we can beneficially combine elements of control and freedom, and not just cling to old ideologies because they are familiar.
