20 years in IT history: Connectivity

28.09.2007
Throughout the late 1980s, microcomputers had been understood as toy versions of mainframes and minis; they were standalone devices that attacked problems with processing cycles. In the quaint locution of the day, they were "artificial brains."

Gradually, the devices acquired a different function. They became smart links, machines that connected devices, data and people. They went from being computing machines to connection machines. Much of the history of the last 20 years can be written in terms of who got this and who did not.

1987: PS/2

IBM's rollout of its PS/2 microcomputer came on two levels, both newsworthy. The ads raved about the classy technical specs: a blazingly fast internal architecture, plug-and-play BIOS, keyboard and mouse interfaces that are still in use today (and are still called the PS/2 interface) and a floppy disk format (1.44MB) that was so good it lasted as long as the technology did.

The analysts saw a different message: IBM had decided to shoo the children away. Where the PC had been wide open, the PS/2 was buttoned tight. Every aspect of it was proprietary, including the operating system. Businesswise, the job of the PS/2 was to yank the rug out from under both the clone manufacturers and that upstart, Microsoft.

There was no reason to bet against IBM. It had the classiest brand, an immense promotional budget and some of the best engineers in the world. Yet, incredibly, after several years of very expensive triage, the PS/2 initiative crashed and burned. The failure was a body blow to IBM and its standing in the industry.

What went wrong? The fingers of blame pointed in every direction (silly ads, pricing), but the truth is the PS/2 was the wrong product for a market coalescing around connectivity. Sizzling performance is nice but not essential in a connector, because performance is measured against the entire system, not any one part. Blatant assertions of ownership ("this is my toy") threatened compatibility, the key virtue in a connection machine.

IBM failed to understand the important difference between a connection machine and a computing machine. And it paid the price.

1988: NeXT and OOPs

The connectivity story continued with Steve Jobs's NeXT. When you bought a NeXT, you got a piece of great (but completely closed) hardware (which, of course, looked totally cool) and an operating system built around a programming philosophy new to micros: object-oriented programming (OOP).

Steve Jobs's NeXTCube

Where traditional programming focused on logical operations (computing), OOP's great strength was the management of categories or classes, including hierarchies of classes. This made it possible to write programs that pulled more kinds of things (humans, structures, classes, data) into a given computing environment without forcing the programmer to rewrite these environments from scratch. OOP was a programming language for connection machines.
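
To make the idea concrete, here is a minimal sketch in Python, itself an OOP language; the class names and behaviors are invented for illustration, not drawn from NeXT's toolkit.

    # Illustrative only: a tiny class hierarchy. New kinds of objects slot in
    # without rewriting the code that already works with the base class.
    class Device:
        def __init__(self, name: str):
            self.name = name

        def send(self, message: str) -> str:
            return f"{self.name} relays: {message}"

    class Printer(Device):
        # A new category of device joins the hierarchy and overrides behavior.
        def send(self, message: str) -> str:
            return f"{self.name} prints: {message}"

    class Workstation(Device):
        pass

    # Code written against Device works with every subclass, present or future.
    for device in (Workstation("ws-1"), Printer("laser-2")):
        print(device.send("hello"))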

The totally cool but closed hardware totally failed, while OOP went on to become bigger than the NeXT computer itself could ever have been. Today, many important computing languages (Java, C++, Perl and Smalltalk, among others) come with an OOP toolbox.

1989: NetWare 3

The first customers of micros, largely programmer types, found ways to use their new toys on the job. As collections of these machines aggregated at various institutes and centers, the idea inevitably occurred to their owners that it would be neat to be able to hook everybody's micro together (and to the main system).

Ethernet coinventor Bob Metcalfe

With every passing year, the amazing power of connectivity was becoming more evident. At least in theory, every device you plugged into the network inherited the assets and resources of every other machine on that net. In 1980 Ethernet co-inventor Bob Metcalfe took a stab at quantifying the gains from networking by proposing a law: the utility of a network goes up with the square of the number of devices connected to it. While people did and do argue over whether Metcalfe got the exponent precisely right, nobody doubts that he nailed the spirit of the thing.
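
A back-of-the-envelope sketch of where the exponent comes from (one common reading of the law, not Metcalfe's own derivation): with n devices there are n(n-1)/2 possible pairwise links, which grows roughly with the square of n.

    # Illustrative sketch: count the possible pairwise connections in a
    # network of n devices; the count grows roughly with the square of n.
    def possible_links(n: int) -> int:
        return n * (n - 1) // 2

    print(possible_links(10))    # 45
    print(possible_links(100))   # 4950 -- about 110x more links for 10x the devices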

But getting stability, predictability and compatibility out of a grab bag of machines, themselves in constant flux, was not easy. Novell had been working on the problem since 1983. By 1989 enough hair had been trimmed from the software that people of reasonable skill could use it. NetWare 3 was networking for the rest of us, and it was optimized for Intel's very popular 386 processor. As this combo spread throughout the world, it took with it the gospel of connectivity, leaving hosts of beleaguered CIOs struggling to migrate their systems from the quiet world of host/terminal to the mosh pit of client/server LANs.

1990: Archie

In the early years of the Internet, the connection between users and resources was quite informal. If A wanted a specific kind of file or program, he asked around, hoping that someone had seen something like that somewhere and remembered the address. If B wrote a cool program that she thought others might like, she tried to find ways to spread the news. This was not ideal, but in the early days of the Net everybody knew everybody else (practically), so the problem was not acute.

But by 1990 the community was expanding rapidly and finding stuff was getting harder. That year three McGill students, Alan Emtage, Bill Heelan and Peter J. Deutsch, attacked the problem with a program they called Archie (from "archive"). Archie worked by sending a message from your local system to each entry on a list of servers, asking for the public files available on that server. It would then combine the responses into a single master list on your local system, which you would then interrogate with "Find" commands. Archie was crude, but it illustrated two big points about networking.
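
The pattern is simple enough to sketch in a few lines of Python; the server names and the fetch_listing stand-in below are invented, since the real Archie queried anonymous FTP sites.

    # Hypothetical sketch of the Archie pattern: poll known servers for their
    # public file listings, merge the answers into one master list, then search it.
    from typing import Dict, List

    def fetch_listing(server: str) -> List[str]:
        # Stand-in for asking an anonymous FTP server for its public files.
        fake_archives = {
            "ftp.site-a.example": ["games/rogue.tar", "docs/rfc1097.txt"],
            "ftp.site-b.example": ["tools/gzip.tar", "games/hack.tar"],
        }
        return fake_archives.get(server, [])

    def build_master_list(servers: List[str]) -> Dict[str, str]:
        master = {}
        for server in servers:
            for path in fetch_listing(server):
                master[path] = server
        return master

    def find(master: Dict[str, str], keyword: str) -> List[str]:
        # The "Find" step: interrogate the combined list on the local system.
        return [f"{server}:{path}" for path, server in master.items() if keyword in path]

    master = build_master_list(["ftp.site-a.example", "ftp.site-b.example"])
    print(find(master, "games"))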

First, connectivity is self-extending; it creates entirely new objects, which can themselves become subject to connectivity. And if you connect A, B and C, you can create AB, BC, AC, ABC and so on. These newly created objects might be more useful than A or B or C. The master list generated by Archie was the first step in the evolution of the Internet from a network of networks to a library of resources.

Second, on a network, digital resources can be reused, over and over, forever, at next to no additional cost. Put a search engine on that network and you allow this efficiency to scale without limit. This fact would turn out to have huge economic consequences.

1991: Linux

Linus Torvalds parties at the University of Helsinki.

A student at the University of Helsinki named Linus Torvalds released a half-finished operating system, hoping that a few hands might be willing to help out. To his surprise, he found hundreds and then thousands of programmers willing and able to work on the program, which he named Linux. As it turned out, a large network is perfect for supporting projects that are themselves networks, projects made up of pieces that can be worked on in isolation and then combined...over the network. These types of enterprises are enormously efficient, leveraging small investments in time and energy by many people into highly useful (and usually free) tools. Linux was one of the first of these massively parallel collaborations, but soon enough they would sprout up everywhere, from cartography ("mashups") to encyclopedias. And the Web itself.

1992: Windows

A very young Bill Gates holds a Windows 1.0 floppy disk.

In 1992 Microsoft finally got a functional version of its latest operating system out the door. Windows 3.1 advanced the art in two ways: it was the first version to carry a useful graphical interface, allowing inputs and outputs to be represented and altered by manipulating icons. And, more important, Microsoft's immense marketing power meant it went onto desktops everywhere in the world, becoming a de facto standard.

1993: Mosaic

Mosaic was released by the National Center for Supercomputing Applications. What Windows 3.1 was to the microcomputer, Mosaic was to the World Wide Web. Together, they acted to standardize the Internet, allowing all those Windows 3.1 installations (and other compatible machines) to talk to each other with reasonable levels of predictability and stability.

Metcalfe's law is not automatic. As networks grow, the potential to do more for less rises, but this benefit remains theoretical until the network has passed through a phase of greater formalization. As a systems scientist might put it, the standardization of the core goes hand in hand with differentiation of the edge. Each advances the other. No better illustration of this point exists than the two episodes of standardization above, which kicked off the immense flowering of Internet content known as the World Wide Web.

As the Web appeared, seemingly out of nowhere, people became convinced that something revolutionary was under way. CIOs everywhere arrived at work to find a new item in their job description: responsibility for getting their company a website, beginning with registering the company's name as a URL, and weighing the delicate ethics of swiping those of their more laggard competitors.

1994: Spam and More

Connectivity, we learned, has a dark side.

In 1994 two lawyers began advertising their services by posting to Usenet groups en masse. They were widely reviled (their ISP revoked their access), though in all fairness, someone was going to walk through that door sooner or later. "Green Card" spam (the villains were immigration lawyers) was the opening gun of the age of malware for profit, which eventually evolved into hundreds of flavors of spyware, extortion schemes, Trojan horses, key loggers, zombies, phishers, bots and so on. Today the average CIO probably spends more time and energy worrying about blocking the bad that networks can do than extending the good.

1995: Convergence

The Israeli company VocalTec announced Internet telephony, and RealAudio brought streaming audio. These two announcements marked the beginning of the great convergence carnival.

The VocalTec rollout presaged the struggle that VoIP was about to catalyze between telecommunications and IT. The core idea of convergence is that someday soon the network is going to eat it all up: voice, music, video, news, data. Everything will be connected to everything else. It's inevitable, but that doesn't make dealing with the business, legal, political and technological issues all this raises any easier.

1996: The Dotcoms

Sun Microsystems formed the JavaSoft group in order to develop the Java technology. Java, a language optimized for writing programs intended to run over a network, was (and is) a big deal, but the news of the year was not technical but cultural. This was the year when irrational exuberance slid behind the wheel, the year the dotcom balloon broke free of its moorings on planet Earth.

Much of the fever came from the spreading conviction that old business models were dying: Why would anyone ever want to go to a store anymore? How could a business compete if it was carrying the overhead of a brick-and-mortar shop? All this meant that anyone wanting a return on his investment had to find a place to park it in cyberspace. Somewhere. Anywhere.

1997: Distributed Computing

Jeff Lawson of Distributed.net showed how the Internet could be used to harness a very large number of geographically dispersed microcomputers to attack a single problem, in this case a ciphertext released as a challenge by RSA (with a US$10,000 prize attached). Today distributed nets are being used to attack protein folding, the search for extraterrestrial intelligence, financial modeling and many other problems. Under the name grid computing, the concept has become a small but important industry, offering companies needing lots of cycles a cheap alternative to supercomputers.
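
The general pattern is easy to sketch; the key size, chunking scheme and search function below are invented stand-ins, not distributed.net's actual protocol.

    # Toy sketch of the distributed pattern: carve a keyspace into work units,
    # hand them out, and collect results. In the real system each unit went to
    # a different volunteer machine over the Internet.
    from typing import Iterator, Optional

    SECRET_KEY = 48_611  # stand-in for the unknown challenge key

    def work_units(keyspace: int, chunk: int) -> Iterator[range]:
        for start in range(0, keyspace, chunk):
            yield range(start, min(start + chunk, keyspace))

    def search_unit(unit: range) -> Optional[int]:
        # Stand-in for testing each candidate key against the ciphertext.
        for key in unit:
            if key == SECRET_KEY:
                return key
        return None

    for unit in work_units(1_000_000, 50_000):
        found = search_unit(unit)          # imagine each call running on a different PC
        if found is not None:
            print("key found:", found)
            break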

1998: XML

XML arrived: a markup language optimized for the Internet, supporting most known human scripts and compatible across a wide range of languages and platforms, it increased the power and capacity of the Net.
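
To see why that portability mattered, here is a tiny, made-up XML document parsed with Python's standard library; any language with an XML parser can read the same bytes.

    # Illustration only: the document and element names are invented.
    import xml.etree.ElementTree as ET

    doc = """<catalog>
      <book lang="en"><title>Connectivity</title><year>2007</year></book>
      <book lang="fr"><title>Connectivité</title><year>2007</year></book>
    </catalog>"""

    root = ET.fromstring(doc)
    for book in root.findall("book"):
        print(book.get("lang"), book.findtext("title"), book.findtext("year"))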

1999: Wireless and Y2K

Time magazine named Jeff Bezos of Amazon its "Person of the Year," writing that "e-business is rapidly replacing the traditional kind for almost any purchase you can imagine."

Also, on July 21, Steve Jobs demoed Apple's AirPort, the first cheap wireless networking hardware. Wireless networking did not take the world by surprise. For years everyone had understood that the need to embody connectivity in physical wires was an immense constraint on the growth of networking (and a fatal one, in the case of mobile devices). People had been hammering away at the problem for at least a decade, and a few very expensive solutions were running here and there.

What was different about Apple's AirPort was that it was cheap enough for mass adoption. Over the next several years, wireless LANs began to crop up everywhere. They didn't necessarily work perfectly; the technology came with many headaches, beginning with security and dependability, and CIOs were to spend many hours hammering out the bugs. They did not, however, do much of that in 1999, for that year CIOs were preparing for the imminent end of civilization, generally known as the Y2K bug.

2000: Millennial Change and Angst

First, Y2K went off without a hitch, proving that luck is on the side of those smart enough to be working on well-posed problems and establishing ERP as the way businesses organized themselves.

Second, an Internet company (Google) developed a well-grounded solution to the problem of making money over the Net.

Third, the culture decided that the Internet did not mean the end of business as we had known it, and everyone rose, stretched and sold off their tech stocks. Good-bye, Pets.com, Chipshot.com, and the rest.

2001: Blogs

People had been writing diaries on the Net for years, but the form had never taken off. In the late 1990s, editors appeared with several subtle enhancements, including browser-based website editing, comments, permalinks, blogrolls and trackbacks. These fixes turned Web diaries into blogs, and by 2001 blogs had become one of the great networking phenomena of the age.

The development of blogs illustrates a subtle point about connectivity. Conventional measurements of networks count nodes and bandwidth, but connectivity has at least a third dimension: adaptedness. Every object in a network has a trajectory of enhancements that allow it to work better and do more in a networked environment. One of the several ways in which connectivity is self-extending is that it provides an environment that selects for greater adaptivity to networking. As objects move down this path, as they mature, connectivity surges, even if nodes and bandwidth stay the same...which, of course, they never do.

2002: Sarbanes-Oxley

A number of accounting scandals from leading-edge tech companies (Enron, WorldCom, and the like) led to legislation designed to remake the financial reporting practices of public companies from top to bottom. While Sox, as the act came to be known, explicitly targeted the behaviors of CEOs and CFOs, it probably changed the lives of CIOs as much or more.

Sox required that every act in a company's financial life be documented and that every document be auditable, forcing CIOs to supervise a massive increase in documentation and in the control of that documentation. Change management, in particular, went from something the CIO could do on his or her own in an afternoon (for reasons best known to the CIO) to an agenda item for the Change Management Committee.

The scary part is that, given how integrated IT has become with financial reporting, if a CEO or CFO were to be indicted for Sox violations, the CIO could be sucked into the same prosecution as a co-conspirator.

On the other hand, CIOs are now right at the heart of the enterprise. The bean counters used to complain that IT was all cost and no benefit. Thanks to Sox, IT can now point to a benefit the most obtuse bean counter is likely to appreciate: keeping him out of jail.

2003: Virtualization

Imagine you have two (or more) IT objects, A and C. You want to hook A and C together so they can send signals to each other. Alas, they are incompatible, perhaps because they come from different manufacturers. Virtualization is the business of making a third object, B, that you slip between A and C to fool each of the original objects into thinking the other is speaking its own language, creating compatibility where before there was none.

The IT objects could be anything at all: servers, operating systems, routers, applications, hard disks, caches, whatever. Virtualization allows you to hook anything up to anything else and force the combo to work harmoniously.
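
A stripped-down sketch of that middle object B, in Python; the two incompatible interfaces here are invented for illustration and stand in for anything from storage protocols to guest operating systems.

    # Illustration only: two made-up, incompatible interfaces and a shim (B)
    # that lets A talk to C as if C spoke A's language.
    class LegacyStore:                 # "C": expects write_block(offset, bytes)
        def __init__(self):
            self.blocks = {}
        def write_block(self, offset: int, data: bytes) -> None:
            self.blocks[offset] = data

    class AppExpectsFiles:             # "A": wants to call save(name, text)
        def __init__(self, backend):
            self.backend = backend
        def run(self):
            self.backend.save("report.txt", "quarterly numbers")

    class VirtualFileLayer:            # "B": the virtualization shim in the middle
        def __init__(self, store: LegacyStore):
            self.store, self.next_offset = store, 0
        def save(self, name: str, text: str) -> None:
            # Translate A's file-style call into C's block-style call.
            self.store.write_block(self.next_offset, f"{name}:{text}".encode())
            self.next_offset += 1

    app = AppExpectsFiles(VirtualFileLayer(LegacyStore()))
    app.run()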

Starting in 1999, a company named VMware committed itself to the technology. The early years were slow. People complained that everything had to be done twice (first by A or C and then by B), which meant that everything took twice as many cycles and burned up twice as many resources. The process added complexity. But by 2003 the world was beginning to understand how versatile and powerful a solution this was. One of the signs of this dawning comprehension came at the end of 2003, when EMC, a huge storage company, bought VMware. In 2007, VMware went public and, in a generally listless market, had the biggest tech stock debut since Google. Virtualization had arrived.

2004: ERP Hangover

In 2004 (or thereabouts), enterprise resource planning (ERP) fell off the hype cliff and (perhaps this is the fairest way to put it) came to be judged on the net of its positives and its negatives.

ERP is the art of framing a single formal definition for every object and act in a company so that everything can be managed together, top down. For instance, pre-ERP, each department or division in a company usually defined the term "employee" differently. These differences might be tacit and hard to define and perhaps not even known to top management, but they would usually matter. Once ERP came to that company, "employee" would mean the same everywhere, and every aspect of that identity would be explicit and transparent. There would be one database for the entire company and one interface to that database. A manager setting policies for "employees" would know exactly what he or she was doing. ERP is an instrument for bringing companies to a higher degree of integration.
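
One way to picture that single definition is as one record type that every module shares; the Python sketch below, with invented fields, illustrates the idea rather than how any ERP package is actually built.

    # Illustration only: a single, company-wide definition of "employee" that
    # payroll, HR and facilities all import, rather than each keeping its own.
    from dataclasses import dataclass
    from datetime import date

    @dataclass(frozen=True)
    class Employee:
        employee_id: str
        name: str
        department: str
        hired: date
        full_time: bool

    # Every module reads and writes the same shape, so a policy about
    # "employees" means the same thing everywhere.
    e = Employee("E-1001", "A. Moreno", "Finance", date(2003, 5, 12), True)
    print(e.department, e.full_time)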

The great virtue of ERP lies in how well it supports compliance with companywide policies. A given change just radiates across the company, with every division learning about it at the same time and in the same way. In the case of Sarbanes-Oxley, which mandates a specific framework for financial reporting, ERP seems essential to getting to compliance at all.

All good. However, as experience with the technology accumulated, downsides swam into view, among them a loss of flexibility and weakest-link exposure: if one department enters information inaccurately or imprecisely, everybody suffers. There were others.

ERP is connectivity taken to the extreme, and while it has applications that are important and useful, it also teaches that there are limits. Connectivity is not the solution to all problems.

Sometimes it is even best avoided.

2005: Multicore Processors

AMD, the microprocessor manufacturer, announced the first multicore microprocessor. Shortly thereafter, Intel followed suit.

Multicore computing is understood as a new solution to the problem of improving processor performance, but it might be much more than that.

For decades computer scientists have known there's an alternative to traditional computing: having many processors working on the same problem at the same time. But programming for parallel processing is much harder than programming for a single processor, and that difficulty has discouraged us from exploring that technology.

Truth is, we never really had to go there, because single processors got faster so quickly that an alternative never seemed necessary.

However, ironically, it's the need for processor speed that is now forcing us to figure out parallel computing. The faster processors run, the more power they consume and the more heat they generate. Both of these are limiting factors. Because multicore gets its edge by running more processors, not faster ones, it allows the chip to stay cool and energy-efficient. Many analysts expect multicore to dominate processor design from now on, with the number of cores per chip rising steadily as we get better at solving the programming problems presented by this new architecture. A decade from now parallel programming will be the standard, and perhaps we will be a lot closer to matching the skills of the human brain.
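
For a feel of what "more processors, not faster ones" asks of the programmer, here is a minimal sketch using Python's standard library; the workload (summing squares) is just a stand-in for any CPU-bound job that splits into independent pieces.

    # Minimal sketch: split a CPU-bound job into independent pieces and run
    # one piece per core. The workload here is a stand-in, not a benchmark.
    from multiprocessing import Pool, cpu_count

    def sum_squares(bounds):
        lo, hi = bounds
        return sum(n * n for n in range(lo, hi))

    if __name__ == "__main__":
        total, cores = 10_000_000, cpu_count()
        step = total // cores
        pieces = [(i * step, total if i == cores - 1 else (i + 1) * step)
                  for i in range(cores)]
        with Pool(cores) as pool:
            partials = pool.map(sum_squares, pieces)   # one piece per core
        print(sum(partials))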

2006: The Network

The growth in average traffic level (75%) outpaced the growth of capacity (47%) on the world's Internet backbones for the third consecutive year.

For the last few decades, the world's networking engineers have performed prodigies in building track just in front of the advancing locomotive of Internet traffic. But recently the locomotive has been gaining. We might be just a few years away from a new kind of Internet, one in which applications are triaged, bandwidth is metered and everybody has to make do with performance levels far below ideal.

While there are many culprits, the biggest appetite out there belongs to video. (In March 2007 more than 100 billion videos were watched by users in the U.S. alone. That's a lot.) Almost every day brings news of a new Internet video application, from movies on demand to TV-over-IP to civic or educational applications of YouTube. NBC has just announced that it is planning to use the Internet to carry more than 2,000 hours of Olympics coverage next summer, and NBC will not be the only network streaming from Beijing.

Comparable developments are unfolding in many enterprises as video is used for more functions, from videoconferencing to speeches by management to remote attendance at important conferences. Eventually "bandwidth rationing" will probably arrive even in companies with 10-gigabit LANs, and you can guess on whose shoulders the task of imposing that rationing is going to fall.

That's right. The CIO.

2007: The iPhone

The iPhone: the last word in connectivity? Or the beginning of a new chapter? We'll see.

Steve Jobs demonstrates that he finally and totally and conclusively gets the difference between machines that compute and machines that connect.