Monday morning, my publisher Paul Becker got up at an ungodly
hour to pick me up at 7 A.M. and bring me to breakfast. One of the
secrets to supporting yourself as a writer is to get your publisher to
buy you as many meals as possible, although with Prentice Hall this
is certainly no guarantee of fine dining.
Paul drove me up towards Harlem, near my first appointment at
Columbia University. We found a cheap diner and I had my first
American breakfast in weeks, greasy eggs and limp bacon.
I grabbed my bags and carried them across the street to the
Seeley W. Mudd building, home of Columbia's
Center for Telecommunications Research
(CTR). CTR is one of six NSF-sponsored Engineering Research Centers and specializes in optical networks.
The main CTR project is called
Acorn or Terabit, depending on
whom you ask. The Terabit name aptly summarizes the goal of the
project, to develop a scalable optical network providing gigabits to
the user and terabits of aggregate bandwidth by exploiting the full
capacity of optical fiber.
I rode up to the 12th floor of the Mudd building, thinking what
a silly ring the name had, and reminding myself never to allow a
building to be named after me. Not that the subject has come up
frequently.
Since I was early, I went into the office of W. Clayton Andrews
to wait for him, resisting the temptation to walk over to the workstation and Telnet to Boulder to read my mail. A few minutes later,
Clayton Andrews came in.
A distinguished-looking professor, Andrews was formerly director of an important IBM laboratory in Switzerland. He had joined
CTR as an associate director, working alongside such optical luminaries as founder Mischa Schwartz and the current director, Anthony Acampora.
Acorn had been successfully deployed in the laboratory when I
visited, and Andrews and others were making plans for further deployment into the real world. They were pursuing two different
sites to test the hardware and software.
One would use the dark fiber that Teleport had deployed in
lower Manhattan. The other was the Columbia campus network, which was getting ready to lay multimode fiber around campus. CTR was trying to get some single-mode fiber pulled at the
same time, because of the minuscule marginal cost.
Acorn is based on the linear lightwave network (LLN) concept
developed by Thomas Stern, another Columbia professor. The LLN
uses different wavelengths of light on the fiber for different signals. By combining wavelength multiplexing with a star-like topology, circuits
can be set up between different nodes on the LLN.
The architecture is scalable, allowing stars of stars to be built,
potentially supporting a network of a million or more nodes. Circuits are established by electro-optical filters located
throughout the LLN, which set up a path from a source to the
destination.
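The wavelength-routing idea can be sketched in a few lines of Python. This is my own toy model, not CTR's software, and the node names and wavelengths are made up: a circuit between two nodes claims a wavelength that must be free on both legs of the path through the star.

```python
# Toy model of circuit setup on a wavelength-routed star network:
# every node has one fiber to the hub, and a circuit between two
# nodes needs a wavelength that is unused on both of those fibers.

class StarLLN:
    def __init__(self, nodes, wavelengths):
        self.wavelengths = list(wavelengths)
        # wavelengths already in use on each node's fiber to the hub
        self.in_use = {n: set() for n in nodes}

    def set_up_circuit(self, src, dst):
        """Claim the first wavelength free on both fibers, or None if blocked."""
        for wl in self.wavelengths:
            if wl not in self.in_use[src] and wl not in self.in_use[dst]:
                self.in_use[src].add(wl)
                self.in_use[dst].add(wl)
                return wl
        return None  # blocked: no common free wavelength

net = StarLLN(["A", "B", "C"], ["1310nm", "1550nm"])
print(net.set_up_circuit("A", "B"))  # -> 1310nm
print(net.set_up_circuit("A", "C"))  # -> 1550nm (1310nm busy on A's fiber)
print(net.set_up_circuit("A", "B"))  # -> None (A's fiber exhausted)
```

The "stars of stars" scaling falls out naturally: a hub in this model could itself be a node on a larger star.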
Networks that actually do something are certainly more useful
than those that run test patterns, so Acorn is pursuing a half-dozen
test applications to help shake out the network. At the edge of the
LLN is a Network Interface Unit (NIU). The NIU links up
to the user workstation using local interfaces such as HIPPI.
Andrews brought me into the laboratory to see the prototype
network. The entire network was on a single laboratory table, with
a few racks of electronics located nearby to test and drive the network. The system had been tested with the transmission of High
Definition TV over the network, using ATM cells to move data at
800 Mbps. The network had also been tested at the full speed of 1
Gbps.
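The cell arithmetic behind that figure is easy to check. These are my own back-of-the-envelope numbers, assuming the standard 53-byte ATM cell with a 48-byte payload; the source gives only the 800 Mbps rate.

```python
# How many standard ATM cells does an 800 Mbps HDTV stream take?
# A cell is 53 bytes on the wire, of which 48 bytes carry payload.
payload_bits = 48 * 8                  # usable bits per cell
cells_per_sec = 800_000_000 / payload_bits
line_rate = cells_per_sec * 53 * 8     # bits/s actually on the wire
print(f"{cells_per_sec:,.0f} cells/s, {line_rate / 1e6:.0f} Mbps line rate")
```

So an 800 Mbps payload stream works out to roughly two million cells per second, and the 5-byte cell headers push the raw line rate toward 900 Mbps, close to the network's 1 Gbps limit.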
The Acorn project would deliver this gigabit capacity to the door
of several organizations on campus, who would then work with
CTR to try and exploit the bandwidth to do something useful.
The applications included classical gigabit projects such as medical imaging and quantum chemistry. It would also bring bandwidth
to places like the law school, which was trying to move large
amounts of data around for full text search and retrieval. Of course,
the bottleneck in the law school application would probably be the
complicated keyword searching of large optical jukeboxes, not the
network.
Another application, possibly the most useful, was a proposal by
Teachers College at Columbia University to use computers
more in K-12 programs. Teachers College hoped to link two schools
in New York, one in the inner city and the other an exclusive private
school, to multimedia resources.
In 1990-91, a prototype of the system was put in at the private
school, allowing sixth-graders to work with a CD-ROM containing a data base on ancient Greek culture called the Perseus Project. The software allows groups of students to perform a group
excavation of a hypothetical site in Greece.
The Acorn project would set up a library of resources at Columbia and make them available to students at the schools. The project
would buy as many off-the-shelf resources as possible and store
them in the library. When a user wanted to access a resource, the
data would be retrieved, compressed, and sent over the network to
the user's workstation.
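The retrieve-compress-send path is simple enough to sketch. This is my own illustration of the idea, not the Acorn library's software; the resource name and library contents are invented.

```python
# Sketch of a library server's data path: look up a stored resource,
# compress it, and hand the bytes to the network layer.
import zlib

LIBRARY = {"greek-culture-cd": b"excavation records ... " * 200}

def serve(resource_name, send):
    """Retrieve a resource, compress it, and transmit it via send()."""
    data = LIBRARY[resource_name]      # retrieve from the library
    payload = zlib.compress(data)      # compress before transmission
    send(payload)                      # ship over the network
    return len(data), len(payload)

sent = []
raw_size, wire_size = serve("greek-culture-cd", sent.append)
print(f"sent {wire_size} bytes for a {raw_size}-byte resource")
```

Compressing before transmission trades workstation CPU for network capacity, a sensible bargain even on a gigabit network when the resources are images or video.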
Throughout the twelfth floor, we saw laboratories filled with
esoteric-looking hardware. This was definitely a hands-on research
center. In one room was a molecular beam epitaxy system, in another a testing laboratory for HDTV, and in a third an experiment to
provide different guaranteed classes of service on complex, high-speed networks.
The real, operational nature of CTR was a sharp contrast to many
computer laboratories, where hands-on usually meant typing on a
keyboard. It was an even sharper contrast to my afternoon appointment at the
Columbia Institute for Tele-Information
(CITI), located
in the Business School.
CITI, as you might guess from a place with Tele-Information in
the name, is not the kind of place with lots of computers. Instead,
the offices are stuffed with paper, books, and loads and loads of
reprints of CITI publications.
CITI is an economists' think tank focused on networks. The founder is
Eli Noam,
a lawyer and an economist. A former member of the
New York State Public Service Commission, he is one of those ubiquitous figures in the discussions that hover around the FCC and set
much of our policy for public networks.
CITI had been trying to drag me into one of their policy studies,
a grandiose project called Private Networks and Public Objectives.
My contact was an ebullient young Irish Marxist,
Áine M. Ní
Shúilleabháin, an able policy analyst putting in time at CITI to pay
the rent.
Private Networks and Public Objectives is characteristic of the
studies conducted at think tanks like CITI. Lots of people attend a
few conferences and seminars and some papers are prepared. The
papers are then sold as part of the huge Working Papers series.
Then, after a little massaging, some of the papers are collected together and published as a book.
This particular study was going to look at a taxonomy of networks, develop a general research framework, turn that into a conceptual model of networks, and finally come up with a policy
agenda for the future.
The problem CITI was having was that they were getting lots of
speakers, but very few of them knew much about networks.
Granted, many were nominally specialists in networks, but most were economists specializing in issues like public utility pricing models. Occasional terms like "star topology" or "coaxial-fiber hybrid system"
would enter the conversation, but the general discussion was a bit
removed from reality.
Áine was certainly aware that in discussing the policy fate of
technical networks, it helped to know a bit about the technical nature of those networks. She had been trying hard to recruit some
more technical people to participate and help lend a note of reality
to the process.
I ended up waffling out of the study. Having been trained as an
economist and then escaped, I was not eager to get back into the
world of simplistic theoretical models based on unrealistic assumptions. Besides, I had to finish my trip and then get ready for two
more circuits around the world.