Backbone to the future

Sept. 1, 2001

SPECIAL REPORTS / Fiber and Cable

Traditionally, long-haul networks served as the proving ground for technologies that migrated to other parts of the network.

FRANK G. WAKEHAM, Corning Inc.

The telecommunications landscape has undergone a remarkable conversion in transport technologies over the last 20 years. At the core of these changes is fiber optics, perhaps the most significant development in communications since the invention of the telephone. Many fiber-optic technologies have proven effective in long-haul networks, then migrated in some form to shorter-distance applications. In the first commercial systems, however, the opposite was true.

After a lightning strike destroyed equipment and severely limited critical radio communications used by the Dorset police in Southern England, Standard Telephones and Cables installed the first fiber-optic communications system in September 1975. In 1977, live telephone traffic was transported over optical fiber in separate field trials held by AT&T, General Telephone and Electronics, the British Post Office, and others. These multimode systems, which provided the framework for the earliest long-distance systems, used LEDs or gallium arsenide lasers, transmitting at 850 nm over graded-index fiber.

At that time, scientists around the world recognized that lower attenuation and the elimination of modal dispersion would let a light source transmitting at 1,300 nm over singlemode fiber travel much farther, and at a greater data rate, than 850-nm sources over graded-index fiber. However, the challenges of coupling light into the relatively tiny core of singlemode fibers and the lack of dependable lasers at the longer wavelength outweighed the advantages.

The migration of fiber-optic technologies from long-haul applications to the metro, access, and premises networks:

  • Fiber type. Graded index → standard singlemode → dispersion-shifted → nonzero dispersion-shifted
  • Light source. LEDs → lasers
  • Wavelength region. 850 nm → 1,300 nm → 1,550 nm
  • Channel count. Single channel → coarse WDM → dense WDM
  • Signal strength. Frequent regeneration → erbium-doped fiber amplifiers (EDFAs) → Raman amplifiers with EDFAs
  • Modulation rates. 45 Mbits/sec → 180 Mbits/sec → 400 Mbits/sec → 2.5 Gbits/sec → 10 Gbits/sec → 40 Gbits/sec
  • Signal routing. Optical-electrical-optical conversion → optical add/drop multiplexing → optical switching → all-optical network

In 1980, AT&T filed plans with the Federal Communications Commission for the first large-scale fiber project. The 611-mile system would connect major cities along the Northeast corridor from Boston to Washington. Although AT&T was still committed to graded-index fiber for terrestrial long-distance, the designers planned to use three channels on each fiber, one of the earliest WDM applications. By the time the system was finally completed in 1984, AT&T had decided to use only two wavelengths, one at 825 nm and another at 1,300 nm.

This landmark system quickly became outdated, however. Improvements in singlemode-fiber coupling and advances in 1,300-nm lasers gave a leg up to competitors, making AT&T's Northeast corridor the last long-haul multimode system built in the United States.

In 1982, after securing 100,000 km of singlemode fiber, MCI began building a network along the same route. Transmitting at 1,300 nm over singlemode fiber, MCI's system carried data on a single wavelength 50% faster than AT&T's graded-index fiber carried it on two, and MCI increased repeater spacing from 7 to 30 km. MCI's network proved that singlemode fiber was superior to graded-index fiber for terrestrial long-haul applications, and other long-distance companies quickly shifted to singlemode fiber.

Meanwhile, transatlantic cable operators were losing market share in the late 1970s to satellite communications companies. Submarine coaxial-cable technology could no longer effectively compete, so cable operators turned to fiber for the needed capacity. In 1988, after years of planning and development by a consortium of companies led by AT&T, the first transatlantic fiber-optic cable (TAT-8) was completed and placed into service. TAT-8 consisted of three singlemode-fiber pairs operating at 1,300 nm with repeaters spaced about 60 km apart.

Standard singlemode fiber and 1,300-nm lasers had become the standard for long-haul systems by the early 1980s. Telephone and cable TV companies adopted that technology and installed fiber over much shorter distances to connect switching locations, central offices, headends, etc. By 1984, almost 250,000 miles of fiber cable were installed in the United States alone. The fiber-optic market was booming, and both fiber design and photonic technology were advancing at an incredible pace.

The emergence of the Internet and rising data traffic during the 1990s caused an explosion in the demand for bandwidth. The capacity of the existing infrastructure was strained and carriers were experiencing bottlenecks on critical routes. Companies poured money and research into finding ways to put more information on each strand of fiber and stretching the distance between optical-electrical-optical (OEO) conversions. New entrants seized the opportunity to take part in the lucrative high-capacity transport market by constructing new networks. The new players were determined to go beyond the limitations of the legacy systems already in place. In response, system planners once again redefined the long-distance architecture.

The minimum attenuation of standard singlemode fiber is about 0.25 dB/km and occurs near 1,550 nm, while its minimum dispersion, about 0 psec/nm-km, occurs near 1,310 nm. To enable higher modulation rates and greater regenerator spacing, long-distance carriers needed a way to minimize both attenuation and dispersion at a single wavelength.
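
As a rough, illustrative check on what those numbers imply, the Python sketch below estimates the loss-limited and dispersion-limited reach of an unamplified span. The 0.25-dB/km attenuation comes from the figures above; the power budget, bit rate, dispersion value, and source linewidth are assumptions chosen for illustration, not values from this article.

```python
# Back-of-the-envelope span estimates using the attenuation figure quoted above.
# The 0.25-dB/km value comes from the text; the power budget, bit rate,
# dispersion, and source linewidth below are illustrative assumptions only.

def loss_limited_km(power_budget_db, atten_db_per_km=0.25):
    """Longest unamplified span permitted by fiber attenuation alone."""
    return power_budget_db / atten_db_per_km

def dispersion_limited_km(bit_rate_gbps, disp_ps_nm_km, source_width_nm):
    """Rough reach at which chromatic dispersion spreads a pulse into the
    neighboring bit slot, allowing a total spread of half the bit period."""
    bit_period_ps = 1000.0 / bit_rate_gbps            # 400 ps at 2.5 Gbits/sec
    allowed_spread_ps = 0.5 * bit_period_ps
    return allowed_spread_ps / (disp_ps_nm_km * source_width_nm)

# Example: a 30-dB power budget, and a 2.5-Gbit/s signal on standard singlemode
# fiber at 1,550 nm (~17 psec/nm-km) from a source with a 0.1-nm linewidth.
print(loss_limited_km(30))                    # ~120 km, loss-limited
print(dispersion_limited_km(2.5, 17, 0.1))    # ~118 km, dispersion-limited
```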

An early answer was dispersion-shifted fiber (DSF), a technology first introduced in 1985. DSF was designed to move the zero-dispersion wavelength to 1,550 nm, where attenuation was already at its minimum. That offered great value for carriers transmitting a single channel on each fiber; it enabled them to increase their data rate and repeater spacing. Two very potent technologies soon followed, however, that proved incompatible with DSF.

DWDM and erbium-doped fiber amplifiers (EDFAs) became the fundamental drivers for the next generation of long-haul networks. Commercialized in the mid-1990s, the two technologies enabled carriers to pack far more data on each fiber and keep the signal in the optical layer for much greater distances before requiring regeneration.

Because EDFAs amplify across a broad band in the 1,550-nm window, DSF initially appeared to meet the requirements of long-distance, high-rate applications. But the higher launch powers made possible by optical amplifiers, together with the closely spaced channels DWDM requires, triggered non-linear effects that limited the promise of DSF. The most troubling of these was four-wave mixing, in which evenly spaced wavelengths mix and produce new frequencies that overlap with and drain power from the original signals.
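
The arithmetic behind that overlap is simple enough to sketch. The snippet below builds a hypothetical, evenly spaced four-channel grid (the frequencies are assumptions, not values from this article) and shows that the non-trivial mixing products f_i + f_j - f_k land squarely on the working channels.

```python
# Why four-wave mixing hurts most on an evenly spaced grid: any three channels
# i, j, k mix and generate a product at f_i + f_j - f_k. On a uniform grid many
# of those products land exactly on working channels. The frequencies below are
# a hypothetical 100-GHz grid (in THz), not values from the article.

from itertools import product

channels = [round(193.1 + 0.1 * n, 6) for n in range(4)]    # four channels

mixing_products = set()
for fi, fj, fk in product(channels, repeat=3):
    if fk != fi and fk != fj:            # skip trivial self-cancelling terms
        mixing_products.add(round(fi + fj - fk, 6))

overlapping = sorted(mixing_products & set(channels))
print(overlapping)    # mixing products that fall directly on live channels
```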

Four-wave mixing is most pronounced at the zero-dispersion wavelength, where it directly conflicts with efforts to minimize dispersion and get the most out of a fiber's transmission capacity. In 1994, nonzero dispersion-shifted fiber (NZDSF) was introduced. This design shifted the zero-dispersion wavelength to just outside the EDFA's gain band, striking an efficient balance between suppressing non-linear effects and keeping dispersion low enough to permit long-distance transmission at high bit rates.

As non-regenerated distances, channel counts, and bit rates continued to climb and migrate into the regional and metropolitan spaces, designers began developing fibers optimized for specific applications.

An NZDSF with a large effective area, designed specifically for long-haul carriers, reached the market in 1998. Because non-linear interactions are a function of the power density in the fiber, the larger effective area lowers the power density in the core and allows greater overall power in the fiber while minimizing the non-linear effects and performance degradation found in conventional NZDSF.
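
As a minimal sketch of that proportional argument, the snippet below compares the power density of a conventional NZDSF against a large-effective-area design. The effective areas and launch power used are illustrative assumptions, not published specifications.

```python
# A minimal comparison of power density for a conventional NZDSF and a
# large-effective-area design. The effective areas (55 vs. 72 square microns)
# and the 10-mW launch power are illustrative assumptions; only the
# proportional reduction in power density matters.

def power_density(launch_power_mw, effective_area_um2):
    """Average power density in the core; non-linear penalties scale with it."""
    return launch_power_mw / effective_area_um2

conventional = power_density(10, 55)    # mW per square micron
large_area   = power_density(10, 72)

# The larger effective area cuts the power density by roughly a quarter, so the
# launch power (and with it reach or channel count) can rise before non-linear
# effects become limiting.
print(conventional, large_area)
```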

An NZDSF specifically designed to meet the unique demands of metropolitan and regional optical networks reached the market in 2000. This fiber's negative dispersion in the 1,550-nm window enables it to carry signals over 400 km using less expensive lasers without incurring the cost of dispersion compensation or regeneration.

Researchers have made tremendous strides in increasing the non-regenerated reach and capacity of fiber. And that focus continues today.

Over the past several years, extraordinary advances have also been made in the end-to-end optical network, where signals are added, dropped, routed, combined, and split, all in the optical layer. Advances in WDM, add/drop multiplexing, optical crossconnects, and optical switching enable the transmission and routing of signals packaged in wavelengths, which eliminates the burden of OEO conversion. This technology, although still emerging, is migrating from long-haul faster than earlier innovations. In many cases, the technology is being developed and used concurrently in multiple application spaces.

Before 1977, nearly all communications were carried over copper, from long-distance circuits to the customer premises. Because of the relentless demand for increased bandwidth and the tremendous advances in fiber optics, copper is steadily being replaced by fiber. As this transition occurs, much of the technology that originated in long-haul applications is migrating into the regional and metro spaces.

Single-channel, non-amplified transmission at 1,300 nm over standard singlemode fiber was proven in the long-haul market in the 1980s, then quickly adapted for regional and metro applications. Long-distance carriers successfully demonstrated multiple-channel, amplified transmission in the 1,550-nm region over NZDSF in the 1990s. These technologies are now replacing the first generation of regional and metro systems. We are also beginning to see coarse WDM systems in the premises and access spaces, 1,550-nm transmitters, and even some advanced fiber types and laser-optimized multimode fibers.

Many technologies have started the march from long-haul to metro to access and premises (see Figure). Some of these technologies may not complete the journey. As these technologies mature, and the demand for bandwidth climbs, we will continue to witness the technology voyage through the application spaces, a course charted by the earliest long-distance systems.

Frank Wakeham is a market development specialist at Corning Inc. (Corning, NY).
