Characterization is key to exploiting fiber potential

The dizzy days of the telecom boom saw much talk of 40-Gbit/sec transmission. A number of laboratory systems were demonstrated, often running over spools of the latest cutting-edge optical fiber. There were even a few field trials, usually over modern fiber infrastructure, but no commercial deployments of 40-Gbit/sec systems across an older fiber base.

In the sobering-up period that followed the telecom crash, no one dared to mention 40 Gbits/sec; it would have been too much like playing very loud music on the morning after a heavy party session. But at the 2005 ECOC trade show in Glasgow, the talk was returning to 40-Gbit/sec systems. And in recent years, a number of technologies that address the technical challenges of 40-Gbit/sec transmission over older fiber have been quietly maturing in the background. Such developments include forward error correction, advanced modulation formats, and adaptive dispersion compensators.

So, with new and improved tools available to help to overcome 40-Gbit/sec signal impairments, as well as the availability of lower-cost components, could the deployment of 40-Gbit/sec systems soon become a cost-effective option? Perhaps now is the time for operators to start taking 40 Gbits/sec seriously and assess whether their installed fiber infrastructure is capable of supporting 40-Gbit/sec transmission and other next-generation optical networking technologies.

The latest generation of optical fibers has been specifically designed to cope with the stringent requirements of next-generation optical networks. They boast very low levels of polarization-mode dispersion (PMD), can handle high optical power, and have chromatic dispersion characteristics optimized for DWDM operation. The performance of such fibers is specified over broad wavelength ranges (S-, C- and L-bands for G.656 fiber) with loss specifications that often span 1260 to 1625 nm (G.652.C and D).

Operators in the fortunate position of being able to install brand new fiber in their networks have little to worry about, provided, of course, that the installation process is quality controlled. However, in reality, most telecom networks contain a mixture of various fiber vintages, possibly from a number of suppliers and sometimes including different types of singlemode fiber.

Complicating matters further, the original installation may not have been performed to current standards, while the infrastructure may have deteriorated over the years. As a result, it’s far from certain that next-generation technologies can be deployed with confidence over existing infrastructure. And digging up the streets to install new cable is an expensive option that few telecom companies can afford nowadays.

In many areas of Europe, there is already plenty of installed fiber, much of which is not yet lit. Wherever possible, operators are using this dark fiber to create new networks and new links, even if it means stitching together a link using different suppliers for different sections of the route. However, this approach can lead to uncertainties with respect to the end-to-end performance of the network, not to mention contractual issues if a problem arises somewhere down the line.

Clean sweep: Video images showing connector end-faces before (left) and after (right) cleaning.

Another characteristic of the modern telecom world is that it’s an industry in a state of flux. Many telecom companies have experienced changes in ownership, sometimes following bankruptcy or other financial difficulties, while massive levels of redundancies have led to numerous changes in personnel. As a result, it’s often the case that the documentation, records, and test results of the fiber infrastructure have not survived these changes. That’s assuming the records existed in the first place; many new networks were deployed at breakneck speed in the boom time, and the paperwork didn’t always catch up. In some extreme situations, even the fiber type is unknown.

For operators looking to deploy advanced networking technologies, it’s clear that what’s needed is a full fiber characterization that can determine all of the key performance attributes of their existing fiber infrastructure. They can then compare this set of parameters with transmission and standards requirements, both current and future, and assess whether or not the network will support next-generation systems satisfactorily. And there are many fiber characteristics that can affect optical network performance.

Chromatic dispersion. Next-generation optical networks are often highly sensitive to chromatic dispersion levels: too much dispersion means that compensation will be needed; conversely, if the dispersion is too low, then effects such as four-wave mixing may cause problems in DWDM systems.

The amount of dispersion tolerated by a system decreases with the square of the bit rate, so quadrupling the rate from 10 to 40 Gbits/sec cuts the tolerance by a factor of 16. At data rates of 10 and 40 Gbits/sec, it is therefore far more likely that some form of chromatic dispersion management will be needed. For 10-Gbit/sec systems, it is usually possible to calculate the chromatic dispersion with sufficient accuracy for compensation, assuming that the fiber type is known. But for 40-Gbit/sec transmission, the low level of residual dispersion that is required means it’s normally necessary to measure the chromatic dispersion of a link in order to apply the correct amount of compensation.
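The inverse-square scaling can be sketched in a few lines. The figures below are illustrative only (a reference tolerance of roughly 1000 ps/nm is commonly quoted for 10-Gbit/sec NRZ systems, but actual limits depend on the modulation format and receiver), and the function names are hypothetical:

```python
# Rough chromatic dispersion tolerance scaling (illustrative figures only).
# Tolerance scales with the inverse square of the bit rate.

def dispersion_tolerance(bit_rate_gbps, ref_rate_gbps=10.0,
                         ref_tolerance_ps_nm=1000.0):
    """Scale a reference dispersion tolerance by (ref_rate / bit_rate)**2."""
    return ref_tolerance_ps_nm * (ref_rate_gbps / bit_rate_gbps) ** 2

# Quadrupling the bit rate from 10 to 40 Gbit/s cuts the tolerance by 16x.
print(dispersion_tolerance(10))   # 1000.0 ps/nm
print(dispersion_tolerance(40))   # 62.5 ps/nm

# With standard G.652 fiber at roughly 17 ps/(nm*km) near 1550 nm, the
# uncompensated reach shrinks from tens of kilometres at 10 Gbit/s to only
# a few kilometres at 40 Gbit/s -- hence the need to measure and compensate.
D = 17.0  # ps/(nm*km), typical for G.652 at 1550 nm
print(dispersion_tolerance(10) / D)  # ~59 km
print(dispersion_tolerance(40) / D)  # ~3.7 km
```

The sharp drop in uncompensated reach is why measured, rather than calculated, dispersion values become essential at 40 Gbits/sec.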

For DWDM systems, it is desirable to avoid wavelength regions with very low dispersion in order to avert four-wave mixing. In the case of G.653 fiber, which has a zero dispersion wavelength in the 1550-nm region, chromatic dispersion measurement will allow the identification of safe operational wavelengths where there is sufficient dispersion to mitigate four-wave mixing.

Chromatic dispersion measurement is also the most effective way to identify an unknown fiber type. G.652, G.653, G.655, and G.656 fibers all have distinctive dispersion characteristics, as do the various brands of G.655 fiber from different manufacturers.

Polarization-mode dispersion. This is one of the most likely causes of transmission impairment within 10- and 40-Gbit/sec systems, particularly if the fiber is of an older vintage or was procured during the boom years. At that time, high-quality fiber from established manufacturers was in short supply. Many telcos were forced into buying fiber and cable sourced from outside Europe, where some fiber manufacturers did not have well-controlled processes. This resulted in fiber that had high PMD levels.

PMD is a particularly difficult problem to deal with from a systems perspective because each channel in a DWDM system may experience a different level of differential group delay (DGD), and the level of DGD on each channel may vary as a function of time. This means that, if PMD compensation is to be attempted, it must be carried out on a channel-by-channel basis and it needs to be dynamic, making PMD compensation both difficult and costly. In addition, PMD compensation can only deal with relatively small amounts of dispersion, so it will not help in situations where the fiber is of a very bad quality.

Another tricky characteristic is that the amount of PMD can vary significantly from fiber to fiber, even within the same cable. This calls for an extra dimension in a cable’s records to keep track of each fiber’s PMD performance and grade it according to the data rate that it can support. This will help operators to allocate their fiber routes efficiently by using poorer-performing fiber for lower-data-rate circuits, while reserving the best ones for future high-data-rate usage.
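A simple grading calculation shows why the extra record-keeping is worthwhile. The sketch below uses the common rule of thumb that mean link PMD should stay below roughly 10% of the bit period; the coefficients and function names are hypothetical examples, not measured values:

```python
import math

# Hypothetical PMD grading sketch. Total link PMD grows with the square
# root of length:  PMD_link = coefficient [ps/sqrt(km)] * sqrt(length_km)

def link_pmd(coeff_ps_sqrt_km, length_km):
    return coeff_ps_sqrt_km * math.sqrt(length_km)

def max_supported_rate_gbps(pmd_ps, fraction=0.1):
    # Bit period in ps is 1000 / rate_gbps; the rule of thumb requires
    # pmd <= fraction * bit_period, so rate <= 1000 * fraction / pmd.
    return 1000.0 * fraction / pmd_ps

# Example: a 100-km span of poor boom-era fiber at 0.5 ps/sqrt(km)
# versus modern fiber at 0.05 ps/sqrt(km).
old = link_pmd(0.5, 100)    # 5.0 ps  -> ~20 Gbit/s: fine for 10G, not 40G
new = link_pmd(0.05, 100)   # 0.5 ps  -> comfortably supports 40 Gbit/s
print(old, max_supported_rate_gbps(old))
print(new, max_supported_rate_gbps(new))
```

Grading each fiber this way lets the poorer strands be allocated to 2.5- or 10-Gbit/sec circuits while the best are reserved for 40 Gbits/sec.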

Optical losses. There are many features of an existing fiber infrastructure that may contribute to the increased loss of an optical link. Some of these will introduce losses at a particular wavelength or wavelength range, while others may affect all wavelengths equally. The most common problem encountered in optical systems is that of dirty connectors, which can cause several decibels of insertion loss.

Dirty connectors also cause troublesome reflections. If too much light is reflected back into the laser, then its performance may become unstable, thereby resulting in problems with the system. In addition, if there are a number of high-power reflections, then some of the backreflected light may be reflected forward again and appear as noise at the receiver.

Bends and kinks in the fiber can also introduce extra losses, particularly at the longer wavelengths often deployed on high-channel-count DWDM systems (extending into the L-band) or within CWDM systems that may operate up to 1611 nm. The problem of fiber bends affecting performance is not a new one. However, recent studies by BT have shown that tight fiber bends have the potential to cause catastrophic failure when the high power levels associated with Raman amplifiers are used (see FibreSystems Europe/Lightwave Europe, January/February 2006, page 9).

Older fiber types are also likely to suffer high loss at wavelengths around 1383 nm at the location of the water peak (a peak in spectral attenuation due to light absorption by hydroxyl ions from water impurities). This can limit the usable number of CWDM channels and may restrict the wavelength range available for future DWDM systems.
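The individual loss contributions above add up to the end-to-end figure that an insertion-loss test would verify. A minimal budget sketch, using illustrative (not measured) values for an 80-km link at 1550 nm:

```python
# Illustrative end-to-end loss budget for an 80-km link at 1550 nm.
# All figures are hypothetical but typical of G.652 fiber plant.
fiber_km = 80
atten_db_per_km = 0.22      # typical G.652 attenuation at 1550 nm
splices = 20                # e.g. one fusion splice roughly every 4 km
splice_loss_db = 0.05       # per fusion splice
connectors = 2
connector_loss_db = 0.5     # per clean connector; a dirty one can add dBs

total = (fiber_km * atten_db_per_km
         + splices * splice_loss_db
         + connectors * connector_loss_db)
print(f"Expected link loss: {total:.1f} dB")  # 19.6 dB
```

If the measured insertion loss exceeds such a budget significantly, OTDR testing and connector inspection can localize the excess, whether it is a dirty connector, a tight bend, or absorption at the water peak.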

Nonlinear effects. With next-generation optical networks using higher-powered lasers and amplifiers, as well as more DWDM channels with closer channel spacing, the impact of nonlinear effects may become more pronounced. Problems are particularly likely to occur with G.653 fiber and other fibers with smaller core areas, where the concentration of light energy is at its greatest.

So what exactly is meant by “fiber characterization” and how does it differ from fiber testing? An important distinction is the purpose of conducting the analysis. When the cabling was originally installed, it should have been tested to verify that it met its installation specifications and that the workmanship was of a good standard, such as meeting splice loss specifications. The main test tool used for such commissioning tests is the optical time-domain reflectometer (OTDR), typically backed up by insertion loss measurements.

Fiber characterization, however, is the process of conducting a comprehensive suite of tests that fully assess the capabilities of a fiber infrastructure. The purpose here is primarily focused on supporting transmission systems, though obvious defects in the infrastructure should be reported and rectified.

Often the systems supplier will also use these characterization data to help to configure its equipment for optimal performance and/or cost. For example, chromatic dispersion parameters allow the correct amount of dispersion compensation to be applied, while loss measurements may indicate that it is possible to use lower-power and lower-cost optical transceiver modules.

The first stage in a fiber-characterization project is to define its extent accurately. The scope of work should identify which applications, systems, and/or standards need to be supported by the fiber infrastructure. It may be that the project focuses on one particular application, such as determining the ability of a network to support 10-Gigabit Ethernet, or assessing it for next-generation network operation, including ITU OTN levels 1, 2, and 3, plus DWDM and CWDM.

Clearly the scope of work also needs to include details of the fibers to be characterized, how many fibers there are on each span, and how many spans, nominal distances, and fiber types, where they are known. From this information and with knowledge of the requirements of the applications, a fiber-characterization project plan can be devised that defines the range of tests and test methods to be applied. It’s also important to ensure that the test equipment used is capable of measuring the links by establishing that it offers sufficient dynamic range for the parameters under test, for example.

A comprehensive fiber characterization should include all of the following tests:

• Video inspection of connector end-faces: dirty connectors will adversely affect many of the other test results.
• Insertion-loss measurements, typically at 1310, 1550, and possibly 1625 nm: this is the most accurate way of determining the total link loss.
• OTDR testing, typically at 1310, 1550, and 1625 nm: this will identify any problem areas, including bends, bad patching, and high-reflectance connectors.
• Spectral attenuation measurements from 1200 to 1625 nm: these measure the extent of the water peak and verify CWDM operational losses.
• Chromatic dispersion measurements: a multiwavelength OTDR may be used for links with standard fiber over distances up to 100 km (in addition, phase-shift or differential phase-shift measurements are sometimes carried out).
• PMD measurements: there is an extensive choice of test methods, including interferometric techniques, the fixed-analyzer method, or the Jones matrix eigenanalysis method if it’s necessary to measure the DGD as a function of wavelength.
• Nonlinear effects: susceptibility to nonlinear effects can be investigated using an optical spectrum analyzer and suitable high-powered light sources.
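Once the measurements above are collected, they can be screened against the requirements of the target system. A minimal sketch of such a pass/fail comparison, with entirely illustrative thresholds and measured values (real limits come from the system supplier's specifications):

```python
# Hypothetical screening of characterization results against the needs of
# a 40-Gbit/s system. All thresholds and measurements are illustrative.
requirements_40g = {
    "max_loss_db": 22.0,            # receiver power budget
    "max_pmd_ps": 2.5,              # ~10% of the 25-ps bit period at 40G
    "max_residual_cd_ps_nm": 60.0,  # after dispersion compensation
}

measured = {
    "max_loss_db": 19.6,
    "max_pmd_ps": 3.1,              # fails: fiber needs re-grading
    "max_residual_cd_ps_nm": 45.0,
}

for key, limit in requirements_40g.items():
    status = "PASS" if measured[key] <= limit else "FAIL"
    print(f"{key}: measured {measured[key]} vs limit {limit} -> {status}")
```

Reporting each span this way makes it immediately clear which links are 40-Gbit/sec-ready and which need remediation or reallocation to lower-rate services.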

A full fiber characterization traditionally required a large amount of test equipment. Fortunately, several highly capable multipurpose test platforms are now available, such as the JDSU (Acterna) MTS-8000, which can carry out all of the tests using a pair of units equipped with appropriate modules. Test systems are also available from Agilent (the Modular Network Tester), Anritsu (the NetTest CMA5000), EXFO (the FTB), and PE.fiberoptics (the ONA600, with integrated dispersion measurement).

The fiber characterization process can produce a lot of data, so it is important that they are processed, analyzed, and summarized into an approachable and understandable format. Results can be reported using an interactive web browser that allows access to different levels of information according to the needs of the viewer. Another option is to integrate the results into a database and information-management system, such as the JDSU (Acterna) OFM500 system.

Of course, it is important that the characterization is carried out by competent engineers to ensure that the right test program is drawn up in the first place, that measurements are performed properly, and that the measurement data can be validated in the field and then analyzed to produce a comprehensive and authoritative report.

U.K. fiber-optics training company Optical Technology Training has been working with the major fiber-characterization equipment vendors to ensure that training and assessment are available to cover all of these tasks and to give end users confidence in the outcomes of the characterization. The result of this initiative is the Certified Fiber Characterization Engineer scheme, which was launched in Europe in autumn 2005 and is scheduled for launch in the U.S. during 2006.

Richard Ednay is technical director of Optical Technology Training (Skipton, UK; www.ott.co.uk). He has more than 20 years of experience in fiber optics and is the U.K.’s principal expert in the fiber testing of installed links. This article first appeared in the January/February 2006 issue of FibreSystems Europe in Association with Lightwave Europe, page 14.
