SPECIAL REPORTS: Optical Networking & WDM
Several factors combine to increase the difficulty of 40-Gbit/sec design; test equipment should not be one of them.
DR. GUY FOSTER, Agilent Technologies Lightwave Division
Many of the challenges that face designers at 10 Gbits/sec are magnified at 40 Gbits/sec, for many reasons. The obvious one is the very high serial speed of the digital devices, and that is pushing the limits of device and material technology. But there are also less obvious effects. Systems designed to work at these speeds are employing every technical trick that can be engineered to operate successfully. That means passive and active optical components have to meet tighter specifications for parameters such as loss and polarization-dependence.
Dispersion is also a much bigger problem here than at lower speeds, with much discussion surrounding management of chromatic and polarization-mode dispersion in the network. And of course, return-to-zero (RZ) format is being used for long-distance transmission at 40 Gbits/sec and is unfamiliar to a lot of engineers. The switches and routers that will power the core of the network are immensely powerful and must be loaded up with test traffic to ensure proper operation.
Each of the problems listed here also places higher expectations on the test tools. Usually, test equipment is required to have better accuracy, lower noise floor, wider range, and more flexibility than the devices being tested. At a time when engineers the world over are struggling to push device performance farther than ever before, test equipment needs to be even further ahead.
Design and simulation
Starting with design and simulation tools, components such as ICs can be honed for the best performance before being committed to prototyping. Usually, components are the enablers that must be in place before modules and line cards can be constructed into systems, so they come first in the development cycle.
So why is the situation different at 40 Gbits/sec compared to 2.5 and 10 Gbits/sec? A good analog or digital designer can design a 2.5-Gbit/sec amplifier, but it requires a top microwave designer using microwave techniques to design an amplifier for 40 Gbits/sec. The designer is usually dealing with different device types than at lower frequencies, such as high-electron-mobility transistors (HEMTs) and metal-semiconductor field-effect transistors (MESFETs), in new materials like indium phosphide (InP). Therefore, new device models are required.
Signals at 40 Gbits/sec have significant harmonics beyond 120 GHz, making it vital to simulate the high-frequency effects of interconnects and the distributed effects of any transmission-line structure whether it is on chip or in a module. Even
DC bias to circuits must be resonance-free, which can be a significant achievement. A less obvious aspect is that 40-Gbit/sec devices are typically smaller than those at lower frequencies, which has a number of implications. They have lower voltage-handling capabilities and are more challenging to design for the required output power levels. Thermal management becomes more of an issue than at lower speeds. Devices such as level-shifters become very difficult. There are also problems associated with small input-signal levels, such as ensuring that transmission lines are as loss-free and perfect as possible, with the absolute minimum levels of mismatch.
Simulation tools are developing to meet the challenge, and their benefits are obvious. Issues surrounding timing, jitter, and phase noise in components like clock recovery circuits and multiplexers can be simulated and "what if" scenarios examined, without the need to build prototypes (see Figure 1). Once prototypes are built and performance measured, results can be fed back into the models to refine them further. Simulation tools are used regularly at all speeds and have been applied to the design of laser drivers, multiplexers, transimpedance amplifiers, modulator drivers, and many other devices.
RF and microwave test
Engineers with a microwave background are used to the world of S-parameters, and a network analyzer is a tool of choice in the electrical domain. The major difference for them is that digital transmission requires components to work from kilohertz to gigahertz, rather than a narrower window around a gigahertz carrier. Many others working in this arena gave up microwave courses at college at the earliest opportunity in favor of digital design.
For high-speed data transmission, there is no clear boundary between the microwave and digital worlds. Design at 40 Gbits/sec forces consideration of the smallest parasitics, interconnect, substrate, and packaging. Network analyzers are excellent tools for characterizing these effects. They can provide a window on performance up to 110 GHz.
At very high frequencies, transmission theory is required. The S-parameter model of energy flow through and off of devices and components can be the most accurate method for describing performance.
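As a toy illustration of the S-parameter figures designers quote from a network analyzer, the sketch below converts linear S21 and S11 magnitudes into insertion loss and return loss. The numerical values are invented for illustration, not taken from any real measurement.

```python
import math

def db(mag):
    """Convert a linear S-parameter magnitude to decibels."""
    return 20 * math.log10(mag)

# Hypothetical magnitudes for an interconnect at a single frequency point
s21 = 0.89  # transmission coefficient
s11 = 0.05  # reflection coefficient

print(f"insertion loss: {-db(s21):.2f} dB")
print(f"return loss:    {-db(s11):.2f} dB")
```

A lossless, perfectly matched path would show 0 dB insertion loss and an arbitrarily large return loss; at 40 Gbits/sec these figures must hold across a very wide band, not just at one point.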
BER testing
Digital systems have many figures of merit that are used to indicate performance, such as signal-to-noise ratio and extinction ratio. The ultimate measure is bit-error rate (BER): whether the system passes bits correctly. BER testers (BERTs) are often used as data-signal sources for eye-diagram testing of components as well as complete test solutions for measuring BER.
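At its core, BER is simply the fraction of compared bits that differ between what was sent and what was received, as this minimal sketch shows:

```python
def bit_error_rate(sent, received):
    """Fraction of bits that differ between two equal-length bit sequences."""
    if len(sent) != len(received):
        raise ValueError("sequences must be the same length")
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

# Example: 2 flipped bits out of 8 compared
sent     = [1, 0, 1, 1, 0, 0, 1, 0]
received = [1, 0, 0, 1, 0, 1, 1, 0]
print(bit_error_rate(sent, received))  # 0.25
```

A real BERT does this comparison in hardware against a known pseudorandom pattern; verifying a BER of 1e-12 at 40 Gbits/sec means comparing on the order of 10^12 bits, which takes only about 25 seconds at full rate.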
Transmission at 40 Gbits/sec pushes the limits of materials such as silicon germanium, and people are using more exotic materials such as InP. Using such materials, they are designing multiplexer chains that involve more stages, at higher speeds, than have been attempted before. These chains have complex timing challenges that must be understood for the component to function properly. Engineers are able to exercise their prototypes on high- and low-speed sides using parallel BERT systems. These systems are able to walk individual multiplexer channels through time and voltage to see the effect on BER on the 40-Gbit/sec side to identify the weakest channel as a candidate for redesign and rerouting.
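The channel walk described above can be sketched as a nested sweep: for each tributary, the BERT steps sampling delay and decision threshold, records where BER stays acceptable, and the channel with the smallest passing region is the redesign candidate. The function and parameter names below are hypothetical stand-ins, not a real instrument API.

```python
def margin_scan(channels, delays_ps, thresholds_mv, measure_ber, limit=1e-12):
    """Map the passing (delay, threshold) region per channel and
    return the weakest channel plus all region sizes.
    measure_ber(channel, delay, threshold) stands in for the BERT call."""
    margins = {}
    for ch in channels:
        passing = sum(
            1
            for d in delays_ps
            for v in thresholds_mv
            if measure_ber(ch, d, v) <= limit
        )
        margins[ch] = passing  # size of the error-free region
    weakest = min(margins, key=margins.get)
    return weakest, margins
```

In use, a channel whose eye is marginal passes at fewer (delay, threshold) points than its neighbors, so it surfaces immediately as the `min` of the region sizes.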
Serial optoelectronic components such as modulators and receivers require an optical interface as well as an electrical one to effectively test their performance. That is usually achieved by using high-quality, instrument-grade optical interfaces that have known and calibrated characteristics.
Frequency domain test
Just as network analyzers are commonly used to assess the performance of electrical components, an equivalent analyzer is used to assess the same characteristics of active optical components. Lightwave component analyzers are network-analyzer-based and use a precisely known and calibrated test set to enable measurement of the optical and electrical sides of suitable components (see Figure 2). These instruments give a measure of the small-signal modulation performance, indicating the frequency range for which lasers, modulators, and photoreceivers efficiently convert electrical signals to modulated light and modulated light back to an electrical signal. They are ideal for examining the performance of the new device types used at 40 Gbits/sec, such as pseudomorphic HEMTs and MESFETs.
While at lower bit rates a majority of design work is still carried out by pushing the limits of circuit-board design using specialist materials, at 40 Gbits/sec the environment tends to be based on alumina substrates and "gold brick" packaging. Network analyzers easily show the effects of parasitics and other subtle effects on electro-optic-component performance as well as allow real-time tuning of these "handcrafted" early-market devices.
Lasers are the optical analogs of microwave oscillators and have a unique set of defining characteristics. RIN (relative intensity noise) is effectively the laser's carrier-to-noise and is a measure of the source's dynamic range. For long-distance transmission, better carrier-to-noise values translate into cleaner transmissions, fewer errors, and longer spans, making it highly desirable to use very low RIN sources in 40-Gbit/sec applications where link margin is scarce. RIN and other critical laser parameters such as linewidth are often measured using highly accurate microwave spectrum analyzers with well-known very-low-noise optical-receiver front-ends.
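RIN relates the laser's intensity-noise power spectral density to the square of its average optical power, quoted in dB/Hz. The sketch below computes it from those two quantities; the numbers are purely illustrative, not measured values.

```python
import math

def rin_db_per_hz(noise_psd_w2_per_hz, avg_power_w):
    """RIN = (intensity-noise PSD) / (average optical power)^2, in dB/Hz."""
    return 10 * math.log10(noise_psd_w2_per_hz / avg_power_w**2)

# Illustrative example: 1 mW average power, 1e-19 W^2/Hz noise PSD
print(f"RIN = {rin_db_per_hz(1e-19, 1e-3):.1f} dB/Hz")  # RIN = -130.0 dB/Hz
```

A more negative RIN means a quieter source; transmission lasers for demanding links are typically specified well below -140 dB/Hz, which is why a very low-noise optical receiver front-end is needed to measure them at all.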
Time domain testing
Whatever the background of engineers working at 40 Gbits/sec, they are probably very comfortable with using an oscilloscope. Specialized oscilloscopes (called digital communication analyzers-DCAs) have evolved for use in the high-speed digital and fiber-optic industry and are optimized for wide bandwidth and calibrated optical eye-diagram measurements (see Figure 3).
For short-distance transmission at 40 Gbits/sec, nonreturn-to-zero modulation is being adopted, and that fits well with practices at lower transmission speeds. It yields eye-diagrams that are familiar to a lot of people and have industry-accepted figures of merit by which to judge performance.
For longer-distance transmission at 40 Gbits/sec, the industry is converging on flavors of RZ modulation, and that is less familiar to most engineers in this industry. RZ gives a different-looking eye-diagram, and new measures to distinguish good performance from bad are starting to gain acceptance. DCAs are increasing in bandwidth and intrinsic jitter performance to present the clearest picture of genuine device performance. They are also available with time domain reflectometry capabilities that greatly assist in the design of the high-speed backplanes needed in 40-Gbit/sec systems.
Optical-component testing
Transmission at 40 Gbits/sec places a number of constraints on the system that are different from transmission at lower speeds. Dispersion is an obvious factor, as are the tighter tolerances placed on every optical component in a 40-Gbit/sec network for parameters such as loss and polarization-dependence. A common plan for upgrading a network to 40 Gbits/sec will require extra gain between terminals, and designers are turning to Raman amplification as a way of keeping signal strengths above levels where noise becomes a problem.
Two forms of dispersion are problematic at 40 Gbits/sec. Chromatic dispersion (CD) is already a problem at 10 Gbits/sec but is managed with schemes such as dispersion-compensating fiber that correct CD across a broad band of wavelengths. At 40 Gbits/sec, the distorting effect is more significant but is predictable and can be corrected. Polarization-mode dispersion (PMD) has received more attention, because its effect at any given moment or location is not predictable. This form of dispersion arises through different polarization components of a signal having different propagation velocities. Unfortunately, the degree of PMD can change with the bending of the fiber, temperature, and connector vibration. It is this variability that has led to the development of active PMD compensation devices to correct such degradations as they happen.
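The reason chromatic dispersion bites so much harder at 40 Gbits/sec is that uncompensated reach falls with the square of the bit rate. A common engineering rule of thumb for NRZ holds B² · D · L below roughly 104,000 (Gbit/sec)² · ps/nm; the constant depends on modulation format and allowed penalty, so treat the sketch below as illustrative only.

```python
def cd_limited_reach_km(bit_rate_gbps, dispersion_ps_nm_km, limit=104_000):
    """Rough chromatic-dispersion-limited reach for NRZ transmission,
    using the rule of thumb B^2 * D * L <= limit. Illustrative only:
    the constant varies with format and penalty budget."""
    return limit / (bit_rate_gbps**2 * dispersion_ps_nm_km)

# Standard singlemode fiber, D ~ 17 ps/(nm*km) at 1550 nm
for rate in (2.5, 10, 40):
    print(f"{rate:>4} Gbit/s: ~{cd_limited_reach_km(rate, 17):.0f} km")
```

Quadrupling the rate from 10 to 40 Gbits/sec cuts the uncompensated reach sixteenfold, from tens of kilometers to a few, which is why dispersion compensation is mandatory rather than optional at 40 Gbits/sec.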
Raman amplifiers are being incorporated into system gain modules, alongside erbium-doped fiber amplifiers. They use the fiber that is already part of the link as the gain medium, meaning these amplifiers are difficult to test in isolation, because their correct operation needs to be established in the exact environment where they are to be used. Additionally, getting gain out of a material as linear as the glass used for fiber is tough. That is achieved by using very high pump powers and wavelengths in the 1400-nm band. Both factors require new capabilities of the equipment used to test the components that go into these amplifiers and verify correct operation of the finished unit. Equipment is now available that ranges from component test systems to sophisticated amplifier test systems (see Figure 4).
Line cards and systems
As development activities move farther up the food chain, the requirements placed on testing are different. Typically, the integration of optical-transceiver components and electrical communications semiconductors, such as framers, into line cards and systems adds to the complexity of the assembly, and testing moves from parametric performance to higher-level testing. That usually means addressing the device under test in the right language (protocol) and checking that frames are passed correctly, alarms operate as required, and the device displays the behavior needed to form a valid part of the network.
In the early stages of development of new cutting-edge technology, the components required to design a protocol analyzer are usually not available. During this pioneering phase, people often turn to flexible BERTs to fill the gap. These instruments can send static RAM-based patterns that are often enough to fool a network component, permitting simple testing. That is one of the methods being used in 40-Gbit/sec development at the moment.
Another aspect of system test is to overwhelm switches and routers with traffic to check that they continue to operate correctly, even under the maximum level of stress they're liable to encounter. Before full-rate test solutions are available in the marketplace, this testing tends to be done by loading up multiple lower-speed tributaries.
Dr. Guy Foster works in marketing at Agilent Technologies Lightwave Division (Santa Rosa, CA) for communications test products.