Answers at hand for 100-Gbps test challenges

by Meghan Fuller Hanna

Overview
It's hard to test phase-modulated signal formats with instruments designed for amplitude modulation. Help for technology developers may finally be on the way, however.

Sales of the current generation of optical transport systems may have slowed in the face of the economic downturn, but interest in the next generation continues unabated. Research into cost reduction for 40-Gbps systems and initial development of 100-Gbps technology therefore remains unchecked as well.

Although test vendors have traditionally delivered capabilities ahead of their customers' demands, technology researchers at component, subsystem, and system houses in this instance find themselves ahead of their test set suppliers. That has left instrument manufacturers scrambling to catch up.

The shortcomings have less effect on the client side. In DWDM applications, particularly with SONET/SDH or native Ethernet transmission, client-side interfaces will likely center around 10 Gbps and the necessary test equipment is readily available today.

But for those working on line-side applications, the challenges are numerous at 40 and especially at 100 Gbps. While many network operators say they need to migrate to higher-speed transmission, they would also like to use the existing fiber, frequency range, and, most important, channel spacings of 50 and 100 GHz. Essentially, they would like to maintain their existing infrastructure and switch out just the terminal equipment.

Unfortunately, it's not easy to retrofit a 40- or 100-Gbps signal onto the 50/100-GHz ITU grid. Long-haul transport of 100-Gbps traffic in particular will require multilevel modulation to be compatible with existing long-haul DWDM networks.

Thus, the industry is leaning toward the use of modulation schemes in which the signal information is encoded in the phase rather than the amplitude (although modulation formats that use both amplitude and phase are also subjects of study). One advantage of phase modulation is that it lowers the symbol (baud) rate below the data rate. In polarization-multiplexed quadrature format, for example, 100-Gbps traffic can be transmitted at a symbol rate of about 25 Gbaud (four 25-Gbps tributaries), while a 40-Gbps signal can be carried at about 10 Gbaud (four 10-Gbps tributaries).
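As a back-of-the-envelope check of those figures, the symbol rate is simply the bit rate divided by the bits carried per symbol period. A minimal Python sketch, assuming polarization-multiplexed QPSK (2 bits per symbol on each of 2 polarizations); the function and parameter names are illustrative, not from any standard:

```python
def symbol_rate_gbaud(bit_rate_gbps, bits_per_symbol, polarizations=1):
    """Symbol rate = bit rate / (bits per symbol x polarization tributaries)."""
    return bit_rate_gbps / (bits_per_symbol * polarizations)

# DP-QPSK carries 4 bits per symbol period (2 bits/symbol x 2 polarizations)
print(symbol_rate_gbaud(100, 2, 2))  # -> 25.0 Gbaud for 100-Gbps traffic
print(symbol_rate_gbaud(40, 2, 2))   # -> 10.0 Gbaud for 40-Gbps traffic
```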

The key challenge facing high-speed component manufacturers is that the transmission schemes they're researching are based on phase modulation; readily available test gear will only test amplitude-modulated signals, particularly NRZ on/off keying.

Until recently, equipment vendors worked around this problem by cobbling together their own test beds, which often involved conjuring a way to translate what they were working on into something the test gear could measure. Reports Niall Robinson, vice president of product marketing at Mintera (www.mintera.com), “Essentially, what we did with the phase-shift keying, we have used...a very well understood phase-to-amplitude converter, which we built ourselves. So that means we can take the phase-modulated transmitter, we go through the amplitude converter, we understand the transfer function of this converter, and then we can use traditional amplitude test sets.

“For example,” he continues, “you might want to measure the extinction ratio of the transmitter. To measure the extinction ratio, you need to measure the energy in the zeroes and the energy in the ones. The only way to do this is to convert the signal from its phase modulation to amplitude modulation.”
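The measurement Robinson describes reduces to a ratio of the average optical power in the ones to that in the zeroes, customarily expressed in decibels. A minimal sketch with made-up power values, not Mintera's procedure:

```python
import math

def extinction_ratio_db(p_one_mw, p_zero_mw):
    """Extinction ratio in dB: 10*log10(power in ones / power in zeroes)."""
    return 10 * math.log10(p_one_mw / p_zero_mw)

# Illustrative values: 1.0 mW in the ones, 0.1 mW in the zeroes
print(round(extinction_ratio_db(1.0, 0.1), 1))  # -> 10.0 dB
```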

Test challenges

There would appear to be a market for new instruments that relieve equipment designers of such burdens. However, test vendors say the biggest challenge to creating such tools is not necessarily producing phase-friendly platforms but figuring out which modulation schemes to target, particularly at 100 Gbps.

At 40G, the industry has moved on from amplitude-based schemes such as optical duobinary to consensus around variants of differential phase-shift keying (DPSK) or, in certain applications, differential quadrature phase-shift keying (DQPSK). At 100G, the Optical Internetworking Forum (OIF; www.oiforum.com) is focusing on QPSK with coherent detection, but there are several others under development, including the mix of amplitude and phase-shift keying known as APSK.

Agilent Technologies' N4391A Optical Modulation Analyzer offers a time-domain-based approach to the measurement of phase-modulated signals.

Several companies are looking at encoding information in different polarization states. The OIF's variant of QPSK for 100G uses polarization multiplexing, for example, which adds another layer of complexity.

“When you look at dual-pol[arization] QPSK,” says Patrick Drolet, product manager at EXFO (www.exfo.com), “I wouldn't say all hell breaks loose, but there's dual-pol QPSK. There's dual-pol DQPSK. There are people who want to do QPSK on two very closely adjacent channels. Everyone wants to have their own flavor of 100 gig,” he asserts.

Without a well-defined or even de facto standard, test equipment manufacturers have three choices. They can choose a major customer and design test gear specifically for that customer. They can design a large number of products to satisfy as many needs as possible. Or they can design a “super box” that tests a large number of applications and parameters. For obvious reasons, none of these options is very attractive, and none provides a sufficient return on R&D investment.

Each modulation format will present its own unique set of test challenges, but there are three other factors inherent in high-speed networking that also must be accommodated. First is increased complexity. Consider, for example, a 4x25-Gbps implementation. Today, each of the four sets of transmitters and receivers must be tested individually.

Second, currently available electrical test gear is frequently as inadequate as its optical counterparts. The typical bit-error-rate tester (BERT) tops out at 12.5 Gbps, though some are available up to 25 Gbps. For a 4x25-Gbps QPSK implementation, for example, equipment developers could probably make do with currently available high-end BERTs. But if they are adding forward error correction (FEC), which boosts the data rate to 28 Gbps, then they exceed the capabilities of current equipment. For an implementation of, say, 2x56 Gbps, the situation only becomes worse.
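The arithmetic behind that capability gap is straightforward. A small sketch, assuming the roughly 12% FEC overhead implied by the 25-to-28-Gbps figure above (actual overhead varies by FEC scheme):

```python
def line_rate_gbps(payload_gbps, fec_overhead_pct):
    """Per-lane line rate after adding forward-error-correction overhead."""
    return payload_gbps * (1 + fec_overhead_pct / 100)

# ~12% overhead pushes a 25-Gbps lane to ~28 Gbps -- beyond a 25-Gbps BERT
print(round(line_rate_gbps(25, 12), 1))  # -> 28.0
```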

Third, optical impairments and other parameters currently must be tested on an individual basis. To gauge the quality of the optical link, equipment developers today must repeat tests for each impairment -- optical loss, polarization-dependent loss, chromatic dispersion, polarization-mode dispersion, and jitter. Running a jitter test alone is a time-consuming process; testing each of these parameters individually consumes more time than a developer can afford to spend.

The missing element

Thus, whatever the modulation scheme or schemes ultimately adopted by the industry, test vendors will have to develop new gear to meet all of the challenges associated with high-speed networking technology.

In some ways, the lack of consensus on the technology developers' side has created some agreement among test vendors. Summarizes Brandon Collings, director of JDSU's (www.jdsu.com) optical network research lab, “It's not expected that only the phase is going to be modulated in a one or zero pattern [in] two potential states, but four potential states or eight or larger than that, perhaps. So you need to measure the phase, but you need to measure it in a very analog way that shows what that phase is and which of the possible 4, 8, or 16 states the signal is supposed to be representing,” he explains. “And the way that is done conveniently is called a constellation analyzer, which is basically a way to measure and project the phase of the signal.”
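The state decision Collings describes, mapping a measured sample to the nearest of several possible phase states, can be sketched for the four-state (QPSK) case. This is a toy illustration of the principle, not any vendor's algorithm:

```python
import cmath
import math

# Ideal QPSK constellation points at 45, 135, 225, and 315 degrees
QPSK = [cmath.exp(1j * math.pi * (0.25 + 0.5 * k)) for k in range(4)]

def nearest_state(sample):
    """Return the index of the QPSK state closest to a measured field sample."""
    return min(range(4), key=lambda k: abs(sample - QPSK[k]))

# A sample of state 2 perturbed by (made-up) noise still decodes to state 2
noisy = QPSK[2] + (0.1 - 0.05j)
print(nearest_state(noisy))  # -> 2
```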

Ubiquitous in the RF domain, the constellation analyzer was developed in the 1970s to provide diagnostic information for more complex quadrature-amplitude-modulated (QAM) signals. Test vendors like JDSU and Agilent Technologies (www.agilent.com) believe that, with some tweaking, the same basic concept could be applied to phase-modulated signals in the optical domain.

Instead of analyzing one eye diagram or testing one bit-error rate at a time, an optical constellation analyzer would conduct eye-diagram measurements and bit-error-rate tests in parallel for each phase. Such instruments would speed testing and enable component manufacturers to test directly in the optical domain.

Andreas Gerster, marketing manager in Agilent's Digital Photonic Test Division, says the optical constellation analyzer is “one of the missing elements in the whole test and measurement portfolio that is available today.” Agilent claims to be the first to market with this missing element, unveiling the N4391A Optical Modulation Analyzer at last month's OFC/NFOEC Conference.

The N4391A does more than provide a constellation analysis function, however. It offers analysis of both amplitude- and phase-modulated signals via 20 customizable data-analysis tools, including a new error vector magnitude function that displays error in comparison to an “ideal” signal. The time-domain-based analyzer leverages Agilent's RF microwave analysis work to provide a new capability to optical design engineers, including a mask that represents the different polarization states. To help engineers interpret the perhaps unfamiliar measurements the new mask illustrates, the instrument also provides a scalar description of the constellation diagram's parameters.
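Error vector magnitude compares each measured sample against its ideal constellation point: the RMS error distance, normalized to the RMS magnitude of the ideal signal. A minimal sketch of the underlying computation using made-up sample values, not Agilent's implementation:

```python
import math

def evm_percent(measured, ideal):
    """RMS error vector magnitude as a percentage of the ideal RMS magnitude."""
    err = sum(abs(m - i) ** 2 for m, i in zip(measured, ideal))
    ref = sum(abs(i) ** 2 for i in ideal)
    return 100 * math.sqrt(err / ref)

# Ideal 4-point constellation and lightly perturbed measurements (illustrative)
ideal = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
measured = [1.1 + 0.9j, -0.9 + 1.1j, -1 - 1j, 1 - 1j]
print(round(evm_percent(measured, ideal), 2))  # -> 7.07
```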

Like any new product, the adoption of constellation analyzers will depend almost as much on cost as function. “My only worry is that you could buy a decent-sized house in Texas for the price of a 40-gig test set a few years ago,” says Mintera's Robinson. “I don't know what these new test sets are going to be priced at in the market yet.”

When Agilent's N4391A begins shipping this September, it will carry a starting price of $195,000. However, that price doesn't include the complementary oscilloscope and internal local oscillator the analyzer requires.

That price tag, and those of similar instruments, should shrink as test technology advances and more instruments appear. For example, Anritsu has introduced the MP1800 Signal Quality Analyzer Series for modulator and other optical component applications, while Ixia (www.ixiacom.com) offers the 100-GbE Development Accelerator System to generate and analyze Layer 2 traffic. Regardless of the cost, however, it appears that a lack of test equipment will no longer slow high-speed component, subsystem, and system development.

Meghan Fuller Hanna is senior editor at Lightwave.
