Optical constellation analyzers tackle complex measurements

Dec. 1, 2009
The optical constellation analyzer measures complex signals that are encoded not only in amplitude but also in phase, and it can depict these signals in both the horizontal and vertical polarizations of the fiber.

By Meghan Fuller Hanna

In test applications, typical binary modulation schemes such as on/off keying (OOK), in which only one bit is transmitted per symbol, are amplitude-based and can be directly detected using a high-speed photodiode with a sampling oscilloscope. The measurement results from this digital communications analyzer (DCA) are well understood: high power is a one; no power is a zero.
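
For readers who want to see that decision in code, here is a minimal sketch of the threshold decision a direct-detection receiver makes on an OOK signal; the sample values and the 0.5 threshold are illustrative, not taken from any particular instrument.

```python
import numpy as np

# Hypothetical photodiode output, one sample per bit, normalized so that
# full power is roughly 1.0 and no power is roughly 0.0 (with a little noise).
samples = np.array([0.95, 0.04, 0.88, 0.91, 0.07, 0.02, 0.97, 0.10])

# Direct detection of OOK: a single amplitude threshold decides each bit.
threshold = 0.5
bits = (samples > threshold).astype(int)

print(bits)  # -> [1 0 1 1 0 0 1 0]
```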

Thanks to ever-escalating bandwidth requirements, traditional binary modulation schemes are no longer sufficient. Carriers need to migrate to higher-speed transmission and carry that traffic over their existing 50-GHz DWDM systems and outside plant. Such a feat will require multilevel modulation; instead of transmitting two levels of information (power or no power, one or zero), advanced modulation schemes make it possible to send multiple levels of information, perhaps multiple levels of amplitude and/or multiple phases of the signal.

“The price we have to pay for this is that the transmitter and receiver equipment becomes much, much more complicated, much more expensive, and much harder to test,” admits Andreas Gerster, marketing manager in Agilent’s Digital Photonic Test Division (www.agilent.com). “But what you can achieve is a much higher spectral efficiency because you can, for example, transmit 100 Gigabits over a 50-GHz channel if you are just able to transmit four bits per symbol.”

Unfortunately, the conventional method of direct detection is insufficient for the measurement of phase-modulated signals. Consider a photodiode used in the traditional way, as a receiver set up to detect OOK or amplitude modulation. When presented with a phase-modulated signal in which the optical carrier is modulated in phase but not in amplitude, that photodiode will respond with all ones; it sees a constant-amplitude signal.
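
A short numpy sketch makes the point: the photodiode responds to the intensity |E|², so a purely phase-modulated field of constant amplitude produces a constant output. The QPSK-like phase values below are illustrative.

```python
import numpy as np

# A purely phase-modulated field: constant amplitude, four different phases.
phases = np.deg2rad([0, 90, 180, 270])
field = 1.0 * np.exp(1j * phases)

# Direct detection sees only the intensity |E|^2, which discards the phase.
intensity = np.abs(field) ** 2
print(intensity)  # -> [1. 1. 1. 1.] -- "all ones" as far as the photodiode is concerned
```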

“Conventional eye-diagram analyzers are useless in their current configuration,” reports Daniel van der Weide, vice president of engineering at Optametra (www.optametra.com). “Not that the eye diagram itself is a useless concept anymore; it certainly is useful, but you can’t plot the conventional quantities on it.”

What is required instead is a piece of test equipment known as an optical constellation analyzer that derives phase information by mixing the incoming field or signal under test with a local laser that operates at a fixed wavelength. The constellation analyzer allows the user to look at the optical signal in the complex plane, meaning both the real and imaginary parts of the signal. This combination of amplitude and phase is also referred to as a vector, a two-number set that describes the magnitude (amplitude) and angle (phase) relative to the reference signal produced by the local laser.
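
The sketch below is a heavily idealized picture of that mixing step. It assumes a perfect local oscillator and ignores the 90-degree optical hybrid, balanced photodiodes, and digital signal processing a real analyzer relies on; it shows only how beating the signal against a reference recovers magnitude and phase.

```python
import numpy as np

# Signal under test: a field with some amplitude and some phase modulation.
signal = 0.8 * np.exp(1j * np.deg2rad(135))

# Local oscillator: the fixed-wavelength reference laser (unit amplitude, zero phase here).
lo = 1.0 * np.exp(1j * 0.0)

# Idealized coherent mixing: beating the signal against the LO yields the complex
# envelope, i.e., the in-phase (real) and quadrature (imaginary) components.
envelope = signal * np.conj(lo)

amplitude = np.abs(envelope)                 # magnitude of the vector
phase_deg = np.degrees(np.angle(envelope))   # angle relative to the LO reference
print(amplitude, phase_deg)                  # -> approximately 0.8 and 135.0
```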

“By capturing the mixed signal of the local laser and the signal under test, you can retrieve the full information about the field of light, namely amplitude and phase,” explains Peter Andrekson, director of EXFO Sweden (formerly chief technology officer of PicoSolve, which was acquired by EXFO in February—www.exfo.com). “Having recovered this signal, you can display it in many different ways. One very popular way in the research community, at least, is to display the complex coordinates, X and Y, both the real and imaginary part, in a so-called constellation diagram...[which looks] like dots around a circle. Each of these dots represents a piece of information in the complex plane. So we are going away from analyzing data on just the real axis, which contains the intensity, and analyzing information in both quadratures, both the imaginary and the real part, so that you get the full field of information.”
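
For readers curious what such a display amounts to, here is a small, hypothetical sketch that scatters noisy QPSK field samples in the complex plane; the noise level and sample counts are made up purely for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical recovered field samples: noisy dots clustered around four QPSK phases.
rng = np.random.default_rng(0)
ideal = np.exp(1j * np.deg2rad([45, 135, 225, 315]))
noise = 0.05 * (rng.standard_normal(800) + 1j * rng.standard_normal(800))
samples = np.repeat(ideal, 200) + noise

# Constellation diagram: real part (I) horizontally, imaginary part (Q) vertically.
plt.scatter(samples.real, samples.imag, s=2)
plt.xlabel("In-phase (real)")
plt.ylabel("Quadrature (imaginary)")
plt.gca().set_aspect("equal")
plt.title("Dots around a circle: a QPSK constellation")
plt.show()
```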

Included in this field of information is the measurement of the electric field. Unlike traditional DCAs, which measure intensity or power, the constellation analyzer measures the electric field of the incoming optical signal. According to van der Weide, the electric field is a more fundamental measurement of what is taking place in the fiber because power can be derived from the electric field but the electric field cannot be derived from the power.

“Once you have that fundamental information,” says van der Weide, “you can calculate the power. You can calculate the eye diagram that you would have gotten with older, more primitive instrumentation. You can calculate all kinds of derivative quantities once you have that fundamental information.

“The whole idea behind this is not only do you get the opportunity to modulate and detect the electric field as opposed to the power, but now you’re sending a vector, not just an on/off,” adds van der Weide. The vector, then, is not just a bit; in QPSK, for example, it is a two-bit symbol, so the symbol rate is half the bit rate.
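
A minimal sketch of that bookkeeping, using one possible (Gray-coded) assignment of bit pairs to QPSK vectors; the mapping itself is an illustrative assumption, not something specified in the article.

```python
import numpy as np

# One possible Gray-coded mapping of bit pairs to QPSK vectors: each symbol
# carries two bits, so the symbol rate is half the bit rate.
symbol_map = {
    (0, 0): np.exp(1j * np.deg2rad(45)),
    (0, 1): np.exp(1j * np.deg2rad(135)),
    (1, 1): np.exp(1j * np.deg2rad(225)),
    (1, 0): np.exp(1j * np.deg2rad(315)),
}

bits = [0, 0, 0, 1, 1, 1, 1, 0]                                       # 8 bits in...
symbols = [symbol_map[tuple(bits[i:i + 2])] for i in range(0, len(bits), 2)]
print(len(bits), len(symbols))                                         # -> 8 4 (...4 symbols out)
```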

Optical constellation analyzers like the Coherent Lightwave Signal Analyzer from Optametra enable the user to see the amplitude- and phase-modulated data in two polarizations and display this information in four eye diagrams.

“Once we decide what the vector is, then we can decide, ‘Does that vector represent a one-zero or zero-one, etc.,’” he continues. “At that point, then we can start to get a bit-error rate. Once we make a decision, we can compare it to the information that was supposed to be sent. And that, of course, is the crux of communications systems. What is the bit error rate?”
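
Here, as a rough illustration only, is how that decide-and-compare step might look for QPSK, with made-up received vectors and the same hypothetical Gray mapping as above; a real analyzer's decision logic is considerably more involved.

```python
import numpy as np

# Bits that were "supposed to be sent," grouped two per symbol.
tx_bits = np.array([0, 0, 0, 1, 1, 1, 1, 0])

# Received vectors in the complex plane (slightly noisy versions of the ideal points).
rx = np.array([0.7 + 0.6j, -0.8 + 0.7j, -0.6 - 0.8j, 0.9 - 0.7j])

# Decision: pick the nearest ideal constellation point for each received vector.
ideal = np.exp(1j * np.deg2rad([45, 135, 225, 315]))
bit_pairs = {0: (0, 0), 1: (0, 1), 2: (1, 1), 3: (1, 0)}      # Gray-coded, as above
decisions = np.argmin(np.abs(rx[:, None] - ideal[None, :]), axis=1)
rx_bits = np.array([b for d in decisions for b in bit_pairs[d]])

# Bit-error rate: compare the decided bits with what was supposed to be sent.
ber = np.mean(rx_bits != tx_bits)
print(rx_bits, ber)  # -> [0 0 0 1 1 1 1 0] 0.0
```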

Finally, the constellation analyzer depicts the amplitude- and phase-modulated data not only in one polarization of the fiber but also in the orthogonal polarization. Some advanced modulation schemes also add polarization multiplexing to the mix, such as the OIF-backed dual-polarization quadrature phase-shift keying (DP-QPSK). Taking advantage of the horizontal and vertical polarizations of the fiber doubles the number of bits carried per symbol again, enabling the transmission of four bits per symbol.

The constellation analyzer presents the results in two different constellations, one referring to the horizontally polarized light and the other referring to the vertically polarized light. Each of those constellations yields a pair of eye diagrams, one for the in-phase component and one for the quadrature component, for a total of four eye diagrams. In the case of the OIF-proposed DP-QPSK, where four 28-Gbps signals are carried over a single 50-GHz DWDM channel for a total of 112 Gbps, the eye diagrams of all four 28-Gbps tributaries would appear on the screen simultaneously.
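
A back-of-the-envelope check of those numbers follows; the 28-Gbaud symbol rate is an inference from the bits-per-symbol count, not a figure quoted in the article.

```python
# DP-QPSK bookkeeping for the OIF example: 2 bits (QPSK) x 2 polarizations = 4 bits/symbol.
bits_per_symbol = 4
tributaries = 4
tributary_rate_gbps = 28

total_bit_rate_gbps = tributaries * tributary_rate_gbps        # -> 112
symbol_rate_gbaud = total_bit_rate_gbps / bits_per_symbol      # -> 28.0
print(total_bit_rate_gbps, symbol_rate_gbaud)
```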

The need for flexibility

While the OIF’s DP-QPSK proposal has emerged as the frontrunner, it is by no means the only viable option for transmitting high-speed signals—and it may not be sufficient to meet the industry’s needs even in the relatively short term.

“Many industries and academic institutions today are very much looking into next-generation Ethernet,” Andrekson reports. “They are looking at 400-Gigabit Ethernet and even Terabit Ethernet. DP-QPSK will not allow that, so people are already thinking ahead.”

Other complex modulation schemes under discussion include QAM-16 or -64 and optical orthogonal frequency-division multiplexing (OFDM). It is incumbent on the test equipment manufacturer, therefore, to build a high degree of flexibility into the constellation analyzer so it can handle all possible modulation schemes.
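
To make the flexibility point concrete, the sketch below generates the ideal reference constellations for two different schemes. The point values and normalization are standard textbook definitions, used here only to illustrate why the analyzer's display and decision logic cannot be hard-wired to a single format.

```python
import numpy as np

# QPSK: four points on a circle, 2 bits per symbol.
qpsk = np.exp(1j * np.deg2rad([45, 135, 225, 315]))

# 16-QAM: a 4x4 grid of amplitude/phase combinations, 4 bits per symbol,
# scaled so the constellation has unit average power.
levels = np.array([-3, -1, 1, 3])
qam16 = (levels[:, None] + 1j * levels[None, :]).ravel() / np.sqrt(10)

print(len(qpsk), len(qam16))  # -> 4 16
```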

“From a test instrument manufacturer perspective, we have to take this into account,” admits François Robitaille, senior line product manager in EXFO’s Optical Business Unit, “and make sure that the hardware we are designing our instrument around is going to be futureproof and will be compatible with all those even more complex modulation schemes. The part that we have left when those new modulation schemes start to appear is to develop the proper algorithms to test them.”

Because the industry has yet to rally around a single modulation scheme, standards activity has also been notably absent.

“There are no standards from the ITU, IEC or IEEE that say, ‘Well, if you have a good 100-gig signal, this is what your constellation should look like,’” muses Robitaille. “There isn’t even a working group yet within those standards bodies to work on definitions of what would be an acceptable SNR [signal-to-noise ratio], for example, or what would even be the definition of SNR. We know the traditional definitions of [SNR] when we look at an eye diagram or just simply the intensity pattern, but do those definitions still apply to a constellation diagram?”

Despite uncertainty over which modulation scheme, if any, will win the day or what specifications the standards will include if and when they emerge, several companies introduced constellation analyzers this year, including Agilent, EXFO, and startup Optametra. Agilent and Optametra used OFC/NFOEC 2009 to launch the N4391A Optical Modulation Analyzer and the Coherent Lightwave Signal Analyzer for 40/100G physical layer test, respectively. EXFO’s PSO-200 Optical Modulation Analyzer was showcased at ECOC.

The test companies report interest from all sectors of the optical communications market, including the R&D labs of the component, subsystem, and system manufacturers as well as academic organizations, research centers, and government agencies around the world. They say constellation analyzers are also gaining traction among the service providers themselves, a development Robitaille calls “surprising,” given that commercial deployment is still several months or more away.

That said, he also notes that “there’s a big difference between what we’ve seen with 100 gig and what we’ve seen with other upgrades in technologies. There is a real need from the market, from the industry, for 100-gig capacity. That is really a market pull more than a technology push.”

This market pull has put pressure on the test equipment manufacturers to develop constellation analyzers capable of making complex measurements of various advanced modulation schemes. And while the constellation analyzer marks a departure from the conventional methods of measuring optical signals, its mission is not dissimilar from that of previous generations of test gear.

“A constellation analyzer should be able to analyze whatever is coming through the fiber and present it in a human-readable way,” summarizes van der Weide. “That is ultimately what any instrument is about, and the constellation analyzer is no different.”

Links to more information

Lightwave Webcast: Test Strategies for 40G/100G Technology Development
Lightwave Online: Agilent to Debut Optical Modulation Analyzer for 40/100G Physical Layer Test at OFC/NFOEC
Lightwave Online: EXFO Launches Turnkey Optical Modulation Analyzer for 100-Gbaud Signal Characterization

Meghan Fuller Hanna is senior editor at Lightwave.
