By Stephen Hardy
Millions of words, bits, pixels, and other carriers of information have sacrificed their lives over the past several years to promote the benefits of coherent receiver technology paired with phase-modulated transmission for 40- and 100-Gbps signal transport. We’ve learned that (1) the coherent/phase-modulation combination should overcome the barriers chromatic and polarization-mode dispersion might present, (2) carriers shouldn’t need to redraw existing 10-Gbps network architectures, and (3) after extensive development efforts, such technology has reached deployment.
What remains something of a mystery is how to test the new technology in the field. This information gap exists partly because test equipment manufacturers haven't yet figured out how to deliver field-friendly versions of the instruments they currently provide to technology developers. More tellingly, though, the test and measurement community also isn't sure what functionality carriers will require. The uncertainty stems from the attributes of the coherent technology itself – which may well mean that the providers of this technology will bear more responsibility for planning, monitoring, and troubleshooting next-generation coherent networks than previous technology generations required.
So what’s the problem?
As a large percentage of the aforementioned words, bits, and pixels have described, the latest generation of single-wavelength 100- and 40-Gbps transmission technology became necessary when conventional on/off keying (OOK) amplitude modulation formats proved inadequate for carrier requirements. These requirements included the ability to match the reach of existing 10-Gbps transmission without requiring planners to reconfigure their networks.
To overcome the limitations of amplitude modulation, most systems vendors adopted implementation agreements from the Optical Internetworking Forum (OIF) that called for the use of dual-polarized quadrature phase shift keying (DP-QPSK, sometimes referred to as PM-QPSK for "polarization multiplexed") as a better modulation format. These agreements also paired DP-QPSK with coherent detection, a receiver technology lifted from the wireless realm.
DP-QPSK replaced amplitude modulation with phase modulation; the "quadrature" element means each symbol occupies one of four phase states, encoding 2 bits per symbol. Combined with transmission on two polarizations, this lowers the baud rate of each signal's stream to a quarter of the aggregate bit rate, lessening its susceptibility to impairments. Meanwhile, coherent detection uses a combination of digital signal processing, advanced algorithms, and enhanced forward error correction to recover the signals from those four phases, regardless of the chromatic or polarization-mode dispersion encountered within the link.
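The baud-rate arithmetic behind DP-QPSK can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the 112-Gbps line rate is an assumed figure representing a 100-Gbps payload plus FEC and framing overhead, and the Gray-coded bit-to-phase mapping is one common convention.

```python
import cmath

# One common Gray-coded mapping of bit pairs to the four QPSK phase quadrants
QPSK_MAP = {
    (0, 0): cmath.exp(1j * cmath.pi / 4),    #  45 degrees
    (0, 1): cmath.exp(3j * cmath.pi / 4),    # 135 degrees
    (1, 1): cmath.exp(-3j * cmath.pi / 4),   # 225 degrees
    (1, 0): cmath.exp(-1j * cmath.pi / 4),   # 315 degrees
}

def qpsk_symbols(bits):
    """Map a flat bit stream to QPSK symbols (2 bits per symbol)."""
    return [QPSK_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

bit_rate_gbps = 112          # assumed: 100G payload plus FEC/framing overhead
bits_per_symbol = 2          # QPSK: four phases encode 2 bits
polarizations = 2            # dual polarization carries two parallel streams
baud_rate = bit_rate_gbps / (bits_per_symbol * polarizations)
print(f"{bit_rate_gbps} Gbps DP-QPSK runs at {baud_rate} Gbaud")  # 28.0 Gbaud
```

The quarter-rate symbol stream is the point: a 28-Gbaud signal tolerates dispersion far better than a 112-Gbaud OOK signal would.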
|The price of high-end real time oscilloscopes for use with optical modulation analyzers has come down significantly – but is still too high for most field applications. Courtesy of LeCroy Corp.|
Coherent detection's advanced processing capabilities present an immediate potential benefit to network planners within the context of test and measurement. If the technology works as advertised, then it really doesn't matter how much chromatic or polarization-mode dispersion is present on a given 100-Gbps link. That means it should no longer be necessary to test for such dispersion before putting a new route in place (provided that the link won't also be expected to carry lower-speed OOK-modulated traffic). And if you want such measurements anyway, there probably is a way to extract that information from the transmitters and receivers on the line.
|Optical modulation analyzers present significantly more complex information than the average carrier field technicians would likely need to see. Courtesy of Agilent Technologies|
Early adopters of coherent technology already have made this realization. For example, while he admits that Verizon still performs dispersion testing on new fibers for coherent 100-Gbps links, Glenn Wellbrock, director of optical-transport-network architecture and design at the international services provider, reports, "I'm pushing back on the guys on that because the new transmitter/receivers will give us just as accurate of a measurement as the test equipment will on what the chromatic dispersion is and also what the differential group delay is from an instantaneous point of view. And they can also average it out over time and we can get the polarization-mode dispersion coefficient. So it will give us as much information as the test equipment – actually more, because it's running on it full time."
However, such in-system intelligence comes at the price of extreme technology complexity, compounded by the fact that each systems vendor has its own proprietary means of coherent detection. And Wellbrock, for one, has no interest in keeping track of these different implementations. "We tried it at 40G and it was a disaster," he says. “For an operator, it just doesn't make any sense, because every one of them [i.e., the 100-Gbps systems] is different; it’s really, really expensive; and even if you are to look at [the received signal], since you didn't design the equipment, there's really not much you can determine from it."
But the problems with testing the coherent signals don't end with complexity and proprietary implementations. According to Francois Robitaille, senior product manager within the Optical Business Unit of EXFO Inc.'s Wireline Division, three other areas remain challenging from a test and measurement perspective:
1. How do you test optical signal-to-noise ratio (OSNR)? As its name implies, OSNR requires the ability to differentiate signal from noise. That was relatively straightforward up to 10 Gbps with amplitude modulation because the signal width was significantly narrower than the channel width. This difference between signal and channel width narrowed with the advent of the alphabet soup of wider 40-Gbps transmission formats (DPSK, DQPSK, etc.). Therefore, a new method of determining OSNR, called “polarization nulling” or “polarization diversity,” came into favor with the blessing of the IEC. However, the use of polarization multiplexing in coherent-enabled transmission renders this new method ineffective – and nothing has yet been agreed upon to replace it for coherent applications.
2. How do you detect interference between coherent and conventional traffic? Despite claims to the contrary from some systems vendors, coherent and conventional streams don’t mix well on all links, Robitaille asserts. In particular, the relatively greater power of conventional transmissions can cause crosstalk and other impairments on coherent signals – impairments that current field test equipment can’t detect. Guard bands around the coherent transmission are a commonly proposed solution to the interference problem – but on long links, how can you tell which route segments need guard bands if you can’t determine where the problems are arising?
3. Relying on the systems’ transmitters and receivers to monitor performance is fine – as long as they’re working properly. But what happens when they're not? Robitaille wonders.
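The OSNR measure at the heart of item 1 is simple in principle, even if isolating the noise term is not. As a minimal sketch (not a measurement procedure), OSNR is the ratio of signal power to amplified-spontaneous-emission noise power, conventionally with the noise referenced to a 0.1-nm bandwidth; the power values below are illustrative.

```python
import math

def osnr_db(signal_power_mw, noise_power_mw):
    """OSNR in dB: ratio of signal power to ASE noise power.
    By convention the noise power is referenced to a 0.1 nm bandwidth."""
    return 10 * math.log10(signal_power_mw / noise_power_mw)

# Illustrative values: 1 mW of signal over 0.0001 mW of in-band noise
print(f"OSNR = {osnr_db(1.0, 0.0001):.1f} dB")  # OSNR = 40.0 dB
```

The field-test difficulty the article describes is upstream of this formula: with polarization-multiplexed signals, polarization nulling can no longer separate `signal_power_mw` from `noise_power_mw` in the first place.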
The answer to this last question – and eventually to questions 1 and 2 as well – is probably a new test instrument or two designed specifically for coherent applications, he believes. But what would such instruments look like?
The future faces of coherent test strategies
Technology developers, of course, can buy optical modulation analyzers from a handful of vendors – including Agilent Technologies, Aragon Photonics Labs, EXFO, Tektronix, and, most recently, Southern Photonics – to evaluate coherent technology performance. The instruments work with oscilloscopes of different types (depending upon the analyzer vendor) to enable characterization of DP-QPSK and other phase-modulated signals. Each analyzer currently on the market has its pluses and minuses; all enjoy good reputations.
Robitaille says that a field version of such an instrument set would solve many of the challenges discussed above, including the OSNR quandary. However, there are several very difficult challenges that would have to be solved to make a field instrument viable:
- The obvious first step is to combine two bench-top instruments into a single hand-portable unit – which no one has done so far.
- The resulting instrument must be easy for field technicians to use. The developers who use the current generation of optical modulation analyzers and oscilloscopes have engineering degrees – and need them.
- The cost of the instrument must reach palatable levels for field applications. Right now, the bill for a full-featured oscilloscope and optical modulation analyzer can reach into multiple hundreds of thousands of dollars.
Robitaille and others in the test-instrument field with whom Lightwave spoke indicate that a solution to all of these problems won’t be found anytime soon. But that doesn't mean test equipment companies aren’t working on interim steps. For example, while he’s careful not to tip EXFO's hand in terms of upcoming products, Robitaille says an optical spectrum analyzer (OSA) with “advanced features” might make sense in terms of addressing OSNR. “The challenge is to bring a technology in the same price range as an OSA using relatively similar approaches,” he allows. “Does it exist today? So far the work we’ve done doesn’t give us clear answers if it's going to work or not.”
Another approach would use a standard OSA in combination with some other, new unit that might create some sort of reference signal from which actual system results could be predicted, he adds.
|The advent of 100-Gbps coherent transponders may make it easier to develop field test platforms for next-generation networks. Courtesy of EXFO Inc.|
Meanwhile, a bit-error-rate test set (BERT) that could work with coherent technology also would make sense, Robitaille believes. “We think there will eventually be a need for a kind of a coherent-based tester for the field, just to give a BERT certificate, an independent view of your signal, view of your network, allowing you to test how your network will behave along the way,” he explains.
The advent of 100-Gbps coherent transponders, expected in the spring, should hasten development of such a BERT instrument as well as other test equipment for coherent networks. However, one problem would remain unsolved: the proprietary nature of most coherent-enabled transmission systems. A transponder-based test set could tell you how a coherent system might operate on a specific link, but it couldn't guarantee that your deployed system would perform the same way unless that system uses the same transponder.
What’s a carrier to do?
The current uncertainty in the test equipment community about how to address coherent networks in general and proprietary approaches to coherent technology in particular would appear to leave carriers in a tough spot. Fortunately for them, the 100-Gbps coherent systems vendors themselves appear ready to plug much of the test-capability gap.
As previously discussed, the intelligence within the transmission systems themselves can provide a fairly clear view of what's happening on the link. Wellbrock reports that systems vendors are incorporating even more performance measurement capabilities in their systems. For instance, he expects vendors to deliver early this year the ability to inject pseudorandom bit sequence (PRBS) patterns into potential new links. The feature would significantly ease acceptance and commissioning tests for new line cards. Use of a patch cord would enable round-robin testing that includes the client-side interfaces.
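A PRBS pattern of the kind Wellbrock describes is generated by a linear-feedback shift register. The sketch below produces PRBS-7 (polynomial x^7 + x^6 + 1), a short member of the family defined for test purposes; real 100-Gbps acceptance testing typically uses longer patterns such as PRBS-31, and this is an illustration of the principle rather than any vendor's line-card feature.

```python
def prbs7(n):
    """Generate n bits of the PRBS-7 sequence (polynomial x^7 + x^6 + 1)
    using a Fibonacci linear-feedback shift register."""
    state = 0x7F  # any nonzero 7-bit seed works
    out = []
    for _ in range(n):
        # Feedback: XOR of the taps at bit positions 7 and 6
        newbit = ((state >> 6) ^ (state >> 5)) & 1
        state = ((state << 1) | newbit) & 0x7F
        out.append(newbit)
    return out

# A maximal-length 7-bit LFSR repeats every 2**7 - 1 = 127 bits
seq = prbs7(254)
print(seq[:127] == seq[127:])  # True
```

A BERT simply regenerates the same sequence at the receiver and counts mismatches against the recovered bits, which is why a known, self-synchronizing pattern like this is the standard stimulus for acceptance and commissioning tests.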
Meanwhile, Wellbrock has no qualms about relying on the network planning and provisioning software tools most vendors provide with their systems to tell him how well a coherent-enabled link should work in a particular scenario. “From our point of view,” he explains, “we're looking at it saying if we've got a problem in the field, then that means the planning tool was wrong – and that means we’re going to get the systems supplier involved in it. And expect them to bring the test equipment that's necessary in order to troubleshoot it.”
In essence, Wellbrock has decided to embrace the fact that what happens at the signal level will remain a mystery. “If we do have errors on the line side, the only thing we can do really is swap the card,” he says.
Such a philosophy naturally would limit the number of 100-Gbps test sets Verizon might buy, a fact Wellbrock acknowledges. “I know that’s not what the JDSUs and the EXFOs and all those guys really want to hear,” he adds. “Their equipment doesn’t go away; I don’t want to mislead you or have that take away. Just that we would use it on special occasions and not for normal test and turn-up and troubleshooting. So we would still have this stuff around, and we would ship it out when we needed it or each region would have one. But you wouldn’t have to have a 100G test set in everybody’s truck like we used to do with 10G test sets.”
While Verizon might be able to make more demands on its equipment suppliers than your average carrier, it seems likely that systems vendors would make the same test-friendly features available to other customers. That would make 100-Gbps technology deployment simpler than might be assumed – perhaps, in some ways, simpler than initial 10-Gbps deployments 10 years ago.
That, of course, is good news for carriers, but not so good for test equipment vendors, who may have to sell gear to systems vendors and systems integrators to recoup their R&D investments. How well that will work remains uncertain – which, right now, remains the general state of coherent-enabled technology testing in the field.