The continued evolution and growing use of high-bandwidth services should drive the need for increased optical network capacity for several years to come. Today, 40G systems are being deployed in volume, 100G systems are starting to be introduced, and 400G and even 1000G systems are being contemplated in research laboratories. Yet despite these efforts to bring high-bit-rate systems to market, the majority of deployments for many years to come will clearly be based on proven and cost-effective 10G technology.
Nevertheless, ensuring a viable migration path from 10G, potentially through 40G, up to 100G is of the utmost importance for carriers to avoid costly new builds. Consequently, the new 40G and 100G systems are normally required to be capable of coexisting with 10G systems and function within 10G design rules. Because 10G transport technology relies heavily on optical dispersion compensators (ODCs), it is vital to take dispersion compensation strategy into account when contemplating future upgrades for existing and planned 10G deployments.
The majority of the ODCs in legacy terrestrial networks are based on dispersion compensating fiber (DCF) technology, which has a significant impact on system and link design rules. In recent years, fiber Bragg grating (FBG) technology has found applications in optical transport networks in the form of dispersion compensating modules (DCMs). The introduction of FBG-DCMs has improved the cost and performance of today’s networks by providing low latency, low insertion loss, full dispersion matching, a superior form factor, and an absence of non-linearities. As I will discuss, these features should benefit coherent-based systems as well as today’s direct-detection signaling formats.
FBGs and coherent technology
The catalyst for the recent interest in coherent technology is not necessarily a need to address chromatic dispersion (CD) impairments; cost-effective solutions for CD already exist for high-bit-rate systems. Rather, coherent technology is attractive for its ability to compensate for the polarization mode dispersion (PMD) issues that can arise depending on the quality of the available fiber as bit rates increase.
Although optical PMD compensators may improve the situation somewhat, they have limited compensation range. There are also questions regarding their ability to respond to fast variations of the polarization state within operational networks. Thus, despite higher complexity and the cost of implementation, coherent’s promised ability to address all fiber types has proven alluring.
It seems the transition point to coherent technology will be at 100G. For more challenging links it is believed coherent will also find a place at 40G. However, the existence of working direct-detection systems at a more attractive price point may prove a difficult hurdle for 40G coherent to overcome.
In greenfield applications, coherent’s digital signal processing also can compensate for bulk CD even in quite long links. This, in principle, should make inline optical compensation redundant in such scenarios. However, as will be discussed later, there may be instances where it still may be beneficial to use dispersion compensation in combination with coherent detection.
Another important property to consider when comparing coherent with classical direct-detection systems is latency. It is often asserted that latency will be much lower in a coherent system than a direct-detection system because the former does not require optical dispersion compensation. One should remember, however, that such statements are made based on the latency introduced by DCF-based dispersion compensation, not FBG-based dispersion compensation. The fact is that a classical direct-detection system using FBG-DCMs actually yields lower latency than a corresponding coherent system, even if one removes the ODCs from the coherent link. The delays associated with the computational requirements for impairment mitigation in the digital signal processor (DSP) as well as the more complex forward error correction (FEC) strategies typically used account for coherent’s greater latency.
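The latency comparison above can be made concrete with a back-of-envelope estimate. The sketch below is illustrative only; the fiber parameters (group index, DCF dispersion coefficient, link length) are typical assumed values, not figures from this article:

```python
# Back-of-envelope estimate of the latency added by DCF modules on a link.
# All parameter values below are illustrative assumptions, not measured data.

C_VACUUM_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s
GROUP_INDEX = 1.468              # typical group index of silica fiber (assumed)

def fiber_delay_us(length_km: float) -> float:
    """One-way propagation delay through a fiber of the given length, in microseconds."""
    return length_km * GROUP_INDEX / C_VACUUM_KM_PER_S * 1e6

# Assumed link: 1,000 km of standard SMF at +17 ps/nm/km,
# fully compensated by DCF with a dispersion of -100 ps/nm/km.
link_km = 1_000.0
d_smf = 17.0     # ps/nm/km
d_dcf = -100.0   # ps/nm/km

dcf_km = link_km * d_smf / abs(d_dcf)    # DCF length needed: 170 km
dcf_latency_us = fiber_delay_us(dcf_km)  # roughly 830 microseconds of added delay

# An FBG-based DCM reflects the signal inside a short grating, so its
# contribution is on the order of nanoseconds per module -- negligible here.
print(f"DCF length: {dcf_km:.0f} km, added latency: {dcf_latency_us:.0f} us")
```

Under these assumptions the DCF alone adds close to a millisecond over the link, whereas FBG-DCM latency is negligible, which is the basis for the comparison with coherent DSP/FEC delays made above.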
Technology and transmission
As just mentioned, the DSP in the coherent receiver compensates for linear impairments such as residual CD and PMD, as well as some intra-channel non-linearities – and it does so very efficiently. It will be difficult for channel-specific tunable optical CD and PMD compensators to compete with coherent detection, at least at the very highest bit rates.
In the case of mixed traffic, however, all 10G and non-coherent 40G links traversing the same fiber as a coherent bit stream will still require inline optical compensators. This poses quite a challenge, since the advanced modulation formats and detection schemes used for coherent transponders still may be quite sensitive to non-linearities. In particular, coherent polarization-multiplexed quadrature phase-shift keying (PM-QPSK), the most commonly used modulation format for both 40G and 100G coherent applications, suffers from cross-phase modulation (XPM), induced four-wave mixing, timing jitter, non-linear phase noise, and polarization scattering. As a consequence, PM-QPSK channels are very sensitive to surrounding 10G channels and may require guard bands, an issue that not only complicates things for operators but also reduces spectral efficiency.
In a comparison between DCF, a Periodic Group Delay (PGD, such as a channelized FBG) dispersion compensator, and pure DSP compensation for 43G and 112G PM-QPSK transmission, it was found that for 43G transmission there was a 1-dB improvement in non-linear tolerance with pure DSP compensation compared to inline DCF compensation (C. Xie, ECOC 2009, paper P4.08). With PGD compensators, however, a further improvement of 3 dB compared with DSP compensation was found. At 112G the differences were smaller, but there was 1-2 dB to be gained from using PGD compensation.
These numbers were derived from a system carrying only coherent channels. When neighboring channels were replaced with amplitude-modulated 10G channels, the performance of the pure coherent transmission was severely degraded. In this case, a 5-dB increase in non-linear tolerance could be obtained for both the 43G and 112G PM-QPSK signals by using in-line PGDs. Furthermore, Xie has also shown that by modifying the PM-QPSK format to introduce a time-interleaved polarization scheme, a higher tolerance to non-linearities can be achieved in a dispersion-managed link than in one without any dispersion compensation at all.
When deciding whether or not to use dispersion compensation in a pure coherent build there are a number of tradeoffs to consider. Handling bulk dispersion for long links in the DSP may require unnecessarily long finite impulse response (FIR) filters, which imply increased chip complexity, size, and power consumption. In the case of the latest generation of ultra-long-haul systems, e.g., submarine links without dispersion-managed fibers, the need for optical bulk dispersion compensation becomes even more evident when accumulated dispersion of several hundred thousand picoseconds per nanometer must be addressed.
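The FIR-length tradeoff can be sketched with a commonly used rule of thumb: the CD-induced delay spread is roughly the accumulated dispersion times the signal's spectral width. The rule of thumb and all numbers below are assumptions for illustration, not figures from this article:

```python
# Rough estimate of the time-domain FIR tap count needed to undo chromatic
# dispersion in the DSP. Rule of thumb and parameter values are illustrative
# assumptions only.

C_NM_PER_S = 2.998e17  # speed of light, in nm/s

def cd_fir_taps(disp_ps_per_nm: float, baud_ghz: float,
                samples_per_symbol: int = 2,
                wavelength_nm: float = 1550.0) -> int:
    """Approximate tap count: CD delay spread across the signal bandwidth,
    sampled at samples_per_symbol times the baud rate."""
    baud_hz = baud_ghz * 1e9
    signal_bw_nm = wavelength_nm**2 * baud_hz / C_NM_PER_S  # spectral width
    delay_spread_s = disp_ps_per_nm * 1e-12 * signal_bw_nm  # dT = D*L*d(lambda)
    return round(delay_spread_s * samples_per_symbol * baud_hz)

# Terrestrial long-haul example: ~2,000 km of SMF -> ~34,000 ps/nm at 32 GBd.
print(cd_fir_taps(34_000, 32))   # hundreds of taps

# Uncompensated submarine example: several hundred thousand ps/nm.
print(cd_fir_taps(300_000, 32))  # thousands of taps
```

Because the tap count scales linearly with accumulated dispersion, removing the bulk of the dispersion optically shrinks the FIR stage, and with it chip size and power, by a corresponding factor.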
The problem of noise when compensating for optical impairments in the electrical domain also needs to be considered, e.g., laser phase noise impact on tap coefficients. It can be argued that the fastest and cleanest way of compensating for bulk dispersion is in the optical domain and that at least systems with tight requirements on reach and OSNR would benefit from FBG-based optical bulk compensation, especially by virtue of the FBGs’ linear compensation behavior.
One should also remember that the DSP is an integral part of the receiver. Because it operates on a per-channel basis, the increase in total power dissipation in a fully loaded node may be significant. Conversely, inline passive optical bulk compensation can handle one or many channels simultaneously while consuming no power whatsoever.
Tailored pre/post FBG compensation may further make life easier for coherent systems designers when it comes to ensuring a favorable transmission case, especially in terms of non-linearity management.
For direct-detection long-haul systems, residual dispersion and phase ripple have always been concerns when using FBGs. For 10G systems, this has not been a major issue since FBGs are available that are specified to ensure a low enough penalty. The situation, however, is more challenging for 40G and 100G transmission. Although fixed FBG-DCMs (in combination with their tunable counterparts) are already deployed in various 40G long-haul networks, the majority of 40G/100G FBG-DCMs are believed to be deployed in shorter (metro or regional) networks.
As the price, quality, and availability of 40G/100G FBG-DCMs steadily improve, it is highly likely that direct detection using FBG-based dispersion compensators will remain the most cost-effective optical transport strategy, at least at 40G. Where the actual inflection point in technology capability and price will occur, however, must today be deemed uncertain.
When considering coherent technology and high bit rates in long-haul networks, accumulated phase ripple and residual dispersion actually become less of a problem than for the 10G equivalent network. The ability of coherent techniques to perform fast signal processing on the received optical field means the influence of the phase ripple can easily be mitigated, hence making the FBG-DCM a close to perfect component for use in these more challenging networks.
As the migration upwards in bit rates continues, the technology choices will as always be governed by economics. The approach that satisfies a specific customer need at the best price per bit will surely win the largest market share in the long term.
That said, aspects like ensuring viable migration paths, bit-rate coexistence, environmental responsibility, and specific service offerings will still greatly influence the technology choices for specific network upgrades and build-outs for the foreseeable future.
To answer the question posed in the title, i.e., is there a place for FBG-based dispersion compensation in future high-bit-rate networks, the answer is “yes” for a number of reasons:
- Complexity: Compensating for large amounts of dispersion in the electrical domain adds complexity to the DSP. The high tap count associated with the long FIR filters needed to deal with dispersion could be reduced considerably if the DSP were designed to handle only residual and time-varying dispersion.
- Noise: Many noise-related issues are directly associated with mitigating optically induced impairments in the electrical domain. Handling at least the bulk of a link’s dispersion in the optical domain can prove vital for optimizing reach and OSNR in coherent systems.
- Power: The importance of power consumption is not likely to lessen in the future. The ability to simultaneously compensate, in the optical domain, all channels in the C-band with a totally passive device will remain of interest. The potential savings are tremendous, considering that any power reduction achieved in the DSP is multiplied by the number of channels in the system.
- Latency: Adding latency is never good practice. FBG-based dispersion compensation is virtually latency-free, a property especially important considering such growing application areas as cloud computing, video broadcasting, high-frequency trading, and online gaming.
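The power argument in the list above reduces to simple multiplication. The per-channel figure below is a hypothetical placeholder chosen only to show the arithmetic, not vendor data:

```python
# Illustrative node-level power arithmetic for the bullet points above.
# Both figures are hypothetical placeholders, not vendor data.

channels = 80                  # fully loaded C-band system (assumed)
dsp_cd_watts_per_channel = 3   # assumed share of DSP power spent on bulk-CD FIRs

# A passive FBG-DCM compensates every channel at once while drawing no power,
# so any per-channel DSP saving multiplies across the whole node:
node_saving_watts = channels * dsp_cd_watts_per_channel
print(f"Potential node-level saving: {node_saving_watts} W")  # prints 240 W
```

Even a few watts shaved off each receiver’s CD-compensation stage thus translates into hundreds of watts per fully loaded node.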
It has been argued that the introduction of coherent technology will be the end of optical dispersion compensation. However, given the unique properties of FBGs and the various optical transmission challenges that need resolution, I am convinced that optical dispersion compensation indeed has an important role to play in future optical network deployment, especially as bit rates increase.
Fredrik Sjostrom is product line manager, DCM at Proximion Fiber System AB.