Improving the spectral efficiency of tomorrow's optical networks


SPECIAL REPORTS: Optical Networking & WDM

The deployment of line rates higher than 10 Gbits/sec will be driven more by economic factors than by capacity issues.

BRIAN LAVALLÉE, Nortel Networks


Even with the recent financial downturn in the global telecommunications industry, next-generation optical networks that implement higher bit rates in a structured manner still make good economic sense by better utilizing the available transmission spectrum. That spectrum use can be effectively described using the concept of spectral efficiency. In this context, the increased bit rate of 40-Gbit/sec systems alone does not guarantee the successful commercialization of these new solutions. Rather, what will ensure success is a seamless and cost-effective evolution from existing 10-Gbit/sec optical networks to a next-generation network that, regardless of the bit rates chosen, improves overall spectral efficiency in a structured manner.

Spectral efficiency is a measure of how well a transmission system, in this case an optical network, uses the available transmission spectrum to transmit information. It is defined as follows: Spectral efficiency = total data bandwidth transmitted/required transmission bandwidth. For example, an optical network using 40-Gbit/sec line rates at 100-GHz channel spacing yields the following: Spectral efficiency = 40/100 = 0.4 bit/sec/Hz, or 40%.
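The definition above is simple enough to capture in a short, illustrative snippet (the function name and unit choices are ours, not part of any standard API; Gbit/sec divided by GHz conveniently cancels to bit/sec/Hz):

```python
def spectral_efficiency(bit_rate_gbps: float, channel_spacing_ghz: float) -> float:
    """Spectral efficiency in bit/sec/Hz: total data bandwidth transmitted
    divided by the required transmission bandwidth."""
    return bit_rate_gbps / channel_spacing_ghz

# 40-Gbit/sec channels on a 100-GHz grid:
print(spectral_efficiency(40, 100))  # 0.4 bit/sec/Hz, i.e., 40%
```

The same function shows at a glance that 20 Gbits/sec at 50-GHz spacing gives the identical 0.4-bit/sec/Hz result discussed below.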

Consequently, this 40-Gbit/sec optical network has an architecture that is 40% efficient from a spectral-efficiency viewpoint. The rule of thumb for conventional optical transmission systems is that the minimum acceptable channel spacing for successful transmission is roughly 2.5 times the chosen bit rate. By "conventional," we mean systems that use intensity-modulated direct detection in conjunction with today's filter technologies. The minimum channel spacing would therefore be 100 GHz for 40-Gbit/sec systems and 25 GHz for 10-Gbit/sec systems.
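The rule of thumb can be sketched in the same style (the 2.5 multiplier is this article's heuristic for conventional intensity-modulated direct-detection systems, not a hard physical limit):

```python
def min_channel_spacing_ghz(bit_rate_gbps: float, factor: float = 2.5) -> float:
    """Rule-of-thumb minimum channel spacing for conventional
    intensity-modulated direct-detection systems: roughly 2.5x the line rate."""
    return factor * bit_rate_gbps

print(min_channel_spacing_ghz(40))  # 100.0 GHz
print(min_channel_spacing_ghz(10))  # 25.0 GHz
```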

The actual bit rates chosen are inconsequential to achieving higher spectral efficiencies. For example, the same spectral efficiency of 40% can also be achieved using 20-Gbit/sec bit rates and 50-GHz spacing. The goal is to maximize spectral efficiency while balancing the cost and performance of the network architecture. Generally speaking, the higher the bit rate, the better the spectral efficiency for the same channel spacing. Today's filter technology will likely constrain the minimum channel spacing of 10-Gbit/sec wavelengths, making the shift to 40 Gbits/sec at a larger spacing more enticing.

Spectral efficiency goal
Comparing the minimum channel spacing for 40- and 10-Gbit/sec systems yields a curious result: The same spectral efficiency is obtained for 10-Gbit/sec channels spaced at 25 GHz and for 40-Gbit/sec channels spaced at 100 GHz. However, the economic and operational advantages of 40-Gbit/sec systems become clearer when the difficulty and expense of achieving these equal results are examined.

Reducing 10-Gbit/sec channel spacing to an extremely tight 25 GHz quickly incurs optical transmission issues that must first be solved before commercial viability is achieved. For instance, nonlinear effects such as cross-phase modulation and four-wave mixing become much more pronounced and severely debilitating if not properly managed. Chromatic dispersion will also increase in prevalence as a result of this extremely aggressive channel spacing, thereby increasing the possibility of intersymbol interference. The optical components required for 25-GHz spacing also become increasingly complex to manufacture, especially the extremely precise multiplexer/demultiplexer filters required for combining and separating these very tightly packed wavelengths. Complex (and therefore costly) active control systems will likely be required to guarantee the expected end-of-life performance of the optical link, given that the physical characteristics of components and modules such as lasers and filters change with varying ambient conditions and component aging over time.

The question is not whether 25-GHz spacing can actually be achieved technically, but whether the cost in terms of operational expenditures (opex) and capital expenditures (capex) is justified. That's a matter of heated debate in the telecom industry and will continue to rage for some time, just as the argument raged between 2.5 Gbits/sec and the newly introduced 10 Gbits/sec a few short years ago. From a spectral-efficiency viewpoint, these two competing solutions are the same; but from a cost-versus-performance standpoint, the debate will continue until both technologies mature.

Improved spectral efficiency
The primary (and obvious) benefit of 40-Gbit/sec bit rates is that four times as much information is carried on a single wavelength as with 10-Gbit/sec systems. Generally speaking, significant opex and capex savings, so crucial to the economic viability of service providers, will also be achieved over the long term. The old cliché that "a penny saved is a penny earned" is extremely applicable to today's telecom industry and is forcing the entire industry, carriers and vendors alike, to be extremely diligent with respect to the financial implications of tomorrow's network offerings.

The ability to carry quadruple the amount of information on a single wavelength yields an optical line system that will ultimately require far less equipment (e.g., transmitter lasers and receiver detectors) than a comparable 10-Gbit/sec system. Vendors introducing 40-Gbit/sec equipment will offer denser systems in a physically smaller footprint for a given amount of capacity. That frees crucial floor space, which for many carrier sites has become an overriding concern due to the lack of space to grow and generate new revenues.

Less equipment also means less electrical power, thereby decreasing heat dissipation requirements and reducing the strain on ambient control systems. As systems become increasingly dense from consolidated electronics, the overall network power consumption will decline for a given capacity. This will decrease opex, yielding a more efficient and competitive network.

Denser equipment also means less overall equipment to manage by operations personnel. For an equivalent overall network capacity, the number of required wavelengths is reduced by a factor of four for every 40 Gbits/sec of capacity deployed. That quickly reduces the complexity of the deployed network from an operations perspective, since there are far fewer wavelengths to monitor and maintain. There is also a reduced burden with regard to the number of coupler ports required, since four 10-Gbit/sec wavelengths are now represented by a single 40-Gbit/sec wavelength.
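The factor-of-four reduction in wavelengths, transmitter/receiver pairs, and coupler ports is easy to quantify. Here is a minimal sketch (the function name and the 400-Gbit/sec demand figure are purely illustrative):

```python
import math

def wavelengths_needed(total_capacity_gbps: float, line_rate_gbps: float) -> int:
    """Number of wavelengths (and hence transmitter/receiver pairs and
    coupler ports) needed to carry a given total network capacity."""
    return math.ceil(total_capacity_gbps / line_rate_gbps)

# For an illustrative 400 Gbits/sec of demand:
print(wavelengths_needed(400, 10))  # 40 wavelengths at 10 Gbits/sec
print(wavelengths_needed(400, 40))  # 10 wavelengths at 40 Gbits/sec
```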

Since 40-Gbit/sec systems will likely be spaced at 100 GHz, they will be based on enhancements of existing 100-GHz filter technology rather than on the development of 25-GHz filters, which would require a significant leap in filter technology as well as a prove-in period in live field applications. An added benefit of 40-Gbit/sec line systems is that the number of connections to be cleaned is also reduced by a factor of four, thereby improving network deployment time and cost.

The ever-important factor of overall system reliability is also improved, given that much less equipment will be required for 40-Gbit/sec line systems. A significant reduction in overall required electronics coupled with the related power consumption reduction will ultimately improve the overall system reliability. With 40 Gbits/sec of information now riding on a single wavelength, reliability and availability of these new line systems are obvious concerns for service providers and consequently a main priority of all 40-Gbit/sec line-system equipment vendors. Reduced reliability is simply unacceptable.

Are there any significant hurdles to be surpassed before achieving the successful commercialization of optical networks offering higher spectral efficiencies than those available today? Of course, but these hurdles have been thoroughly researched over the past few years, resulting in novel approaches successfully demonstrated in customer/vendor labs and field trials. New technologies typically follow an S-curve that begins with an invention, commonly patented. In the early stages of introduction, innovations based on the original invention are fast and significant (steep slope), then slow down with less significant advances (reduced slope), and finally level off as the technology matures. Once a technology matures, it becomes very well understood and quickly attracts (too) many players into a once underserved market, where cost soon becomes the competitive weapon. Glancing at the current 10-Gbit/sec industry quickly reveals plentiful vendors in the value chain from components to systems, giving a good indication of where 10-Gbit/sec technology is now located within the technology S-curve.

Opex, capex drivers
The introduction of 40-Gbit/sec systems will start another S-curve, whereby technological advances will result in significant cost reductions when compared to existing 10-Gbit/sec technology. The primary hurdle remaining for implementation of more spectrally efficient technology is to significantly reduce the inherent material and manufacturing costs of the high-performance components required for 0.4-bit/sec/Hz systems, regardless of the actual chosen line rate, be it 40 Gbits/sec or 20 Gbits/sec. As with any newly introduced technology, however, costs will surely decrease when component designs and manufacturing costs are simultaneously improved as the industry quickly moves up the S-curve. Similar hurdles were encountered and subsequently surpassed when carriers migrated from 2.5-Gbit/sec to 10-Gbit/sec line systems.

Implementing 0.4-bit/sec/Hz line systems will lower the opex and capex of service providers; such cost reduction thus will be the primary business driver for these higher-bit-rate systems. Operating a network that is more efficient than competitors' networks is a distinct competitive weapon in the arsenal of a service provider and therefore a much sought-after goal.

From an opex perspective, a reduction in the amount of required transmission equipment will result in the significant ongoing savings so critical to the long-term health of a service provider. As mentioned previously, a reduction in floor space requirements also reduces opportunity costs, since additional equipment could now be installed in the liberated central-office space to offer more services and increase revenues. Existing equipment also could be consolidated into these denser platforms so as to further liberate floor space and reduce overall power consumption. The consolidated (replaced) line systems could then be redeployed in other parts of the network to avoid stranded assets.

As additional equipment is installed to keep pace with steadily increasing bandwidth demands, the power consumption of these network elements quickly becomes a significant opex item.

As previously mentioned, denser 40-Gbit/sec architectures require less overall (redundant) electronics. Not only is the cooling-system electrical power requirement reduced, but likewise so are the physical and electrical requirements related to rectifiers, backup battery strings, standby generators, and the physical power distribution plant. Incidentally, that will also reduce the capex for power feed equipment as the network grows.

Less equipment to maintain implies reduced operational activities (e.g., fewer wavelengths to turn up, maintain, and troubleshoot). That yields noteworthy savings in the time and personnel required to maintain healthy and profitable networks. Since overall reliability is improved due to less network equipment, the number of inevitable failures and maintenance activities is subsequently reduced as well. The much-quoted figure of five-nines (99.999% network availability) still allows for some failure scenarios, but the number of failures falls quickly as overall system reliability improves.
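To put five-nines in concrete terms, a quick back-of-the-envelope calculation (ignoring leap years) shows the downtime such an availability figure still permits:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def annual_downtime_minutes(availability: float) -> float:
    """Downtime per year, in minutes, permitted by a given availability."""
    return (1 - availability) * MINUTES_PER_YEAR

# Five-nines still allows roughly 5.26 minutes of downtime per year:
print(round(annual_downtime_minutes(0.99999), 2))
```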

From a capex perspective, significant benefits will also be achieved when deciding to deploy 40-Gbit/sec optical line systems. Less network equipment needs to be purchased at 40 Gbits/sec than at 10 Gbits/sec for an equivalent capacity. The inherent costs of 40-Gbit/sec components will inevitably be reduced as designs and manufacturing processes are improved, yielding very favorable economies of scale. That will result in rapidly declining costs for 40-Gbit/sec optical line systems that will be very similar to what was experienced in early 10-Gbit/sec line-system deployments.

On the horizon
Higher supported bit rates (and resulting capacity gains) alone will not drive the demand for 0.4-bit/sec/Hz optical networks. In fact, the specific line rate itself is rather inconsequential to its commercial success. The decision to jump to higher line rates will ultimately be financial in nature, due to the achieved capex and opex savings.

Carriers will have the option to continue deploying existing 10-Gbit/sec systems or to deploy more spectrally efficient solutions. Since each network deployment is unique (due to the number of factors influencing related route link budgets), there will surely be cases in which 10 Gbits/sec is optimal as well as cases in which higher line-side bit rates are optimal. Thus, some greenfield network deployments will lean toward higher-bit-rate architectures, while others will lean toward 10-Gbit/sec approaches. That is quite normal, as history has consistently proved that one technology rarely, if ever, completely obsoletes another.

So the obvious question is, where exactly is this decision crossover point from the existing 10-Gbit/sec systems to the next generation of 0.4-bit/sec/Hz systems? The answer is not easy, given that numerous tradeoffs must first be balanced and are driven by the specific short- and long-term network needs of the service provider. For instance, how much remaining capacity is available on the carrier's existing 10-Gbit/sec network and how much increased capacity is forecasted and when? Will this be a completely new build? Is the carrier obligated to use an old fiber plant?

A myriad of specific network scenarios exists, making a general statement about a specific crossover point unrealistic. The bit rates chosen to implement the improved spectral efficiency of 0.4 bit/sec/Hz will ultimately be determined by the business case that incorporates the best combination of capex and opex savings specific to the current and future needs of each service provider.


Brian Lavallée, PE, is senior manager of systems engineering in the Nortel Networks (Brampton, Ontario) optical Internet line of business.
He can be contacted at brianlav@nortelnetworks.com.

