Traditional BERTs evolve into diagnostic tools
By MEGHAN FULLER
Long a staple in both the R&D and field environments, bit-error-rate testers (BERTs) are evolving to what one vendor dubbed "a BERT on steroids." A simple BER measurement, though still necessary, is no longer enough for today's end users. "Where it really starts to get interesting is in what you do with the error-rate measurement once you've made it," contends Tom Waschura, co-founder and principal engineer at SyntheSys Research (Menlo Park, CA).
The BER itself is a ratio, explains Waschura, "and the actual measurement is very simple. It's the number of bits you have received in error divided by the number of bits total that you've tried to send."
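Waschura's ratio can be written down directly. The sketch below is illustrative only; the error and bit counts are hypothetical, not measurements from any instrument described in this article:

```python
def bit_error_rate(errored_bits: int, total_bits: int) -> float:
    """BER = number of bits received in error / total bits sent."""
    if total_bits == 0:
        raise ValueError("no bits transmitted")
    return errored_bits / total_bits

# Hypothetical example: 3 errored bits out of 1 billion transmitted.
print(bit_error_rate(3, 10**9))  # 3e-09
```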
Traditional BERTs are sort of like thermometers, he suggests. "You stick a thermometer in your mouth and it says, '103 degrees.' And then it's done. Is that good? Is it bad? It's totally up to you," Waschura offers.
Diagnostic device testing
BERT users in the R&D environment are asking for improved diagnostics, claims Michael DeBie, director of business and market development at WaveCrest (Eden Prairie, MN). "They tend to be focused, first and foremost, on functionality. Does the thing actually work? Does it transition from 1s to 0s and 0s to 1s accurately, and does it stay in that transition for a long enough duration so that a downstream receiver will be able to recognize the data accurately?"
End users want to be able to study error locations and determine the correlations, pattern sensitivities, and frequency content of errors to isolate a systematic error versus a random error. "Being able to know 'when' is a very important little feature when it comes to debugging the problem," says Waschura.
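One way to act on the "when" of errors is to look at the spacing between consecutive error locations: if errors recur at a fixed bit interval, they are likely pattern-correlated (systematic) rather than random. The sketch below is a minimal illustration of that idea, not any vendor's diagnostic algorithm; the function name, threshold, and error positions are all hypothetical:

```python
from collections import Counter

def dominant_error_interval(error_positions, min_fraction=0.5):
    """Return the gap (in bits) between consecutive errors if one gap
    accounts for at least min_fraction of all gaps, else None.
    A dominant gap suggests a systematic, pattern-correlated error."""
    gaps = [b - a for a, b in zip(error_positions, error_positions[1:])]
    if not gaps:
        return None
    interval, count = Counter(gaps).most_common(1)[0]
    return interval if count / len(gaps) >= min_fraction else None

# Hypothetical log: errors recurring every 127 bits would hint at a
# sensitivity tied to a 127-bit repeating test pattern.
print(dominant_error_interval([5, 132, 259, 386, 513]))  # 127
```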
Driving this trend toward more diagnostic information is the greater need for interoperability, which is driven by the activities of the various standards bodies, says John Perlick, director of product marketing at WaveCrest's optical division. "The key point is to be able to plug in many different manufacturers' parts, and the system still works," Perlick explains. "The standards bodies have been pushing really hard to go beyond a basic functional BERT that says 'go/no go.' Instead, they've been forcing manufacturers to get down to the next level of diagnostic information to be able to say, 'This is an acceptable part,' or 'No, this isn't an acceptable part.'"
On the system level, says Patric Ostiguy, product manager at EXFO Electro-Optical Engineering (Quebec City), "a BERT normally comes with more functionality. In the same gear, you usually have some inherent protocol analysis functions."
Now, however, many end users in the field are also demanding greater integration of BERT functions and optical parametric testing, including optical spectrum analysis, polarization-mode dispersion analysis, and chromatic dispersion analysis. "As DWDM and some of the optical-based technologies become more widely deployed in the field, it's the same technicians in some instances that maintain both the optical testing and the BER testing," observes Zvonimir Turcinov, director of marketing at Acterna (Germantown, MD). "Sometimes they do use OSAs [optical spectrum analyzers] and traditional optical test equipment at the same time as they use BERT equipment, so it makes sense to combine the two."
The move toward optical testing makes sense, concurs Bhavik Vyas, product manager at Agilent Technologies (Palo Alto, CA). "A lot of the signals that you are testing are actually optical. Historically, BERTs on the physical layer have been all-electrical, which means that you always have to put an optical-to-electrical converter or an electrical-to-optical converter in front of it so it can read the right signal," Vyas adds.
Integrating optical and BER testing also lends itself to improved diagnostic analysis. If operators are dedicating 12 hours for a BER test, they want to get as much information as possible, "not just a go/no go," says Ostiguy. "The fact that you can integrate both optical measurement parameters and the BERT in the same test platform does allow you to have that. It allows you to have a graph over time of your BER as well as a graph over time of your OSNR [optical signal-to-noise ratio]. Then you can correlate the events in time and associate, pinpoint, and troubleshoot the problem."
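The correlation Ostiguy describes can be sketched with two time-aligned series: if log-BER tracks OSNR dips closely, the impairment is likely noise-related. This is a toy illustration under assumed data; the sample values and helper function are hypothetical, not a test platform's actual method:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical hourly samples: the OSNR dip (dB) coincides with a
# BER spike (shown as log10 of BER), giving a strong negative correlation.
osnr_db = [30, 30, 20, 30, 30]
log10_ber = [-12, -12, -6, -12, -12]
print(round(pearson(osnr_db, log10_ber), 6))  # -1.0
```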
Combining several measurements into one instrument also leads to lower initial costs for the end user. BERTs tend to be expensive. "Typically," says SyntheSys Research's Waschura, "the prices are around $45,000 for 1.5 Gbits/sec, $95,000 for 3 Gbits/sec, $200,000 for 13 Gbits/sec, and $900,000 for 43 Gbits/sec."
While a 40-Gbit BERT provides the requisite functionality and flexibility for the R&D lab, "when you move to manufacturing, you don't want all that cost in an instrument that doesn't do anything else but work at one rate," reasons Vyas. "As the cost of components comes down over the years, if your manufacturing costs stay the same, then you've got quite a problem in trying to keep your profits up and your efficiency of operation up."
While BER testing is historically done with pseudo-random data, future incarnations may test systems and devices against purposely degraded signals, say the folks at Circadiant Systems (Allentown, PA). Driving this trend is the IEEE 802.3ae draft standard for 10-Gigabit Ethernet, which specifies that a product should be tested against degraded optical signals. "[The telecom people] want to guarantee performance under adverse conditions," says Joey Thompson, chief technology officer at Circadiant. "As the telecom industry moves to higher and higher speeds, [service providers] are asking the designers, developers, and vendors to guarantee their equipment under really bad conditions."
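The pseudo-random data mentioned above is conventionally a PRBS pattern generated by a linear-feedback shift register. As a sketch of the general technique (not any particular instrument's pattern generator), here is a PRBS-7 generator using the standard polynomial x^7 + x^6 + 1; the seed value is an arbitrary nonzero choice:

```python
def prbs7(n_bits, seed=0x7F):
    """Generate a PRBS-7 bit sequence via a 7-bit LFSR with taps at
    bits 7 and 6 (polynomial x^7 + x^6 + 1). Any nonzero seed yields
    a maximal-length sequence with period 2**7 - 1 = 127 bits."""
    state = seed & 0x7F
    out = []
    for _ in range(n_bits):
        new_bit = ((state >> 6) ^ (state >> 5)) & 1  # XOR of taps 7 and 6
        state = ((state << 1) | new_bit) & 0x7F      # shift in feedback bit
        out.append(new_bit)
    return out

bits = prbs7(254)
# The sequence repeats exactly every 127 bits.
print(bits[:127] == bits[127:254])  # True
```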
According to EXFO's Ostiguy, the emergence of Gigabit Ethernet (GbE) is driving a new trend in system-level BER testing. The RBOCs and other service providers have begun installing GbE links over DWDM, but the existing GbE test equipment is almost exclusively geared to the testing of routers or Ethernet switches. Because there isn't any equipment designed for GbE BER testing, operators are using equipment designed to test routing and switching. "They have no measurement of BER, which is what they need to certify the physical layer or the ability for the medium to carry the information transparently," explains Ostiguy. While this trend is relatively recent, he claims, it has been generating a lot of interest among end users.
Also generating interest is the move toward 40-Gbit/sec devices and systems. A year and a half ago, it was very popular to be working at 40 Gbits, admits Agilent's Vyas, and while the consensus is that volume deployment of such devices has been pushed out until 2003 or 2004, development work still continues. "We have a 40-Gbit BERT that has been very popular, especially for component manufacturers who are trying to get their component costs down and get their performance up to adequate standards," Vyas reports.
In long-haul transmission, where there is always significant signal impairment due to the fiber media and the long distances that are traveled, there will always be a need for measuring BER, claims Mike Botham, hardware engineering manager at Tektronix (Beaverton, OR). "Measuring the BER," notes Botham, "is still really the only summary measurement that you can make that gives you a good indication of what the quality of the transmission is over a long-haul span."
Waschura agrees, adding that the future of the BERT segment will most certainly hold continued innovation. "There are definitely more tricks up our sleeves," he promises.