The key to reducing the cost of production test is the selection of flexible, integrated, throughput-optimized optical test equipment.
CHRIS LOBERG and AL FOX, Tektronix Inc.
In a market that demands reductions in equipment prices, vendors have placed the price tag of test under intense scrutiny. Industry experts advise that testing can account for one-third of an optical product's total cost of production. Clearly, that implies a requirement to trim costs without a corresponding reduction in test coverage or product quality.
Undoubtedly, there are hundreds of ways to reduce the cost of test; however, any decision about test methods or test equipment must consider not just cost but throughput and quality. There is a complex interplay among these three factors: Too much emphasis on throughput reduces quality, and blind fixation on cost compromises both throughput and quality. Only a conscious balance of cost, throughput, and quality produces a competitive advantage.
The pragmatic industry veteran knows that the actual capital cost of test equipment is only a fraction of the total cost of production test. Experienced test engineers try to forecast the cost of ownership rather than capital cost. Equally important, they look for "hidden" efficiencies that increase throughput, reduce rejects, and maximize the productivity of those who program and operate tools.
Reducing the cost of test is a focal point for equipment vendors attempting to satisfy their clients' desire for network elements that cost less and deliver more. There are test equipment attributes that predictably reduce the cost of producing optical-network equipment.
Test equipment cost of ownership is the foundation of a profitable manufacturing business. There are three key questions surrounding the candidate test instrument (assuming the basic measurement performance requirements are met):
- Does it have a future? A test instrument's longevity depends on its ability to adapt and reconfigure to meet changing needs. A fixed configuration that is ideal today cannot realistically keep pace with increasing performance requirements of tomorrow's optical equipment. Moreover, entirely new protocols and standards are sure to emerge that will deliver unforeseen compliance requirements. The cost of completely retooling the production test department every few years is clearly something that manufacturers prefer to avoid, if possible. A favored solution is the modular instrument architecture. This approach lends itself to upgrading and updating as the needs of the target device evolve. Should a new requirement emerge, updating the system is as simple as adding a module.
- Is it reliable? Test equipment reliability is measured in mean time between failures (MTBF). An equally important factor is mean time to repair (MTTR). That is where the modular architecture shines: the modular tool is easier and faster to maintain, reducing MTTR and therefore downtime. Another facet of reliability is repeatability. In fact, many test managers claim they would rather see a hard failure than measurements that vary from time to time or results that can't be duplicated from test stand to test stand. The test instrument's accuracy, resolution, thermal stability, and even its mechanical integrity affect its ability to produce repeatable results. Valuable technician time is spent on test equipment calibration exercises, for example. Equipment that provides automatic calibration and easy performance verification will increase repeatability and reduce costs in a manufacturing operation.
- Is it easy to use? In today's production test department, "ease of use" is almost synonymous with intuitive programmability. A substantial cost of production is the engineering expense of programming tools. A consistent user interface across instrument types saves time and builds a reservoir of expertise among users. If the test platform is modular, this knowledge base, like the instrument itself, can be updated incrementally as new requirements emerge. Software scripting tools further enhance ease of use, providing a consistent look and feel even in highly specialized applications. The use of industry-standard operating systems and user interfaces is another test instrumentation trend. Test tools that implement software in a familiar, PC-like programming environment tend to reduce the learning curve and provide simple connectivity to standard peripheral drivers and applications.
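The MTBF/MTTR point above can be made concrete with the standard steady-state availability formula, MTBF / (MTBF + MTTR). The figures below are purely illustrative (not drawn from any specific instrument), but they show why cutting repair time via module swaps matters as much as raw reliability:

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability: the fraction of time a test stand is usable."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Illustrative figures: same MTBF, very different repair times.
monolithic = availability(5000, 48)  # repair means shipping the unit to the vendor
modular = availability(5000, 4)      # repair means swapping in a spare module

print(f"monolithic: {monolithic:.4f}, modular: {modular:.4f}")
```

With these assumed numbers, the modular stand loses roughly one-twelfth as much production time to repairs, even though neither instrument fails more often than the other.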
In a manufacturing environment, productivity is often quantified as a throughput figure. The term "throughput" usually denotes the number of salable devices produced. The connection between productivity (throughput) and manufacturing cost is therefore evident. Once a test instrument's features and performance are established, its throughput must be evaluated.
A critical element of throughput is test time: the amount of time required to run the required tests on the device under test (DUT). A measurement on an instrument such as a sampling oscilloscope may require thousands of acquisitions. The acquisition process that occurs within the oscilloscope therefore has a profound impact on throughput. Some instruments offer enhanced measurement speed for production test applications (note that this rate has nothing to do with the instrument's bandwidth, which is presumed to be sufficient for the DUT data rates).

Figure 1 compares the performance of two real-world sampling oscilloscopes when testing a complex 10-Gbit/sec laser module on the production line. Both instruments run a test consisting of four measurements, including autoset, 1,000 acquisitions per measurement, results calculation, and display. Instrument #1 demonstrates a dramatic advantage in test execution time: its total throughput is more than 11 units per hour, while the throughput of instrument #2 is less than eight units per hour. Even when all ancillary steps are added, the raw throughput of sampling oscilloscope #1 is more than 50% higher than that of its competitor. While these test-time results pertain to sampling oscilloscopes, other instrumentation platforms are also making strides toward higher throughput. Technologies such as swept tunable lasers and fast wavelength-calibration techniques are already being used to reduce setup and measurement time.
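The relationship between per-unit test time and units-per-hour throughput is simple to model. The test times below are hypothetical values chosen to mirror the roughly 11 versus 8 units-per-hour figures quoted above; they are not measured data for any particular oscilloscope:

```python
def units_per_hour(test_time_s, handling_time_s=0.0):
    """Throughput of one test stand, in salable units per hour."""
    return 3600.0 / (test_time_s + handling_time_s)

# Hypothetical per-unit test times, chosen only to match the article's figures.
fast = units_per_hour(310.0)  # ~11.6 units/hour
slow = units_per_hour(470.0)  # ~7.7 units/hour

print(f"instrument #1 runs at {fast / slow:.2f}x the rate of instrument #2")
```

The same function shows why ancillary steps matter: adding a fixed handling time to both instruments shrinks the absolute gap but leaves the faster instrument ahead on every shift.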
An instrument's test execution time is not the only measure of its throughput or productivity. Once test time is under control, "ancillary steps" account for a large portion of the cost of test. Particularly in the optical test arena, setting up for test becomes critical. Fiber is delicate, and each connection must be made with care. DUT handling comes at a high price in time and associated labor. Every connection cycle increases time spent and the risk of fiber or connector damage.
Consider the test requirements for an arrayed waveguide grating (AWG), a multichannel optical component. The AWG requires a battery of precise tests, including insertion loss, center wavelength, ripple, and passband. Normally, these functions require a matching arsenal of dedicated measurement tools and several insertion/removal cycles to complete the test across all instrument sets.
An emerging alternative is a modular solution that integrates diverse measurements into one mainframe with one connection to the DUT. The simplification and unity of this test setup eliminates costly labor and improves yields by reducing potential DUT or fiber-handling errors.
Another time-saving test technique is known as parallel testing. Once the province of memory test systems capable of testing multiple DUTs in one pass, parallel testing has arrived at the optical production test line, where multiple channels are a challenge. The modular integrated test system's expandability makes this trend practical. If 16-channel AWG manufacturing is common today, 40- to 80-channel testing will likely be common within a few years. Adding test channels to the same platform preserves the original investment in training, programming, and test process integration and maximizes throughput.
DWDM technology has multiplied the number of data channels that must be tested on a DUT. Testing these channels one at a time, as was necessary in the past, linearly multiplies the test time (or, reciprocally, divides the throughput). A new generation of configurable modular instruments provides channel expansion capability to exercise many DWDM ports at once, all synchronized and controlled within a single integrated system.
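The serial-versus-parallel arithmetic behind the last two paragraphs can be sketched directly. The channel counts and per-channel time below are hypothetical, used only to show how parallel lanes divide total test time:

```python
import math

def total_test_time(channels, per_channel_s, parallel_lanes=1):
    """Time to test every channel of a DUT, given how many run concurrently."""
    passes = math.ceil(channels / parallel_lanes)
    return passes * per_channel_s

# Hypothetical 40-channel DWDM device, 30 s of measurement per channel.
serial = total_test_time(40, 30.0)                      # one channel at a time
parallel = total_test_time(40, 30.0, parallel_lanes=8)  # 8 synchronized lanes

print(f"serial: {serial:.0f} s, parallel: {parallel:.0f} s")
```

Because the lane count is a parameter, the same model shows the appeal of an expandable platform: moving from 16- to 80-channel devices changes the inputs, not the test program.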
A notorious throughput bottleneck in optical production is bit-error-rate (BER) testing. The BER test sends and receives huge volumes of data to simulate network operating conditions and characterize BERs over time. In this case, the test methodology and the capabilities of the measurement instrument work together to reduce test time and improve measurement confidence.
The usual method of BER testing is to send a clean pseudo-random bit sequence test signal to the DUT and measure the resulting BER, an approach that can produce misleadingly error-free results. Recent developments have shown that stressing an optical DUT provides a more meaningful result in less time.
This new model involves adding stress to the signal at the digital receiver end of the circuit to determine how much stress must be applied to reach a certain BER value. A signal that requires higher stress to create, for example, a BER of 10^-6 is considered more robust than a signal producing a 10^-6 BER at lower stress levels.

Some test equipment provides a means of changing the decision threshold at the receiver to simulate the needed stress. Using such a tool, it is possible to test to error rates of 10^-8 and extrapolate to the 10^-14 level. At 2.4 Gbits/sec, this method reduces the time needed to verify 10^-14 BER with confidence by a factor of 100 or more. Significantly, this method of testing is very repeatable (see Figure 2).
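A back-of-the-envelope calculation shows why direct verification at 10^-14 is impractical. Using the standard zero-error rule (to claim "BER ≤ p" at confidence C, observe about -ln(1-C)/p error-free bits), the sketch below compares measuring at 10^-14 directly against measuring at 10^-8. This is a simplification of the stressed-receiver method, which takes several measurements at different thresholds before extrapolating, so its real-world saving is smaller than the raw ratio here, consistent with the factor of 100 cited above:

```python
import math

LINE_RATE = 2.4e9  # bits/sec, the rate cited in the text

def seconds_to_verify(ber, confidence=0.95, rate=LINE_RATE):
    """Time to observe enough error-free bits to assert 'BER <= ber'
    at the given confidence (zero-error binomial test)."""
    bits = math.log(1.0 - confidence) / math.log1p(-ber)  # ~= -ln(1-C)/ber
    return bits / rate

direct = seconds_to_verify(1e-14)   # tens of hours of error-free traffic
stressed = seconds_to_verify(1e-8)  # a fraction of a second per stress point

print(f"direct: {direct / 3600:.1f} h, one stressed point: {stressed:.2f} s")
```

Even allowing many stressed measurement points for a clean extrapolation, the stressed method replaces more than a day of traffic per DUT with seconds.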
In today's cost-conscious optical-networking market, there is considerable pressure on equipment vendors to reduce the price of their products, even as technical demands grow more stringent. Product price pressures result in a corresponding emphasis on the reduction of production test costs, since a substantial portion of the overall manufacturing cost of optical-networking equipment involves test.
The use of a modular optical test platform across many production test requirements allows the manufacturer to distribute the cost of capital (i.e., depreciation expense) over many different products, resulting in the smallest test-capital markup per DUT. In doing so, the manufacturer retains the flexibility to invest at the appropriate level for the optimal test equipment solution set, which may not necessarily be the least expensive.
The key to reducing the cost of production test is choosing appropriate measurement strategies and solutions. Flexible, integrated, throughput-optimized optical test systems are emerging to replace the instrument clusters of years past. And high-performance instruments optimized for manufacturing throughput are delivering both measurement confidence and productivity.
These instruments do more than just apply a series of tests quickly. They address the whole of manufacturing test, reducing setup, acquisition, and test-programming time.
During the current downturn, optical equipment manufacturers will be successful in delivering lower-cost systems by paying close attention to their testing strategy. They will achieve success by automating processes, increasing reliability, and planning for maximum optimization of their test equipment capital.
Chris Loberg and Al Fox are product-line marketing managers for the Optical Business Unit at Tektronix Inc. (Beaverton, OR).