New approaches improve testing of laser diodes

Oct. 1, 2003

There are several sources of inaccuracy in testing high-power laser diodes used in telecommunications applications. These sources include problems with coupling high-current pulses to the device under test (DUT), optical-detector coupling, and both slow response and inaccuracy in the detector itself. Addressing each of these problems can result in shorter test times, more accurate results, and lower reject rates.

The fundamental test of a laser diode is a light-current-voltage (LIV) curve, which simultaneously measures the electrical and optical output power characteristics of the device. This test can be used at any stage of the process but is first used to sort laser diodes or weed out bad devices before they become part of an assembly.

The DUT is subjected to a current sweep, while the forward voltage drop is recorded for each step in the sweep. Simultaneously, instrumentation is used to monitor the optical power output. For several reasons, this test is best done in a pulsed fashion early in production, before the laser diode is assembled into a module. For diodes still on the wafer (VCSELs, for example) or in a bar (edge-emitting lasers), pulse testing is essential because at that point the devices have no temperature-control circuitry. Testing with DC would, at the very least, change their characteristics; at worst, it would destroy them. Later on in production, when they've been assembled into modules with temperature controls, the devices can be DC-tested and the results compared to the pulse test. In addition, some devices will pass a DC test and fail a pulsed test.

The LIV test data is analyzed to determine laser characteristics, including lasing threshold current, quantum efficiency, and the presence of "kinks" (nonlinearities) in the output (see Figure 1).
Figure 1. Light-current-voltage test data is analyzed to determine laser characteristics, including lasing threshold current, quantum efficiency, and the presence of "kinks" (nonlinearities) in the output.
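
As a rough illustration of how LIV sweep data can be reduced to these figures of merit, the sketch below (Python, using made-up sample data) estimates the threshold current and slope efficiency from a straight-line fit above threshold and applies a crude check for kinks; a production algorithm would be considerably more refined.

```python
import numpy as np

# Hypothetical LIV sweep: drive current (A) and detected optical power (W).
current = np.linspace(0.0, 0.5, 51)               # 0-500 mA sweep
threshold_true, slope_true = 0.15, 0.8            # made-up device parameters
power = np.where(current > threshold_true,
                 slope_true * (current - threshold_true), 0.0)
power += np.random.normal(0.0, 1e-4, power.size)  # a little measurement noise

# Estimate slope efficiency (W/A) from a linear fit well above threshold,
# then extrapolate back to zero power to estimate the threshold current.
above = current > 0.7 * current.max()
slope, intercept = np.polyfit(current[above], power[above], 1)
i_threshold = -intercept / slope

# A crude "kink" check: large deviations from the straight-line fit
# above threshold flag nonlinearities in the L-I curve.
residual = power[above] - (slope * current[above] + intercept)
has_kink = np.max(np.abs(residual)) > 0.05 * power.max()

print(f"threshold ~ {i_threshold*1e3:.1f} mA, "
      f"slope efficiency ~ {slope:.2f} W/A, kink detected: {has_kink}")
```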

Testing a laser diode properly requires a current pulse of the right shape. It should reach full current fairly quickly, then stay flat long enough to ensure the result accurately represents the laser diode's true output. For early-stage testing, it's common to use pulse widths of 500 nsec to 1 µsec, with a duty cycle of about 0.1%. Currents can range from a few tens of milliamps to 5 A. That can put great demands on the system, especially with respect to impedance matching.
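
To see why such a low duty cycle protects the device, consider a back-of-the-envelope calculation like the sketch below (Python; the 2-V forward drop and the specific pulse parameters are assumed, illustrative values within the ranges just mentioned):

```python
pulse_width = 1e-6      # s, 1-µsec pulse
duty_cycle = 0.001      # 0.1% duty cycle
period = pulse_width / duty_cycle       # 1 ms between pulses -> 1-kHz repetition rate

peak_current = 5.0      # A, top of the sweep
forward_voltage = 2.0   # V, assumed forward drop at peak current
peak_power = peak_current * forward_voltage     # 10 W dissipated during the pulse
average_power = peak_power * duty_cycle         # only about 10 mW averaged over time

print(f"repetition rate {1/period:.0f} Hz, "
      f"peak {peak_power:.0f} W, average {average_power*1e3:.0f} mW")
```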

Given that it's necessary to deliver a high-speed current pulse to the laser diode and avoid problems with reflections, intuition would suggest using a transmission-line structure—probably a piece of coaxial cable. But the most familiar type of coax has 50-Ω impedance, and the diode's impedance is closer to 2 Ω, a terrible mismatch. Although it would be possible to put a 48-Ω resistor in series, that creates its own problems; pushing 5 A through a 50-Ω system takes a generator output of 250 V, which can be dangerous to people and equipment. In addition, the laser's dynamic resistance decreases with current, so the characteristics of the test setup change over the course of the test.
Figure 2. One way to connect a laser diode for testing is to make it electrically part of the center conductor of a 10-Ω line and feed it with a pulsed current source.

The use of low-impedance coax is one possible solution, but that has problems related to changing dynamic resistance in the laser diode. There's another alternative: Make the laser diode electrically part of the center conductor of a 10-Ω line (see Figure 2) and feed the whole thing with a pulsed current source. That way, it only takes <10 V to push 5 A through the diode. And because the system has a current source, we avoid problems with changing dynamic resistance in the laser diode.
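
The arithmetic behind the two approaches is simple enough to sketch (Python, using the 2-Ω diode resistance and 48-Ω matching resistor discussed above):

```python
i_peak = 5.0       # A, maximum test current
r_diode = 2.0      # ohms, approximate dynamic resistance of the laser diode
r_series = 48.0    # ohms, matching resistor needed to make a 50-ohm load

v_matched = i_peak * (r_diode + r_series)   # what a matched 50-ohm system must deliver
v_current_source = i_peak * r_diode         # compliance needed by a current source
                                            # driving the diode in a low-impedance line
print(f"50-ohm matched system: {v_matched:.0f} V")               # 250 V
print(f"pulsed current source: about {v_current_source:.0f} V")  # under 10 V
```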

Even the most careful impedance matching is not perfect, however, so it's good practice to keep all transmission lines short to minimize the effect of reflections and ringing. Another reason to keep the connections short is to minimize the area (and, hence, the inductance) of the loop formed at the connection to the laser diode. Remember that the voltage across an inductor is given by:

V = L(di/dt)

Reaching the desired slew rate of 5 A in 50 nsec means the source must develop roughly 0.1 V for every nanohenry of loop inductance. Unless the loop area is kept very small, the required voltage can easily exceed the compliance voltage of the instrument, dramatically extending the settling time.
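
A quick check of that arithmetic (Python; the 50-nH loop inductance is an assumed, illustrative value):

```python
di = 5.0                  # A, full-scale current step
dt = 50e-9                # s, desired rise time
loop_inductance = 50e-9   # H, assumed 50 nH for a generously sized connection loop

v_inductive = loop_inductance * di / dt    # V = L * (di/dt)
print(f"di/dt = {di/dt:.1e} A/s -> {v_inductive:.1f} V across the loop")
# 5 V of inductive drop is already a large share of a roughly 10-V compliance budget,
# and a loop of a few hundred nanohenries would exceed it outright.
```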

Measuring voltage and current in the laser diode fed with high-speed pulses isn't easy. Even putting a scope probe on the diode to measure the voltage can cause problems. For one thing, where should the ground be connected? It may be necessary to float the scope or run it off batteries. The probe must be good to 1 GHz, but it's important to remember that the probe, regardless of its impedance, has a certain physical size, so it can act as an unshielded, unterminated transmission-line stub with electrical characteristics that vary wildly with frequency.

Measuring current is a little more straightforward. A low-value resistor (one with a value much lower than the resistance of the laser diode) connected in series will work, but it must have low capacitance and inductance. A wire-wound resistor, which is basically a lossy inductor, is no good for high frequencies.
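
To see why parasitics matter so much in the sense resistor, compare the resistive signal with the inductive transient at the pulse edge (Python; the 0.1-Ω shunt value and 5-nH parasitic inductance are assumed for illustration):

```python
r_shunt = 0.1             # ohms, much less than the ~2-ohm diode resistance
l_parasitic = 5e-9        # H, assumed 5 nH of parasitic inductance in the shunt
di_dt = 5.0 / 50e-9       # A/s, the pulse edge from the earlier example

v_resistive = 5.0 * r_shunt            # 0.5 V: the signal we want at full current
v_inductive = l_parasitic * di_dt      # 0.5 V: error transient during the edge
print(f"signal {v_resistive:.2f} V, inductive transient {v_inductive:.2f} V")
# Even a few nanohenries produce an edge transient comparable to the signal itself,
# which is why a wire-wound resistor is unusable here.
```
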
Figure 3. The choice of detector depends largely on the wavelength of light involved. At wavelengths <~800 nm, silicon is the only choice. Indium gallium arsenide would appear to be the best option for 1300–1700 nm, but this material has a problem with pulse response.
There are three common detector materials: silicon, germanium, and indium gallium arsenide (InGaAs). Each has its advantages and disadvantages. As Figure 3 shows, the choice of detector depends largely on the wavelength of light involved. At wavelengths <~800 nm, silicon is the only choice. But much telecom work is done between 1300 and 1700 nm, where it would appear that InGaAs would be best, because its response is fairly uniform and it holds up well to about 1700 nm. However, InGaAs has a problem with pulse response. It's desirable to test with a pulse short enough to avoid overheating the laser diode, yet long enough for it to reach some sort of steady state before the pulse is over.
Figure 4. In this pulse trace of a 1-mm indium gallium arsenide detector with 35-MHz cutoff frequency, the detector has a fast rise time, but its output never settles during the duration of the pulse.

As Figure 4 indicates, an InGaAs detector does not "settle," even within a 10-µsec pulse. If the pulse width were decreased to 1 µsec, the problem would be even worse. Germanium does not suffer from this effect, so it is preferable for short pulses.
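
One common way to handle detector response of this kind is to digitize the whole pulse and compute the reported power only from the flattest part near the end, as the sketch below illustrates (Python, with a made-up waveform that settles with a 100-nsec time constant). With a detector like the InGaAs device in Figure 4, which never settles, even this tail average remains biased, which is why the choice of detector material matters.

```python
import numpy as np

# Hypothetical digitized detector output over a 1-µsec pulse, one sample per nanosecond.
t = np.arange(0.0, 1e-6, 1e-9)
settled_level = 1.0
waveform = settled_level * (1.0 - np.exp(-t / 100e-9))   # assumed 100-nsec settling

# Report the average of the last 20% of the pulse, where the response is flattest.
tail = waveform[int(0.8 * waveform.size):]
print(f"reported level {tail.mean():.4f} (true settled level {settled_level:.4f})")
```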

There are several ways to couple the output from the laser diode to the detector. One way is simply to put the laser right against the detector, but this method has several drawbacks, among them that not all the light may reach the detector. For example, if the laser's output beam is elliptical, or the beam diameter is larger than the detector's active area, or the beam is not exactly centered on the detector, then an unknown fraction of the light will be missed. In addition, some detectors are polarization-sensitive, which can cause further inaccuracy. On top of that, today's high-power laser diodes produce enough output to saturate many detectors.
Figure 5. An integrating sphere solves the problem of coupling instrumentation to the output of the laser diode.

Often, an integrating sphere is the best solution: a hollow ball coated on the inside with a diffuse reflecting material and equipped with a mounting for a detector and a port for feeding in the light to be measured (see Figure 5). That is far from a new invention—Philadelphia Electric Co. used large integrating spheres mounted on trucks to test carbon-arc streetlights around 1915. The integrating sphere accepts all the light from the source, randomizes its polarization, and distributes it evenly over its inside surface. A detector mounted through the side of the sphere then "sees" a measurable and repeatable fraction (about 1%) of the light fed into the sphere. There's plenty of light to measure but not enough to overpower the detector.
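
In measurement terms, the sphere simply applies a fixed, calibrated attenuation, so recovering the laser's output is a matter of scaling (Python sketch; the 1% coupling fraction comes from the discussion above, while the detector responsivity and photocurrent are assumed, illustrative values):

```python
coupling_fraction = 0.01     # from calibration: detector sees ~1% of the sphere's input
responsivity = 0.9           # A/W, assumed detector responsivity at this wavelength
i_detector = 4.5e-3          # A, hypothetical measured photocurrent

p_detector = i_detector / responsivity      # optical power reaching the detector
p_laser = p_detector / coupling_fraction    # total power fed into the sphere
print(f"laser output ~ {p_laser:.2f} W")    # 0.5 W in this example
```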

There was a time when demand for fiber-optic telecom equipment exceeded supply and manufacturing efficiencies were a secondary consideration. Today, testing, like everything else, must be fast, accurate, and inexpensive. That means an optical power meter is a bad choice. This instrument integrates light output over time—and with a low-duty-cycle input, that can be a long time indeed. In addition, the accuracy of the measurement depends on how accurately the duty cycle of the pulses is known and how closely the duty cycle of the light output matches that of the electrical input.
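
The sensitivity to duty cycle is easy to quantify: an averaging power meter effectively reports peak power scaled by the duty cycle, so any error in the assumed duty cycle shows up directly in the result (Python sketch with illustrative numbers):

```python
p_peak = 0.5              # W, true peak optical power during the pulse
duty_true = 0.0010        # actual optical duty cycle
duty_assumed = 0.0011     # duty cycle assumed when un-averaging the reading (10% off)

p_average = p_peak * duty_true            # what the power meter integrates to
p_reported = p_average / duty_assumed     # peak power inferred from the reading
error = (p_reported - p_peak) / p_peak
print(f"reported {p_reported*1e3:.0f} mW vs true {p_peak*1e3:.0f} mW ({error:+.0%})")
```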

To get around this problem, it has been standard practice to use a rack full of equipment, including a pulse source, optical measurement components (photodiode detectors, etc.), a pair of high-speed current-to-voltage (transimpedance-amplifier) converters, and a high-speed multichannel digital sampling oscilloscope (DSO). The pulse source produces a pulse, and the other instruments measure the electrical and optical response, feeding the results via GPIB to a PC.

This process may take a few thousand pulses. Sometimes, there will be a few hundred pulses at each current level, with a boxcar averager integrating them. That would seem to improve sensitivity, resolution, and accuracy, but it can cover up problems with waveform distortion. It is also a lengthy process, taking anywhere from tens of seconds to several minutes per DUT. The system can do perhaps 2,500 parts per day and costs up to $150,000 per test stand.

Newer alternatives integrate all the instrumentation functionality into a single instrument. This type of instrument is essentially a pulsed source-measure unit with an output impedance and cabling that closely match the impedance of the laser diode. The measurement portion of the system incorporates multichannel data acquisition, dedicated timing circuitry, high-speed current-to-voltage converters, and a digital signal processor (DSP) that emulates DSO functionality and controls much of the measurement sequence.

With this instrument, the sequencing of the LIV sweep is orchestrated by an internal DSP programmed only once via the GPIB bus for a given test sequence. Once programmed, the DSP can execute complete pulsed LIV sweeps without interaction with other equipment or the control computer. In fact, the instrument provides control signals directly to the component handling system via a digital I/O port.

Because the DSP is an integral part of the digitizing channels, captured pulse measurements can be analyzed quickly, without the time-consuming analysis sequence described previously. That reduces pulsed LIV test time to only a few seconds and greatly reduces software complexity.

With individual test times of only a few seconds, up to 15,000 devices per day can be tested, even assuming only 85% system utilization due to work-in-progress flow irregularities, maintenance, and the like. Such systems cost a fraction of the price of older ones and have higher throughput, so it's possible to cost-justify purchasing additional makeup capacity to reduce production-planning uncertainties.
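
The throughput figure follows from straightforward arithmetic (Python; the 5-sec per-device test time is an assumed value consistent with "a few seconds"):

```python
test_time = 5.0            # s per device, assumed pulsed LIV test time
utilization = 0.85         # fraction of the day the station is actually testing
seconds_per_day = 24 * 3600

devices_per_day = utilization * seconds_per_day / test_time
print(f"~{devices_per_day:,.0f} devices per day")   # about 14,700 -- in line with 15,000
```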

One approach to designing this type of system is to include both pulsed and non-pulsed operating modes. This dual functionality allows both types of LIV sweeps to be performed on a single platform, using the same measurement channels. Comparing pulsed and non-pulsed test results provides more complete information on DUT performance. Also, the dual-mode source can be located in a remote test head, which shortens the distance between the pulse source and the laser diode without the need to locate the instrument at the laser test station. The shorter cable length reduces measurement settling time and helps improve accuracy.

By incorporating all the relevant functions in one instrument, a third-generation LIV test system can greatly accelerate test throughput.

Paul Meyer is a product marketer in the Optoelectronic Component Test Group at Keithley Instruments (Cleveland). He can be reached at 440-498-2773 or [email protected].
