Connecting Theory and Practice in Optoelectronics
Ever since Shuji Nakamura claimed in his Nobel lecture in Stockholm that lasers are the future of lighting, I have been puzzled by this claim, especially by the now widely circulated statement that laser diodes are free from efficiency droop. It suggests an advantageous energy efficiency of laser diodes, as shown in the last figure 17 of his lecture (which is actually invalidated by his own reference). If you are familiar with the much-debated efficiency droop burdening GaN-LED lighting, you will agree that the underlying carrier loss mechanisms are also present in laser diodes. Even worse, laser diodes require a higher carrier density in the active layers and therefore exhibit stronger Auger recombination, and possibly also electron leakage, already at lasing threshold.
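The carrier-density argument can be illustrated with the standard ABC recombination model, where only the radiative term B·n² produces light while A·n (Shockley-Read-Hall) and C·n³ (Auger) are losses. The coefficients below are merely textbook order-of-magnitude assumptions, not measured values for any specific device:

```python
# ABC-model sketch of efficiency droop (illustrative coefficients only).
A = 1e7    # Shockley-Read-Hall coefficient (1/s), assumed
B = 1e-11  # radiative coefficient (cm^3/s), assumed
C = 1e-30  # Auger coefficient (cm^6/s), assumed

def iqe(n):
    """Internal quantum efficiency at carrier density n (cm^-3)."""
    radiative = B * n * n
    total = A * n + radiative + C * n ** 3
    return radiative / total

# IQE peaks at n = sqrt(A/C) and droops above it, where the
# Auger term C*n^3 outgrows the radiative term B*n^2.
for n in (1e17, 1e18, 1e19, 1e20):
    print(f"n = {n:.0e} cm^-3 -> IQE = {iqe(n):.2f}")
```

Since a laser diode clamps its carrier density only at threshold, and that clamped density already lies well up this curve, the Auger loss is locked in before any light amplification begins.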
Thus, in contrast to LEDs, major efficiency loss is built into laser diodes from the get-go. At high current, GaN-lasers also exhibit a substantial droop of the electrical-to-optical power conversion efficiency (PCE) due to the rising bias, even without self-heating (see picture). The maximum PCE measured on GaN-lasers is about 40%, much less than the record 84% recently reported for GaN-LEDs, which holds up to temperatures as high as 85°C. In comparison, today's most efficient laser diodes are based on GaAs and achieve up to 76% PCE at room temperature. But such record numbers seem out of reach for GaN-lasers in view of their much higher series resistance and internal absorption.
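The bias-driven PCE droop can be sketched with a minimal laser model: optical output grows linearly above threshold, but the electrical input grows faster because the voltage rises with the series resistance. All parameter values below (photon energy, threshold current, slope efficiency, turn-on voltage, series resistance) are illustrative assumptions, not measured GaN-laser data:

```python
# Why laser-diode PCE droops at high current even without self-heating:
# P_opt is linear in (I - I_th), but P_el = I * (V_0 + R_s * I) is
# quadratic in I, so their ratio peaks and then falls.
# All numbers are illustrative assumptions.
h_nu  = 2.75   # photon energy (eV) of a blue laser, assumed
I_th  = 0.03   # threshold current (A), assumed
eta_d = 0.8    # differential (slope) efficiency, assumed
V_0   = 3.2    # turn-on voltage (V), assumed
R_s   = 4.0    # series resistance (ohm), assumed

def pce(current):
    """Electrical-to-optical power conversion efficiency at current (A)."""
    if current <= I_th:
        return 0.0
    p_opt = eta_d * h_nu * (current - I_th)  # W (h_nu in eV, I in A)
    p_el = current * (V_0 + R_s * current)   # W, bias rises with current
    return p_opt / p_el

for i_ma in (50, 100, 200, 400, 800):
    print(f"I = {i_ma} mA -> PCE = {pce(i_ma / 1000):.2f}")
```

With these assumed numbers the PCE peaks near 47% at moderate current and then droops steadily as the resistive voltage drop eats into the efficiency, mirroring the bias-driven droop described above.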
Certainly, an advantage of laser-based lighting is the narrowly focused light beam, which is already utilized in the headlights of some high-end cars. The small footprint of lasers is also beneficial but it goes hand-in-hand with a high heat power density. Light power roll-off induced by self-heating is a well-known problem with laser diodes. Thus, it seems that efficiency limitations will keep haunting us, even with laser-based lighting applications.