The Secret of SiC’s Sudden Success

  • Jul 11, 2018
  • UnitedSiC

By Anup Bhalla, VP Engineering at UnitedSiC

Silicon carbide has offered plenty of promise as a power-semiconductor wonder material, but has only recently taken off commercially. What has happened to kickstart SiC’s success, and what are its prospects for the future?

Although silicon carbide (SiC) has existed since the dawn of time (it occurs naturally, as a product of cosmic phenomena like supernovae), it was first synthesized in 1891: the unintentional product of American inventor Edward G. Acheson’s attempt to produce artificial diamonds by heating aluminum silicate and carbon. Discovering that the shiny hexagonal crystals were almost as hard as diamond, and could be manufactured as chips or powder in industrial quantities, Acheson had set the scene for SiC’s first career as an incredibly effective abrasive.

To today’s electronics engineers, of course, SiC has an entirely different role; it could hold the key to the future of sustainable energy. SiC power semiconductors can boost energy-conversion efficiency, withstand higher voltages and currents in relation to their size, and survive higher operating temperatures compared with conventional silicon-based devices, all of which offers important benefits for equipment like data center power supplies, wind or solar power-conditioning modules and electric vehicle traction inverters.

Timely, but why now? Researchers have been investigating the electrical properties of SiC since the early 20th century, when experiments on crystals as electrical rectifiers demonstrated its light-emitting properties. Those properties were modest, however, and SiC was soon eclipsed by other compounds like gallium arsenide and, later, gallium nitride. With 10-100 times better light output, these materials became the basis of practical LEDs, while SiC remained in the lab – a synthetic semiconductor material looking for an application.

All this was 40 years before the invention of the transistor, but SiC’s properties offered tantalizing theoretical benefits as semiconductor technologies advanced. Its thermal conductivity is about 3.5 times better than silicon’s. It can be heavily doped for high electrical conductivity, yet can also sustain a high electric field without breakdown. It operates at high temperatures, is mechanically very stable, and has a low coefficient of thermal expansion. So how did silicon get the gig at the dawn of the transistor revolution?
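The comparison above can be made concrete with rough numbers. The figures below are approximate room-temperature textbook values for silicon versus 4H-SiC (the polytype used in power devices); they are illustrative assumptions, not values from this article, and exact numbers vary by source.

```python
# Approximate textbook values for Si vs. 4H-SiC (illustrative; sources vary).
# Format: property -> (silicon value, 4H-SiC value, unit)
properties = {
    "bandgap": (1.12, 3.26, "eV"),
    "critical field": (0.3, 2.5, "MV/cm"),
    "thermal conductivity": (1.5, 4.9, "W/(cm*K)"),
}

for name, (si, sic, unit) in properties.items():
    # The ratio shows roughly how much headroom SiC offers over silicon
    print(f"{name}: Si {si} {unit} vs. SiC {sic} {unit} (~{sic / si:.1f}x)")
```

The order-of-magnitude higher critical field is what lets a SiC device block the same voltage with a much thinner, more heavily doped drift layer than silicon.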

Economics is the short answer. Historically, SiC’s Achilles’ heel has been manufacturability: producing high-quality SiC crystals in volume was difficult. A large variety of defects, including edge dislocations, various types of screw dislocations, triangular defects and basal plane dislocations, could occur in high numbers across even a small wafer. The effects included poor reverse-blocking performance in transistors and diodes, which could render a device inoperable. There were also problems interfacing SiC with silicon dioxide (SiO2), which is needed to fabricate MOSFETs and IGBTs.

Silicon Running Out of Road

While these challenges effectively prevented chip makers from realizing the full performance, power-density and reliability gains possible with SiC, silicon semiconductors have proved easier to manufacture at commercial yield levels and so have dominated the power electronics scene. Now, however, silicon technology is running out of road: unable to deliver the ongoing improvements demanded in sectors like data center power, automotive, and renewable energy.

Fortunately, researchers’ efforts to overcome the traditional barriers to commercializing SiC are now paying off. SiC wafer purity has improved, which has increased yields and enabled manufacturers to move from 4-inch to 6-inch wafers; the larger wafers, which entered production around 2012, are reckoned to enable a 20-50% reduction in device costs. Also, the development of processes like nitridation – annealing in nitric oxide or nitrous oxide – has made it possible to grow high-quality silicon dioxide films on SiC, which is needed to produce high-performing power MOSFETs and IGBTs.
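The wafer-size arithmetic behind part of that cost reduction can be sketched. This is a minimal illustration assuming a hypothetical 5 mm × 5 mm power die; it ignores edge exclusion and defect yield, which is why the real-world savings also depend on the purity and yield gains.

```python
import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    """Rough upper bound on dies per wafer: total wafer area / die area."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

DIE_AREA = 25.0  # hypothetical 5 mm x 5 mm power die

dies_4in = gross_dies(100, DIE_AREA)  # 4-inch wafer, ~100 mm diameter
dies_6in = gross_dies(150, DIE_AREA)  # 6-inch wafer, ~150 mm diameter

# Area alone gives (150/100)^2 = 2.25x more candidate dies per wafer
print(dies_4in, dies_6in, round(dies_6in / dies_4in, 2))
```

Since many wafer-processing costs are fixed per wafer, spreading them over 2.25× as many dies is a large lever on per-device cost even before yield improves.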

While silicon-based devices are more or less at the limit of their performance potential – although inventive designers continue to squeeze out the last drops – SiC technology is advancing more quickly. Optimized device architectures and dimensions, improved parasitic-diode behavior and new package structures are delivering large gains that extend SiC’s inherent advantages over silicon, positioning it as the high-performance, high-efficiency, ultra-rugged successor in today’s most demanding applications. Higher-voltage devices have also been introduced, such as UnitedSiC’s 1200V SiC JFETs.

As progress continues, presenting an increasingly compelling case to power-systems designers, some barriers remain. Many projects don’t have the luxury of starting from a clean sheet, and a SiC MOSFET is not a simple drop-in upgrade for an incumbent silicon device: it demands re-optimization of the circuit, revised gate-drive voltages, and higher switching frequencies to maximize the performance gains.

UnitedSiC’s cascodes can provide the bridge from silicon past to SiC future that some systems vendors need. Co-packaging a low-voltage silicon MOSFET with a SiC JFET that does the heavy lifting creates a usable drop-in upgrade, minimizing the commitment required to access SiC’s efficiency, ruggedness and power-density advantages. Companies like Sweden’s Micropower Group, a manufacturer of industrial battery back-up systems, have successfully replaced obsolete silicon MOSFETs with UnitedSiC’s SiC cascodes, instantly realizing a 10% increase in light-load efficiency and a 1% increase at typical loads.
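Those efficiency gains are larger than they might sound, because each point of efficiency removes a sizeable fraction of the power dissipated as heat. The baseline efficiency below is an assumed illustrative figure, not one reported by Micropower Group.

```python
def loss_reduction(eff_before, gain):
    """Fractional cut in dissipated power when efficiency rises by `gain`."""
    loss_before = 1.0 - eff_before
    loss_after = 1.0 - (eff_before + gain)
    return (loss_before - loss_after) / loss_before

# Assumed 95%-efficient baseline: a 1-point gain cuts losses (heat) by 20%
print(f"{loss_reduction(0.95, 0.01):.0%}")
```

Less heat, in turn, means smaller heatsinks and higher power density, which is where SiC’s system-level value shows up.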

After more than 100 years, and a career change, SiC’s time as a valuable power-semiconductor technology is now.