Onsemi aims to improve AI power efficiency with silicon carbide chips

FILE PHOTO: Illustration shows AI (Artificial Intelligence) letters and robot hand miniature

By Stephen Nellis

(Reuters) - Onsemi on Wednesday unveiled a lineup of chips designed to make the data centers that power artificial intelligence services more energy efficient by borrowing a technology it already sells for electric vehicles.

Onsemi is one of a handful of suppliers of chips made of silicon carbide, an alternative to standard silicon that is more pricey to manufacture but more efficient at converting power from one form to another. In recent years, silicon carbide has found wide use in electric vehicles, where swapping out the chips between the vehicle's battery and motors can give cars a boost in range.

Simon Keeton, president of the power solutions group at Onsemi, said that in a typical data center, electricity gets converted at least four times between when it enters the building and when it is ultimately used by a chip to do work. Over the course of those conversions, about 12% of the electricity is lost as heat, Keeton said.
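Keeton's figures imply a rough per-stage efficiency that the article does not state; a minimal back-of-the-envelope sketch (the per-conversion number is an inference, not an Onsemi figure):

```python
# Hypothetical sanity check of the conversion-loss figures cited above.
# Four conversions losing ~12% overall implies each stage averages
# roughly 97% efficiency -- an inference, not a number from the article.
num_conversions = 4
total_loss = 0.12  # ~12% of incoming electricity lost as heat overall

overall_efficiency = 1 - total_loss                          # 0.88
per_stage_efficiency = overall_efficiency ** (1 / num_conversions)

print(f"Overall efficiency: {overall_efficiency:.0%}")
print(f"Implied average per-stage efficiency: {per_stage_efficiency:.1%}")
```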


"The companies that are actually using these things - the Amazons and the Googles and the Microsofts - they get double penalized for these losses," Keeton said. "Number one, they're paying for the electricity that gets lost as heat. And then because it gets lost as heat, they're paying for the electricity to then cool" the data center.

Onsemi believes it can reduce those power losses by a full percentage point. While a percentage point does not sound like much, estimates of how much power AI data centers will consume are staggering, with some groups projecting up to 1,000 terawatt hours in less than two years.

One percent of that total, Keeton said, "is enough to power a million houses for a year. So that puts it into context of how to think about the power levels."
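Keeton's comparison checks out arithmetically; a minimal sketch, assuming a typical household uses on the order of 10,000 kWh per year (roughly the U.S. average, an assumption not stated in the article):

```python
# Sanity check of the "million houses" comparison above. The household
# consumption figure (~10,000 kWh/yr) is an assumed value, not from the article.
projected_ai_demand_twh = 1_000          # upper estimate cited in the article
savings_fraction = 0.01                  # the one percentage point Keeton describes

saved_kwh = projected_ai_demand_twh * savings_fraction * 1e9  # 1 TWh = 1e9 kWh
household_kwh_per_year = 10_000                               # assumed

homes_powered = saved_kwh / household_kwh_per_year
print(f"{homes_powered:,.0f} homes for a year")               # prints "1,000,000 homes for a year"
```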

(Reporting by Stephen Nellis in San Francisco; Editing by Chris Reese)