Tuesday, January 27, 2026

Microsoft announces powerful new chip for AI inference with over 100 billion transistors

Good news! Very impressive! Let a hundred chips bloom!

"... The Maia 200, which follows the company’s Maia 100 released in 2023, has been technically outfitted to run powerful AI models at faster speeds and with more efficiency, the company has said. Maia comes equipped with over 100 billion transistors, delivering over 10 petaflops in 4-bit precision and approximately 5 petaflops of 8-bit performance — a substantial increase over its predecessor. ...

Microsoft’s new chip is also part of a growing trend of tech giants turning to self-designed chips as a way to lessen their dependence on Nvidia, whose cutting-edge GPUs have become increasingly pivotal to AI companies’ success. Google, for instance, has its TPUs, or tensor processing units — which aren’t sold as chips but as compute power made accessible through its cloud. Then there’s Amazon Trainium, the e-commerce giant’s own AI accelerator chip, which just launched its latest version, the Trainium3, in December. In each case, these chips can be used to offload some of the compute that would otherwise be assigned to Nvidia GPUs, lessening the overall hardware cost. ..."

Microsoft announces powerful new chip for AI inference | TechCrunch
