#maia-200

Tech industry
from TechCrunch
5 days ago

Microsoft won't stop buying AI chips from Nvidia, AMD, even after launching its own, Nadella says | TechCrunch

Microsoft has deployed its Maia 200 inference chips and claims performance advantages over Amazon's and Google's chips; it will keep buying chips from other vendors while using Maia for frontier models.
Artificial intelligence
from www.bloomberg.com
1 week ago

Microsoft unveils latest AI chip to reduce reliance on Nvidia

Microsoft is deploying the Maia 200 AI chip in data centers to reduce reliance on Nvidia, improve inference efficiency, and power Copilot and cloud AI services.
Artificial intelligence
from Computerworld
1 week ago

Microsoft launches its second generation AI inference chip, Maia 200

Maia 200 is a high-performance, energy-efficient inference accelerator optimized for large reasoning models, delivering superior FP4/FP8 throughput and memory compared with rival cloud accelerators.
Artificial intelligence
from The Register
1 week ago

Microsoft looks to drive down AI infra costs with Maia 200

Microsoft unveiled the Maia 200 inference accelerator: 144 billion transistors, 10 petaFLOPS FP4, 216GB HBM3e (7TB/s), and 750W power consumption.
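A back-of-the-envelope calculation from the figures reported above (10 petaFLOPS FP4, 7 TB/s HBM3e bandwidth, 750 W) gives a rough sense of the chip's efficiency and compute-to-bandwidth balance. This is illustrative arithmetic on published numbers, not a benchmark:

```python
# Illustrative arithmetic from the reported Maia 200 specs:
# 10 petaFLOPS FP4, 216 GB HBM3e at 7 TB/s, 750 W.

fp4_flops = 10e15        # 10 petaFLOPS (FP4)
hbm_bandwidth = 7e12     # 7 TB/s
power_watts = 750

# Efficiency: FP4 FLOPS delivered per watt
flops_per_watt = fp4_flops / power_watts          # ~13.3 TFLOPS/W

# Ridge point of a simple roofline model: FLOPs of work needed per
# byte of HBM traffic before the chip becomes compute-bound
ridge_point = fp4_flops / hbm_bandwidth           # ~1429 FLOPs/byte

print(f"{flops_per_watt / 1e12:.1f} TFLOPS/W")
print(f"{ridge_point:.0f} FLOPs/byte")
```

The high ridge point is typical of modern inference accelerators: workloads with low arithmetic intensity (e.g. memory-bound decode steps) are limited by the 7 TB/s of HBM bandwidth rather than peak FLOPS.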
Artificial intelligence
from Techzine Global
1 week ago

Microsoft unveils new proprietary AI chip Maia 200

Maia 200 is a high-performance AI accelerator delivering superior throughput, efficiency, and large-model support with extensive memory, networking, and Azure SDK integration.
Artificial intelligence
from ComputerWeekly.com
1 week ago

Microsoft introduces AI accelerator for US Azure customers | Computer Weekly

Azure US Central is first to receive Maia 200 inference accelerator, offering FP8/FP4 tensor cores, HBM3e memory, and improved cost and performance.
from The Verge
1 week ago

Microsoft's latest AI chip goes head-to-head with Amazon and Google

Built on TSMC's 3nm process, Microsoft says its Maia 200 AI accelerator "delivers 3 times the FP4 performance of the third generation Amazon Trainium, and FP8 performance above Google's seventh generation TPU." Each Maia 200 chip has more than 100 billion transistors, all designed to handle large-scale AI workloads. "Maia 200 can effortlessly run today's largest models, with plenty of headroom for even bigger models in the future," says Scott Guthrie, executive vice president of Microsoft's Cloud and AI division.
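The "headroom for even bigger models" claim can be sanity-checked against the 216 GB of HBM3e reported in the entries above. A rough capacity estimate, ignoring KV cache, activations, and runtime overhead (which real deployments must also budget for):

```python
# Rough capacity estimate: how many FP4 weights fit in the reported
# 216 GB of HBM3e. Illustrative only; real serving also needs memory
# for KV cache, activations, and runtime overhead.

hbm_bytes = 216e9            # 216 GB HBM3e (reported spec)
bytes_per_fp4_weight = 0.5   # FP4 packs two 4-bit weights per byte

max_params = hbm_bytes / bytes_per_fp4_weight   # ~432 billion parameters
print(f"~{max_params / 1e9:.0f}B parameters at FP4 on one chip")
```

By this rough measure a single chip could hold the weights of a several-hundred-billion-parameter model at FP4, consistent with the quoted claim; larger models would be sharded across multiple accelerators.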
Artificial intelligence