
"Microsoft has announced progress on a new chip cooling approach that could help address one of the biggest bottlenecks in scaling AI infrastructure: heat. The company's researchers have successfully demonstrated in-chip microfluidic cooling, a system that channels liquid coolant directly into etched grooves on the back of silicon chips. Traditional cooling methods in data centers, such as cold plates, dissipate heat by transferring it through multiple material layers."
"By etching microchannels, each about the width of a human hair, directly into the silicon, coolant can flow over hotspots inside the chip itself. Early lab-scale tests showed that this design removed heat up to three times more effectively than cold plates under certain workloads. Microsoft also reported a 65 percent reduction in maximum GPU temperature rise. The work is still experimental, but researchers argue the technique has practical implications."
Microsoft demonstrated in-chip microfluidic cooling by channeling liquid coolant through microchannels etched into the back of silicon chips. The approach directs coolant over internal hotspots, avoiding the heat transfer across multiple material layers that cold plates require. Lab-scale tests removed heat up to three times more effectively than cold plates under certain workloads and reduced maximum GPU temperature rise by 65 percent. The technique could enable higher-density server configurations, lower cooling energy, and raise chip performance ceilings, making controlled overclocking more viable. Engineering challenges include channel-depth trade-offs: channels must be deep enough to avoid clogging yet shallow enough to preserve the silicon's structural integrity. Channel patterns were refined through repeated design iterations and cross-team collaboration.
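The intuition behind the reported gains can be sketched with a simple one-dimensional thermal-resistance model: heat flowing from the die to the coolant crosses a series of layers, and the junction temperature rise is the product of power and total resistance. The resistance values below are hypothetical, chosen only so the example reproduces a reduction on the order of the 65 percent figure reported; they are not Microsoft's measurements.

```python
# Hedged sketch: series thermal-resistance model showing why removing
# material layers between the die and the coolant lowers temperature rise.
# All numeric values are illustrative assumptions, not measured data.

def temp_rise(power_w, resistances_k_per_w):
    """Junction temperature rise above coolant for a series resistance stack."""
    return power_w * sum(resistances_k_per_w)

POWER = 700.0  # W, a plausible modern GPU package power (assumption)

# Cold plate: heat crosses die, TIM, lid, TIM, and the plate before reaching
# the coolant. Per-layer resistances in K/W (hypothetical).
cold_plate_stack = [0.02, 0.05, 0.02, 0.05, 0.03]

# In-chip microfluidics: coolant flows in channels etched into the die itself,
# leaving only the silicon and the convective boundary in the path.
microfluidic_stack = [0.02, 0.04]  # K/W (hypothetical)

rise_plate = temp_rise(POWER, cold_plate_stack)   # 700 * 0.17 = 119 K
rise_micro = temp_rise(POWER, microfluidic_stack)  # 700 * 0.06 = 42 K
reduction = 1 - rise_micro / rise_plate

print(f"cold plate rise:   {rise_plate:.0f} K")
print(f"microfluidic rise: {rise_micro:.0f} K")
print(f"reduction:         {reduction:.0%}")  # ~65% with these assumed values
```

Under these assumed numbers the shorter thermal path alone accounts for a roughly 65 percent smaller temperature rise, which is why bringing coolant into the silicon itself is attractive even before any channel-pattern optimization.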
Read at InfoQ