
Priced at $185, above the marketed range, the wafer-scale chip company opens trading on Thursday at a $56.4bn valuation. The OpenAI deal is what got the book covered. The customer concentration footnote is what the next quarter has to answer.
Cerebras designs the Wafer Scale Engine, a single piece of silicon the size of a dinner plate that holds more than four trillion transistors. The pitch to investors is inference, not training: the workload where AI models run rather than learn, and where speed and unit cost matter more than raw compute. Where Nvidia owns the training side, Cerebras is selling the proposition that the next bottleneck is the one in front of the user.
Revenue went from $24.6m in 2022 to $290m in 2024 to $510m in 2025, with the bulk of 2024 revenue coming from a single customer: G42, the Abu Dhabi AI conglomerate that accounted for 85% of that year's sales. The 2024 prospectus stalled when CFIUS opened a review of G42's minority stake. The 2026 refile cleared after G42's holding was restructured into non-voting shares.
What changed the IPO's gravity was the January contract with OpenAI: a multi-year agreement for 750 megawatts of inference capacity, expandable to two gigawatts by 2030, with a contracted value over $10bn.
Cerebras Systems priced its IPO at $185 per share, above the marketed range, raising $5.55bn and valuing the company at $56.4bn on a fully diluted basis. Trading began on Nasdaq on Thursday under the ticker CBRS. The company designs the Wafer Scale Engine, a single silicon chip the size of a dinner plate with more than four trillion transistors, focused on inference workloads rather than training. Revenue rose from $24.6m in 2022 to $290m in 2024 and $510m in 2025. In 2024, most revenue came from G42, which previously faced a CFIUS review of its minority stake. A January OpenAI contract for 750 megawatts of inference capacity, expandable to two gigawatts by 2030, helped drive investor demand.
Read at TNW | Investors-Funding