cnbc_topnews · Apr 22, 2026

Google unveils chips for AI training and inference in latest shot at Nvidia

Google unveils chips with ample SRAM for AI training and inference, echoing Nvidia's plans.

Google is splitting its AI chip line into two separate processors, a new shot at Nvidia's dominance. The eighth-generation Tensor Processing Unit (TPU) will come in one version for training AI models and another for inference, i.e. serving them to users. Both chips will be available later this year. Amin Vahdat, a Google SVP, said the split came about because "the community would benefit from chips individually specialized."

The new training chip offers 2.8 times the performance of its predecessor, the Ironwood TPU, at the same price. The new inference chip, called the TPU 8i, offers 80% better performance and leans on static random-access memory (SRAM), packing 384 megabytes per chip. CEO Sundar Pichai wrote that the architecture is designed for running "millions of agents cost-effectively."

Adoption is growing: Citadel Securities uses TPUs, all 17 U.S. Energy Department national labs use software built on them, and Anthropic has committed to using multiple gigawatts' worth of the chips. Google remains a large Nvidia customer but offers TPUs as an alternative in its cloud.