Cerebras Systems has launched its Wafer Scale Engine 3 (WSE-3), a wafer-scale AI chip that delivers double the performance of its predecessor, the WSE-2.
The WSE-3 packs 4 trillion transistors, 900,000 AI cores, 44GB of on-chip SRAM, and a peak performance of 125 FP16 PetaFLOPS. The chip powers Cerebras's CS-3 AI supercomputer, which is built to train some of the largest AI models.
The CS-3 can store massive models in a single logical memory space without partitioning or refactoring, streamlining the training process and improving developer efficiency.
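To make the claim concrete, here is a minimal sketch in plain PyTorch (not the Cerebras SDK; the model and hyperparameters are placeholders) of what "no partitioning or refactoring" means in practice: when the full model fits in one logical memory space, the training loop reads like single-device code, with none of the FSDP wrappers, tensor-parallel groups, or manual sharding that a conventional GPU cluster would typically require for a model of this scale.

```python
# Illustrative only: plain PyTorch, standing in for a large model trained
# as if on a single device -- no sharding or parallelism wrappers needed.
import torch
import torch.nn as nn

model = nn.Sequential(          # placeholder for a large transformer
    nn.Linear(1024, 4096),
    nn.GELU(),
    nn.Linear(4096, 1024),
)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for step in range(10):          # toy training loop with synthetic data
    x = torch.randn(32, 1024)
    y = torch.randn(32, 1024)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

The point of the sketch is the shape of the code, not the numbers: the developer writes an ordinary training loop, and the system, rather than the programmer, handles where the weights live.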