Chipmaker Cerebras Introduces A New AI Processor

Cerebras Systems, an artificial intelligence business, unveiled a new iteration of its dinner-plate-sized processors on Wednesday, promising double the performance for the same price as the previous model.

The AI chips made by Santa Clara, California-based Cerebras compete with Nvidia’s cutting-edge technology, which helps companies such as OpenAI build the underlying software that drives apps like ChatGPT. Cerebras is betting that its roughly foot-wide chip can outperform Nvidia’s chip clusters, avoiding the need to stitch together thousands of chips to build and run AI applications.

Key Quote

“So the largest chip that we made was our first generation. People said we couldn’t make it,” Cerebras CEO Andrew Feldman said to reporters on Tuesday. “Eighteen months later we did it in seven nanometer. Eighteen months (after that), we’ve announced a five-nanometer part. This is the largest part by more than three and a half trillion transistors.”

Background

One major issue with AI processing is power consumption. In an era when the cost of the electricity needed to develop and run AI applications has skyrocketed, Cerebras’ third-generation chip delivers higher performance while consuming the same amount of energy as its predecessor. Although Cerebras does not sell the chips directly, it claims that the systems built around them are a more effective way to develop AI models through a process known as training.

By the Numbers

With 4 trillion transistors, the new Wafer-Scale Engine 3 (WSE-3) can compute at 125 petaflops. It was built on the 5-nanometer manufacturing process of Taiwan Semiconductor Manufacturing Co.

Cerebras is cash flow positive, according to Feldman.

What comes next?

Cerebras also said on Wednesday that it will sell its WSE-3 systems in conjunction with Qualcomm AI 100 Ultra chips, which handle inference in artificial intelligence applications.

Komal Patil