Ironwood: Google launches a new chip to compete with Nvidia, promising faster AI tools

Google, an Alphabet company, has introduced 'Ironwood', its seventh-generation artificial intelligence (AI) chip. The new processor is designed specifically to speed up AI applications. Ironwood handles the kind of data processing that arises when users type queries into AI tools such as ChatGPT. In technical terms this is called 'inference computing': the fast calculations a chip performs to answer questions or generate other responses in chatbots.
Gearing up to compete with Nvidia
Google's decade-old, billion-dollar TPU effort is one of the few serious options for challenging Nvidia's dominance, opening up new possibilities in the AI processor market. Google's Tensor Processing Units (TPUs) can be used only by the company's own engineers or through its cloud platform, which has also given Google an edge over some competitors in internal AI development.
Google previously split its TPUs into two designs: one capable of training large AI models, and another that makes inference (the run-time use of AI) cheaper and faster. The new Ironwood chip combines both capabilities in a single processor. Ironwood is designed to work in groups of up to 9,216 chips at once, and it carries more memory than the previous generation, making it better suited to serving AI applications.