New York.- Microsoft on Monday unveiled the Maia 200, the second generation of its artificial intelligence (AI) chip, with which the tech giant seeks to reduce its dependence on Nvidia and compete with the cloud chips of Google and Amazon.
The new model arrives two years after the first version, the Maia 100, which was never made available to customers.
According to the company, the Maia 200 is designed as an inference processor and promises up to 30% more performance per dollar than Microsoft's current hardware. It is already being deployed in Azure data centers in the United States to power services such as Microsoft 365 Copilot, Foundry, and the latest GPT models from OpenAI.

"It's the most efficient inference system Microsoft has ever deployed," said Scott Guthrie, Microsoft's executive vice president of cloud and AI, in a blog post.
The chips are being installed first in data centers in the central United States and will later reach those in the west and other locations. They are built on Taiwan Semiconductor Manufacturing Co.'s three-nanometer process.

The announcement comes amid the race to lead generative AI. The major cloud providers are pushing to develop their own chips so as not to depend exclusively on Nvidia, which leads the market. The launches from Amazon, Google, and now Microsoft intensify competition in a booming market.