Billionaire Elon Musk is criticizing Microsoft's artificial intelligence (AI) plans. A new Microsoft AI chip called "Maia" is set to be the cornerstone of the company's emerging AI infrastructure, an effort that would compete with Musk's own AI ventures, hence his criticism.
We're told that Maia will bring artificial general intelligence (AGI) to "every facet" of human life. AGI differs from the narrow AI applications many people are now familiar with, such as ChatGPT, in that it allegedly represents actual "intelligence" without the need for human input.
“ChatGPT and MidJourney AI are cool, novel applications, but they only function within a specific domain—an AGI is more like the science fiction computers we have seen in the Terminator franchise,” explains Wlt Report.
“Yikes!” Musk exclaimed in a tweet highlighting Microsoft’s annual $50 billion investment in data centers to support Maia and other similar projects.
"Microsoft has reimagined our infrastructure with an end-to-end systems approach to meet our customers' unique AI and cloud needs," stated Scott Guthrie, executive vice president of Microsoft's Cloud and AI Group, about the project.
“With the launch of our new AI Accelerator, Azure Maia, and cloud-native CPU, Azure Cobalt, alongside our continued partnerships with silicon providers, we can now provide even more choice and performance.”
According to a recent announcement at Microsoft Ignite, the company's two custom-designed chips and integrated systems, the Microsoft Azure Maia AI Accelerator (optimized for AI tasks and generative AI) and the Microsoft Azure Cobalt CPU (an Arm-based processor designed to run general-purpose compute workloads on the Microsoft Cloud), represent Microsoft's future.
"The chips represent a last puzzle piece for Microsoft to deliver infrastructure systems (which include everything from silicon choices, software, and servers to racks and cooling systems) that have been designed from top to bottom and can be optimized with internal and customer workloads in mind," the company says.
“The chips will start to roll out early next year to Microsoft’s data centers, initially powering the company’s services such as Microsoft Copilot or Azure OpenAI Service.”
Microsoft Chairman and CEO Satya Nadella says the company's partnership with OpenAI is an important part of its "product roadmap" and that OpenAI's new leadership, led by Emmett Shear, along with Sam Altman and Greg Brockman, newly hired by Microsoft, will help the company's AI future succeed.