Microsoft Build 2020: Company reveals new Azure cloud-based supercomputer built in collaboration with OpenAI
It features over 285,000 CPU cores, 10,000 GPUs and 400 gigabits per second of network connectivity for each GPU server.
At the Build 2020 conference, Microsoft not only updated Teams and its Edge browser but also announced a new supercomputer. The supercomputer, as described by the firm in a blog post, is built in collaboration with, and exclusively for, OpenAI and is hosted on Microsoft's Azure platform. It is worth adding that OpenAI and Microsoft entered a partnership last year with the aim of creating supercomputing technologies.
It is said to feature over 285,000 CPU cores, 10,000 GPUs and 400 gigabits per second of network connectivity for each GPU server. According to Microsoft, this would also earn it a place on the TOP500 supercomputer list.
The Redmond-based tech giant sees this as a first step toward building the next generation of very large AI models, along with the infrastructure needed to train them, and making both available to other organisations and developers.
“The exciting thing about these models is the breadth of things they're going to enable,” said Microsoft Chief Technical Officer Kevin Scott. As per Scott, the potential benefits extend far beyond narrow advances in one type of AI model. “This is about being able to do a hundred exciting things in natural language processing at once and a hundred exciting things in computer vision, and when you start to see combinations of these perceptual domains, you're going to have new applications that are hard to even imagine right now.”
Although machine learning experts have previously built separate, smaller AI models that each learn a single task, such as translating between languages, recognising objects or reading text to identify key points in an email, Microsoft believes a single large AI model can perform such tasks better.
The company notes in its blog post: “As part of a companywide AI at Scale initiative, Microsoft has developed its own family of large AI models, the Microsoft Turing models, which it has used to improve many different language understanding tasks across Bing, Office, Dynamics and other productivity products.”
And to train these large AI models, Microsoft has built a supercomputer. “Training massive AI models requires advanced supercomputing infrastructure, or clusters of state-of-the-art hardware connected by high-bandwidth networks. It also needs tools to train the models across these interconnected computers,” adds the firm.
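To give a flavour of what "training across interconnected computers" means, here is a minimal, purely illustrative sketch of data-parallel training: each worker computes a gradient on its own shard of the data, the gradients are averaged (the role the high-bandwidth network plays in an all-reduce), and every worker applies the identical update. All names and numbers here are hypothetical; this is a toy, not Microsoft's or OpenAI's actual training stack.

```python
# Toy data-parallel training sketch (hypothetical, for illustration only).
# Real systems run the per-worker steps on separate machines and use a
# high-bandwidth network to average gradients; here workers are simulated
# sequentially in one process.

def shard(data, num_workers):
    """Split data into roughly equal shards, one per worker."""
    k, m = divmod(len(data), num_workers)
    return [data[i * k + min(i, m):(i + 1) * k + min(i + 1, m)]
            for i in range(num_workers)]

def local_gradient(w, points):
    """Gradient of mean squared error for the model y = w * x on one shard."""
    return sum(2 * (w * x - y) * x for x, y in points) / len(points)

def train(data, num_workers=4, lr=0.01, steps=200):
    shards = shard(data, num_workers)
    w = 0.0  # every worker starts from the same parameters
    for _ in range(steps):
        grads = [local_gradient(w, s) for s in shards]  # computed in parallel in reality
        avg = sum(grads) / num_workers                  # the "all-reduce" averaging step
        w -= lr * avg                                   # identical update on every worker
    return w

# Fit y = 3x from noiseless samples; w converges towards 3.
data = [(x, 3.0 * x) for x in range(1, 9)]
print(train(data))
```

Because every worker applies the same averaged gradient, all replicas of the model stay in sync; the same idea, scaled up with specialised hardware and interconnects, underpins training very large models across thousands of GPUs.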