Microsoft (MSFT) revealed on Tuesday that it has developed one of the world’s top five supercomputers, built specifically for OpenAI. The company announced the news during its Build developers conference, which was held virtually this year rather than at its usual Seattle venue. The companies hope the supercomputer can help create more human-like intelligence. We also hope that intelligence won’t exterminate humanity. Microsoft hasn’t said precisely where the new Azure-hosted supercomputer sits on the TOP500 list, only that it’s in the top five.
Microsoft says the supercomputer will be used to train OpenAI’s own artificial intelligence models. Founded in 2015, OpenAI received early backing from the likes of Elon Musk and Sam Altman; Musk has since left the company, while Altman is now its CEO.
Based on the last list update, in November 2019, Microsoft’s system is capable of at least 38,745 teraflops, the peak speed of the University of Texas Frontera supercomputer ranked number five. But even without climbing to the number-four spot, it could be as fast as roughly 100,000 teraflops; there is a significant gap between positions four and five on the list.
Although we don’t have the raw computing-power numbers, Microsoft was happy to talk about the hardware inside its new supercomputer. It packs 285,000 CPU cores, which sounds like a lot, yet is still fewer than any of the supercomputers currently in the top five. Microsoft’s AI computer also sports 10,000 GPUs, with 400 gigabits per second of network bandwidth for each GPU server.
You may know OpenAI from its work on GPT-2, the text-generating model the company initially considered too risky to release because of its potential to produce fake news. OpenAI built GPT-2 using a methodology called self-supervised learning, and the new supercomputer will let it do far more of it. In self-supervised learning (sometimes called unsupervised learning), computers build models by ingesting large amounts of unlabeled data, and humans can then make adjustments to nudge the model in the right direction. This has the potential to create much more nuanced and efficient AI, but it requires enormous processing power.
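To make the idea concrete, here is a minimal toy sketch of our own (not OpenAI’s code): in self-supervised learning, the training “labels” come from the raw text itself, since each word’s label is simply the word that follows it. A real system like GPT-2 is a large Transformer network, but the same principle can be shown with a tiny next-word counter:

```python
from collections import Counter, defaultdict

# Toy illustration of self-supervised learning: no human labels needed,
# because the unlabeled text itself supplies the prediction targets
# (each word's "label" is the word that follows it).

corpus = (
    "the supercomputer trains the model and the model predicts text "
    "the model learns from unlabeled text"
).split()

# Count, for every word, which words follow it in the raw text.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Predict the most frequently observed next word."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # -> "model" (it follows "the" most often here)
```

The point of the sketch is only the data flow: the model improves by predicting held-out pieces of its own input, which is why scale in both data and compute pays off so directly.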
We can only guess at what OpenAI might build with one of the fastest supercomputers in the world. Microsoft and OpenAI believe that a powerful enough machine with advanced learning methods will learn to do anything a human can do; it’s only a matter of time and scale. A human brain contains trillions of synapses carrying the electrical impulses that generate conscious thought. In AI, the parameter is the rough analog. The newest OpenAI model has approximately 17 billion parameters, and the companies expect parameter counts to reach the trillions before long.
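Those parameter counts translate directly into hardware demands. As a back-of-the-envelope sketch (our own illustration, assuming 16-bit floating-point weights, which is a common but not stated choice), here is the memory needed just to store the raw weights at the scales mentioned above:

```python
# Rough memory estimate for storing model weights alone
# (assumption: 2 bytes per parameter, i.e. 16-bit floats).
BYTES_PER_PARAM = 2

def weight_memory_gb(num_params):
    """Gigabytes required to hold the raw weights."""
    return num_params * BYTES_PER_PARAM / 1e9

print(weight_memory_gb(17e9))  # ~34 GB for a 17-billion-parameter model
print(weight_memory_gb(1e12))  # ~2,000 GB for a trillion-parameter model
```

Weights are only part of the story, since training also needs memory for gradients and optimizer state, but even this simple arithmetic shows why trillion-parameter models call for supercomputer-class clusters rather than single machines.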