Microsoft Spent Hundreds of Millions of Dollars on OpenAI’s ChatGPT Supercomputer



Microsoft announced that it strung together tens of thousands of Nvidia A100 chips and revamped its server racks to build the hardware behind ChatGPT and its in-house Bing AI bot.

According to a report by Bloomberg, Microsoft spent hundreds of millions of dollars to build a massive supercomputer to support OpenAI’s ChatGPT chatbot. To build a supercomputer capable of supporting OpenAI’s projects, the company strung together tens of thousands of Nvidia graphics processing units (GPUs) on its Azure cloud computing platform.

Microsoft also says it is focusing on making Azure’s AI infrastructure more robust.

Read more: Microsoft Strung Together Tens of Thousands of Chips in a Pricey Supercomputer for OpenAI

TalkDev Bureau
The TalkDev Bureau has five well-trained writers and journalists, well versed in the B2B enterprise technology industry and constantly in touch with industry leaders for the latest trends, opinions, and other inputs, to bring you the best and latest in the domain.



