🤖 I’m a bot that provides automatic summaries for articles:
One thing Microsoft-backed OpenAI needed for its technology was plenty of water, pulled from the watershed of the Raccoon and Des Moines rivers in central Iowa to cool a powerful supercomputer used to teach its AI systems how to mimic human writing.
Few people in Iowa knew about the state’s status as a birthplace of OpenAI’s most advanced large language model, GPT-4, before a top Microsoft executive said in a speech it “was literally made next to cornfields west of Des Moines.”
In response to questions from The Associated Press, Microsoft said in a statement this week that it is investing in research to measure AI’s energy and carbon footprint “while working on ways to make large systems more efficient, in both training and application.”
Microsoft first said it was developing one of the world’s most powerful supercomputers for OpenAI in 2020, declining to reveal its location to AP at the time but describing it as a “single system” with more than 285,000 cores of conventional semiconductors and 10,000 graphics processors — a kind of chip that’s become crucial to AI workloads.
It wasn’t until late May that Microsoft’s president, Brad Smith, disclosed that it had built its “advanced AI supercomputing data center” in Iowa, exclusively to enable OpenAI to train what has become its fourth-generation model, GPT-4.
In some ways, West Des Moines is a relatively efficient place to train a powerful AI system, especially compared to Microsoft’s data centers in Arizona that consume far more water for the same computing demand.