Just as it did in 2021, when it doubled the size of its logistics network to more than 450M SF to absorb the pandemic-era surge in e-commerce, Amazon is marshaling its unmatched financial resources to double the capacity of its data center network and put its cloud platform at the center of generative AI.

Amazon Web Services (AWS) aims to be the leading platform for training the large language models (LLMs) that power generative AI applications. That puts it in direct competition with Microsoft, for now the industry leader thanks to OpenAI's GPT-4, as well as Google and Salesforce, with initiatives from mixed-martial-arts enthusiasts Elon Musk and Mark Zuckerberg coming soon, in a tech-sector version of an arms race between superpowers.

Data centers around the world are racing to expand their capacity with servers built around AI-optimized chips, chief among them the GPUs pioneered by Nvidia that enabled the LLM breakthroughs of the past year.
