Microsoft made sure to include Azure in the AI-fest that was the Build 2023 developer conference this week.
As enterprises consider experimenting with or deploying generative AI, they may well look to public clouds and similar scalable compute and storage infrastructure to run things like large language models (LLMs).
Microsoft, armed with ChatGPT, GPT-4, and other OpenAI systems, has for months been shoving AI capabilities into every nook and cranny of its empire. Azure is no different – the Azure OpenAI Service is one example – and after its Build conference, Redmond's public cloud now touts even more AI offerings.
High on the list is an expanded partnership with Nvidia, which is rushing to establish itself as the indispensable AI technology provider, from GPU accelerators to software. This week alone the chipmaker unveiled a host of partnerships, including with Dell at Dell Technologies World and with supercomputer makers at ISC23.