Microsoft has announced that OpenAI’s ChatGPT is now accessible through the Azure OpenAI Service, following its January announcement that the model was on the way. Developers and businesses can now integrate ChatGPT into their cloud applications, opening the door to much broader adoption of conversational AI.
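To make the integration concrete, here is a minimal sketch of how an application might assemble a request to an Azure OpenAI chat-completions endpoint. The resource name, deployment name, and API version below are hypothetical placeholders (assumptions, not details from the announcement), and no network call is made; it only shows the shape of the URL and JSON body a client would send.

```python
import json

# Hypothetical placeholders -- substitute your own Azure resource and
# model deployment names. These are assumptions for illustration only.
RESOURCE = "my-resource"
DEPLOYMENT = "my-chatgpt-deployment"
API_VERSION = "2023-03-15-preview"  # assumed preview API version

def build_chat_request(messages):
    """Build the endpoint URL and JSON body for an Azure OpenAI
    chat-completions call. No request is actually sent."""
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
    )
    body = json.dumps({"messages": messages})
    return url, body

url, body = build_chat_request(
    [{"role": "user", "content": "Summarize Azure OpenAI in one sentence."}]
)
print(url)
```

In a real application the body would be POSTed to that URL with an `api-key` header tied to your Azure subscription.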
Starting today, Azure OpenAI users have preview access to ChatGPT, priced at $0.002 per 1,000 tokens. Billing for the service will begin on March 13 as part of the Azure OpenAI offering. Access is, however, restricted to Microsoft-managed customers and partners, who must complete an application before being granted permission.
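At that rate, estimating usage costs is simple arithmetic. The snippet below is an illustrative calculator based on the $0.002-per-1,000-tokens preview price quoted above; the token counts in the example are made up.

```python
def token_cost(tokens, price_per_1k=0.002):
    """Cost in dollars at the preview rate of $0.002 per 1,000 tokens."""
    return tokens * price_per_1k / 1000

# Example: a 500-token prompt plus a 1,500-token response = 2,000 tokens.
print(f"${token_cost(2000):.3f}")  # → $0.004
```

Note that both prompt and completion tokens count toward the bill, so longer conversations accumulate cost on each turn.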
The tech giant claims that more than 1,000 customers are using the Azure OpenAI Service to apply its most advanced AI models to “innovate in new ways.” These models include DALL-E 2, GPT-3.5, Codex, and other large language models, all backed by Azure’s purpose-built supercomputing infrastructure and enterprise capabilities.
Azure OpenAI already powers a range of Microsoft products, including GitHub Copilot, Power BI, Microsoft Teams Premium, Viva Sales, and the new Bing chatbot. The company also pairs models such as ChatGPT and DALL-E with Azure’s data management, governance, and scalability capabilities.
Microsoft says its customers and partners can build new, innovative applications and services in Azure OpenAI Studio to help them outperform their competitors. Beyond letting users fine-tune every model the service provides, Azure OpenAI Studio also offers a convenient interface for customizing ChatGPT and tailoring its responses to a company’s specific needs.
What are your views on this latest development? Let us know in the comments section below.