
How Microsoft can become the biggest winner in generative AI


Since the release of ChatGPT in November, there has been a lot of speculation about possible killer applications for large language models (LLMs). There have been reports that Microsoft will integrate ChatGPT into its Bing search engine to get ahead of Google. There has also been much discussion of something like ChatGPT replacing search entirely.

Although I am not convinced by any of these ideas, I think that we are only just beginning to explore the huge business potential of LLMs and other generative artificial intelligence technologies.

And Microsoft has the chance to become the big winner of this new wave of innovation that is about to be unleashed. The Azure OpenAI Service, now generally available, may be Microsoft’s winning card in the race to dominate the growing market for generative AI.

Azure OpenAI Service vs. OpenAI API

Azure OpenAI Service was launched in November 2021 but was initially available only through a limited-access program. Now anyone can apply for access to the service, provided they comply with Microsoft’s Responsible AI principles.


Currently, Azure OpenAI Service supports base and fine-tuned GPT-3 models, base and fine-tuned Codex models, and embeddings. Microsoft also added DALL·E 2 to the service in October, although it is not yet available as part of the public product. According to the Microsoft blog, the company will soon add support for ChatGPT.

Azure OpenAI Service is essentially a mirror of the OpenAI API, although it has several advantages. For customers already on Microsoft’s cloud, accessing OpenAI technology through Azure will be much easier. And since many companies already use Microsoft’s machine learning and DevOps products, managing their GPT-3 and Codex instances on the Azure platform will be much more convenient.

Azure also offers enterprise-grade security features that are required in many industries. And it supports features like choosing the geographic region of the cloud instance and adding content filters to prevent abuse.

Interestingly, Azure OpenAI Service pricing is more competitive than the OpenAI API’s. In the OpenAI API, prices for fine-tuned GPT-3 models are higher than for base models. In Azure, base and fine-tuned models cost the same. Azure also allows customers to pay for fine-tuned models on a per-hour basis instead of the usual token-based pricing, which is more convenient for applications with high-volume model usage.
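To make that trade-off concrete, here is a small break-even sketch comparing the two pricing schemes. All rates below are made-up placeholders, not actual Azure or OpenAI prices:

```python
# Hypothetical break-even sketch: per-token vs. per-hour pricing.
# All prices are invented placeholders, not real Azure or OpenAI rates.

TOKEN_PRICE = 0.02 / 1000   # $ per token (placeholder)
HOURLY_PRICE = 3.00         # $ per hour for a dedicated fine-tuned model (placeholder)

def monthly_cost_tokens(tokens_per_month: int) -> float:
    """Cost under usage-based (per-token) pricing: grows with volume."""
    return tokens_per_month * TOKEN_PRICE

def monthly_cost_hourly(hours_per_month: float = 730) -> float:
    """Cost under hosting-based (per-hour) pricing: flat, independent of volume."""
    return hours_per_month * HOURLY_PRICE

# The volume at which the flat hourly model becomes cheaper than paying per token.
break_even = monthly_cost_hourly() / TOKEN_PRICE
print(f"break-even volume: {break_even:,.0f} tokens/month")
```

With these placeholder numbers, any application processing more than the break-even volume per month comes out ahead on the hourly plan, which is why flat hosting suits high-volume workloads.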

Both Microsoft and OpenAI stand to benefit from the growing market for the Azure OpenAI Service and the OpenAI API. The OpenAI API runs on Microsoft’s cloud, which means that as OpenAI’s customer base grows, so will its Azure bill. On the other hand, Microsoft has a license agreement with OpenAI. Details of the deal have not been made public (apart from the fact that Microsoft has exclusive license rights to OpenAI technology). But with the growing use of Azure OpenAI Service, the license fees Microsoft pays OpenAI will increase.

However, in the long term, I expect Azure to eat away at OpenAI’s business as the generative AI market grows and matures. Azure is much more flexible than the OpenAI API and also offers a host of other services essential for large-scale software and machine learning development.

The OpenAI API will always remain a hub for exploration and innovation, but higher-paying customers who want to build scalable products will slowly migrate to Azure. This will make OpenAI increasingly dependent on Microsoft as a source of revenue for its models.

Azure’s robustness, flexibility, and convenience will also allow it to compete with emerging open source and commercial alternatives. Microsoft’s scalable, AI-optimized hardware infrastructure enables it to deliver generative models at competitive prices. At the same time, the complexity and upfront costs of setting up hardware for generative models will make hosted systems like Azure OpenAI the preferable option for many companies that lack the in-house talent to set up open source models.

The RLHF market

Prior to ChatGPT, the primary way to train LLMs and other generative models was unsupervised or self-supervised learning. The model is given a very large corpus of text, software code, images, or other data and left on its own to learn the relevant patterns. During training, the model hides parts of the data and tries to predict them. It then reveals the hidden sections, compares its predictions with the ground truth, and adjusts its internal parameters to improve future predictions. By repeating this process over and over, the LLM learns statistical representations of the training corpus and can use them to generate coherent sequences of text, computer instructions, image pixels, and so on.
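The hide-predict-correct loop described above can be sketched with a toy stand-in for the model — a bigram count table rather than a neural network, and a tiny hard-coded corpus instead of web-scale data:

```python
# Toy sketch of self-supervised next-token training. The "model" here is a
# simple bigram count table, but the loop mirrors the idea in the text:
# hide the next token, predict it from what has been learned so far,
# reveal the ground truth, and update the model's parameters (here, counts).
from collections import defaultdict

corpus = "the cat sat on the mat the cat ate".split()

counts = defaultdict(lambda: defaultdict(int))  # context word -> next-token counts
mistakes = 0

for context, target in zip(corpus, corpus[1:]):
    # "Hide" the target and predict it from the statistics learned so far.
    seen = counts[context]
    prediction = max(seen, key=seen.get) if seen else None
    if prediction != target:
        mistakes += 1  # a wrong guess is what drives the parameter update in a real LLM
    # Reveal the ground truth and correct the parameters.
    counts[context][target] += 1

def predict_next(word: str) -> str:
    """Generate the most likely next token from the learned statistics."""
    return max(counts[word], key=counts[word].get)

print(predict_next("the"))
```

A real LLM replaces the count table with billions of neural-network weights and gradient descent, but the self-supervision is the same: the training signal comes from the data itself, with no human labels.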

ChatGPT has shown the power of adding human supervision to the training process. ChatGPT was trained with reinforcement learning from human feedback (RLHF). Instead of pure unsupervised learning, OpenAI’s engineers used human annotators to guide the model through several stages of training. The team first fine-tuned a pre-trained model on a set of prompts and responses written by human experts. Next, they created a “reward model” that ranks the language model’s output; the reward model was trained on quality scores provided by human reviewers. Finally, they used the reward model to further train the language model and align its output with human preferences. The impressive results of ChatGPT show how far LLMs can be pushed with human assistance.
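The reward-modeling step can be illustrated with a minimal sketch. The features and preference pairs below are invented placeholders, and the actual ChatGPT reward model is a large neural network rather than a linear scorer, but the pairwise logistic (Bradley–Terry) objective is the standard way such preference data is used:

```python
# Minimal sketch of RLHF reward modeling: learn a scoring function from
# human preference pairs (chosen vs. rejected responses) via a pairwise
# logistic loss. Features and data are toy placeholders, not a real dataset.
import math

def reward(w, features):
    """Scalar reward: a linear score over (hypothetical) response features."""
    return sum(wi * fi for wi, fi in zip(w, features))

# Each pair: (features of the response the annotator preferred,
#             features of the rejected response).
pairs = [([1.0, 0.2], [0.1, 0.9]),
         ([0.8, 0.1], [0.3, 0.7]),
         ([0.9, 0.3], [0.2, 0.8])]

w = [0.0, 0.0]
lr = 0.5
for _ in range(200):
    for chosen, rejected in pairs:
        # P(chosen preferred) under the Bradley-Terry model.
        p = 1 / (1 + math.exp(reward(w, rejected) - reward(w, chosen)))
        # Gradient ascent on the log-likelihood of the human ranking.
        for i in range(len(w)):
            w[i] += lr * (1 - p) * (chosen[i] - rejected[i])

# The learned reward now ranks preferred responses above rejected ones.
print(reward(w, [1.0, 0.2]) > reward(w, [0.1, 0.9]))
```

In the full RLHF pipeline, this learned scorer then supplies the reward signal for a reinforcement-learning algorithm (PPO, in OpenAI’s published work) that further trains the language model itself.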

With the success of ChatGPT, the market for RLHF-trained LLMs is likely to grow. Businesses will want to use the technique to fine-tune LLMs like ChatGPT to follow application-specific instructions. But an RLHF pipeline requires complicated development and management tooling, including data preparation and annotation, reward-model development, model and data versioning, regular retraining, monitoring, and more.

Luckily for Microsoft, its Azure platform is well prepared to meet these demands with its MLOps and data-warehousing tools. This, along with its scalable cloud infrastructure and software development tools, will give Microsoft an edge in this more specialized niche of generative models.

Microsoft missed the mark on smartphones and mobile platforms. But its early investment in OpenAI, an AI lab that at the time lacked a profitable business model, gave it the chance to capture a big share of the market for the next wave of disruptive innovation.

VentureBeat’s mission is to be a digital public square for technical decision makers to learn about transformative enterprise technology and conduct transactions. Discover our Briefings.
