
Microsoft aims to expand its ecosystem of AI-powered apps and services, called “copilots,” with plug-ins from third-party developers.
Today at its annual Build conference, Microsoft announced that it is adopting the same plug-in standard that its close partner OpenAI introduced for ChatGPT, its AI-powered chatbot. That means developers can create plug-ins that work across ChatGPT, Bing Chat (on the web and in the Microsoft Edge sidebar), Dynamics 365 Copilot, Microsoft 365 Copilot, and the recently launched Windows Copilot.
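For context, OpenAI's plug-in standard describes a plug-in with a small JSON manifest (`ai-plugin.json`) that points the AI system at an OpenAPI description of the plug-in's API. The sketch below uses hypothetical placeholder values (the name, URLs, and email are illustrative, not from any real plug-in):

```json
{
  "schema_version": "v1",
  "name_for_human": "TODO Tracker",
  "name_for_model": "todo_tracker",
  "description_for_human": "Manage your to-do list.",
  "description_for_model": "Plug-in for managing a user's to-do list; can add, list, and delete items.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

Because the manifest is just a pointer to a standard API description, the same plug-in definition can, in principle, be consumed by any host that speaks the standard — which is exactly what Microsoft's announcement enables.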
“I think in the coming years, this will become an expectation of how all software works,” Microsoft CTO Kevin Scott said in a blog post shared with TechCrunch last week.
Bold pronouncements aside, the new plug-in framework lets Microsoft's family of "copilots" — apps that use AI to assist users with tasks such as writing emails or drawing pictures — interact with a range of software and services. Using IDEs such as Visual Studio, Codespaces, and Visual Studio Code, developers can build plug-ins that retrieve real-time information, incorporate company or other business data, and perform actions on the user's behalf.
A plug-in could let Microsoft 365 Copilot, for example, arrange travel in line with a company's travel policy, query a site like WolframAlpha to solve an equation, or answer questions about how a firm handled certain legal issues in the past.
Customers in the Microsoft 365 Copilot Early Access program will gain access to new plug-ins in the coming weeks from partners including Atlassian, Adobe, ServiceNow, Thomson Reuters, Moveworks and Mural. Meanwhile, Bing Chat will see new plug-ins from Instacart, Kayak, Klarna, Redfin, and Zillow added to its existing collection, and those same Bing Chat plug-ins will come to Windows within Windows Copilot.
The OpenTable plug-in allows Bing Chat to search restaurants for available bookings, for example, while the Instacart plug-in lets the chatbot take a dinner menu, turn it into a shopping list, and order the ingredients for delivery. Meanwhile, the new Bing plug-in brings web and search data from Bing into ChatGPT, complete with citations.
A new framework
Scott describes plug-ins as a bridge between an AI system, such as ChatGPT, and data that a third party wants to keep private or proprietary. A plug-in gives the AI system access to those private files, enabling it to answer questions about business-specific data, for example.
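The "bridge" pattern Scott describes can be illustrated with a minimal sketch. This is a hypothetical illustration, not Microsoft's or OpenAI's actual interface: the key idea is that the AI system never sees the whole private dataset, only the narrow slice a query endpoint returns.

```python
# Hypothetical sketch of the plug-in "bridge" pattern: proprietary data stays
# behind a narrow query function, and the copilot only sees matching records.

PRIVATE_RECORDS = [  # stand-in for a company's private, proprietary data
    {"id": 1, "topic": "travel policy", "text": "Economy class for flights under 6 hours."},
    {"id": 2, "topic": "expense policy", "text": "Meals are reimbursed up to $50 per day."},
]

def query_private_data(question: str) -> list[dict]:
    """Return only the records whose topic appears in the question.

    A real plug-in would expose this as an HTTP endpoint described by an
    OpenAPI spec; the AI system calls it and grounds its answer in the result.
    """
    q = question.lower()
    return [record for record in PRIVATE_RECORDS if record["topic"] in q]

# The copilot asks a scoped question and receives only the relevant record:
hits = query_private_data("What is our travel policy for international flights?")
```

The design point is that access control and filtering happen on the data owner's side of the bridge, which is what lets a company expose business data to a copilot without handing over the dataset wholesale.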
Such a bridge is certainly in growing demand as privacy becomes a major issue with generative AI, which has a tendency to leak sensitive data, such as phone numbers and email addresses, from the datasets it was trained on. Wary of the risk, companies including Apple and Samsung have banned employees from using ChatGPT and similar AI tools over concerns that confidential data entered into these systems could be mishandled and leaked.
"What a plug-in does is say, 'hey, we want to make that pattern reusable and put some limits on how it's used,'" John Montgomery, CVP of AI platforms at Microsoft, said in a canned statement.
There are three types of plug-ins in Microsoft’s new framework: ChatGPT plug-ins, Microsoft Teams messaging extensions, and Power Platform connectors.

Image Credits: Microsoft
Teams messaging extensions, which allow users to interact with a web service through buttons and forms in Teams, are not new. Nor are Power Platform connectors, which act as a wrapper around an API so that an underlying service can "talk" to apps in Microsoft's Power Platform portfolio (such as Power Automate). But Microsoft is expanding their reach, letting developers use new and existing messaging extensions and connectors to bring their services to Microsoft 365 Copilot and Microsoft 365 apps such as Word, Excel and PowerPoint.
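To make the "wrapper around an API" idea concrete, Power Platform custom connectors are defined with an OpenAPI 2.0 (Swagger) description of the underlying service. The snippet below is a hypothetical sketch — the title, host, and operation are placeholders, not a real connector:

```yaml
# Hypothetical OpenAPI 2.0 definition for a custom Power Platform connector
# wrapping a simple ticketing API; host and paths are illustrative only.
swagger: "2.0"
info:
  title: Ticket Tracker
  version: "1.0"
host: api.example.com
basePath: /v1
schemes: [https]
paths:
  /tickets:
    get:
      summary: List open tickets
      operationId: ListTickets
      responses:
        "200":
          description: A list of open tickets
```

Because the connector is just a declarative API description, the same definition that powers a Power Automate flow can, under the new framework, be surfaced to a copilot as a plug-in.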
For example, Power Platform connectors can be used to import structured data into the Dataverse, Microsoft's service for storing and managing data used by internal business apps, which Microsoft 365 Copilot can then access. In a demo during Build, Microsoft showed how Dentsu, a public relations firm, tapped Microsoft 365 Copilot with a plug-in for Jira and data from Atlassian's Confluence without writing new code.
Microsoft says developers will be able to build and debug their own plug-ins in a number of ways, including within the Azure AI family of tools, which adds the ability to run and test plug-ins on private enterprise data. Azure OpenAI Service, Microsoft's managed, enterprise-focused product designed to give businesses access to OpenAI's technologies with additional governance features, will also support plug-ins. And the Teams Toolkit for Visual Studio will gain features for running plug-ins.
Transition to a platform
As for distribution, Microsoft says developers will be able to configure, publish, and manage plug-ins through the Developer Portal for Teams, among other places. They'll also be able to monetize them, though the company wasn't clear on how the pricing would work.
In any case, with plug-ins, Microsoft is playing to stay ahead in the highly competitive generative AI race. The plug-ins essentially turn the company's "copilots" into aggregators, putting them on a path to becoming one-stop shops for both enterprise and consumer customers.
Microsoft undoubtedly sees the lock-in opportunity as increasingly important as the company faces competitive pressure from startups and tech giants building out generative AI, including Google and Anthropic. One can imagine plug-ins becoming an attractive new source of revenue as apps and services rely more and more on generative AI. And they could address the fears of businesses that claim generative AI trained on their data infringes on their rights; Getty Images and Reddit, among others, have taken steps to block companies from training generative AI on their data without some form of compensation.
I expect rivals to respond to Microsoft's and OpenAI's plug-in frameworks with frameworks of their own. But Microsoft has the first-mover advantage, as OpenAI did with ChatGPT. And that advantage shouldn't be underestimated.