Artificial intelligence (AI)-assisted development is all the rage. The emergence of large language models (LLMs) and generative AI has opened the floodgates for the insertion of this new breed of intelligence across all sorts of business productivity suites and software development platforms. An O’Reilly study found that one-third of developers already use AI-based programming tools at work, and GitHub places this figure much higher.
Microsoft, which has invested billions into OpenAI, is now leveraging its newfound partnership to introduce impressive AI feature sets within its platforms.
Below, we’ll explore some ways Microsoft is integrating LLMs and copilots into its offerings and how others in the market are following suit. We’ll look at how copilots can, in general, aid the creative potential of business users and software programmers alike, and weigh the benefits and potential ramifications inherent in the copilot concept.
More and More Copilots
Most programmers are already familiar with GitHub Copilot, which uses the OpenAI Codex model, trained on billions of lines of code, to make code suggestions in real time. It also integrates easily into popular code editors. (The Visual Studio GitHub Copilot extension had nearly seven million downloads at the time of this writing.)
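To make the workflow concrete, here is a hypothetical example of the kind of completion Copilot might propose: the developer types only the comment, signature, and docstring, and the tool suggests a body. The function name and the suggested implementation below are illustrative assumptions, not an actual Copilot transcript — real suggestions vary with the surrounding code.

```python
import re

# Developer types a descriptive comment plus a signature and docstring;
# a Copilot-style assistant suggests the function body in real time.

def parse_iso_dates(lines):
    """Return the lines that start with an ISO-8601 date (YYYY-MM-DD)."""
    # --- everything below is the kind of body the assistant might suggest ---
    pattern = re.compile(r"^\d{4}-\d{2}-\d{2}")
    return [line for line in lines if pattern.match(line)]
```

The developer then accepts, edits, or rejects the suggestion, keeping a human in the loop.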
But these powers aren’t limited to the programming elite — generative AI is also coming to the productivity apps that average users are most familiar with.
Microsoft recently introduced Copilot for Microsoft 365, baking the assistant directly into productivity apps like Microsoft Word, Excel, PowerPoint, and Microsoft Teams. But this virtual assistant goes well beyond the Clippy of yesteryear.
Powered by LLMs, Microsoft 365 Copilot can generate full drafts within Word, provide breakdowns of Excel data, create fully fledged custom presentations in PowerPoint, and more, all from natural language prompts. Its Business Chat can plug into your entire suite of data to summarize information or plan team objectives.
Custom Copilots Are Next
But the copilot craze doesn’t end there. Microsoft has announced a Microsoft Security Copilot as well as copilots for Windows 11, Power Apps, and Dynamics 365. And, at Build 2023, Microsoft unveiled perhaps the most exciting evolution — a framework for developers to construct their own custom copilots.
Microsoft’s Copilot stack is structured into four main components:
- Infrastructure: The hardware and processing power required to run deep learning models.
- Foundational model: The large language model (LLM) used to perform various tasks. These are grouped into hosted models, hosted and fine-tuned models, and “bring your own.” Examples include OpenAI’s GPT-4, DALL-E, and Whisper. Microsoft is also partnering with Hugging Face to make open-source models like BERT, Dolly, and LLaMA available in its Copilot stack.
- Orchestration: This layer adds guardrails around prompts sent to the LLMs, filtering and interpreting the user’s prompts for safety and moderation purposes. Also, using what Microsoft calls Retrieval Augmented Generation (RAG), this layer can plug into other databases to bring factual data and more application-specific details into the output.
- Copilot front end: The front end is the user experience layer, where users input their prompts and have a natural language conversation within a consolidated chat window.
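The orchestration layer's RAG step can be sketched in a few lines: before the prompt ever reaches the LLM, relevant application data is retrieved and spliced into the request so the model answers from facts rather than memory. This is a minimal sketch under stated assumptions — the keyword-overlap "retriever" and the sample knowledge base stand in for the vector search and enterprise data stores a real copilot would use.

```python
# Toy knowledge base standing in for an application's own data.
KNOWLEDGE_BASE = [
    "Q3 revenue grew 12% year over year.",
    "The Contoso account renewal is due on June 30.",
    "Support ticket volume dropped 8% after the May release.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(user_prompt: str) -> str:
    """Ground the LLM request in retrieved context before sending it."""
    context = "\n".join(retrieve(user_prompt, KNOWLEDGE_BASE))
    return f"Answer using only this context:\n{context}\n\nQuestion: {user_prompt}"

prompt = build_prompt("When is the Contoso renewal due?")
```

The same pattern scales up: swap the keyword match for embedding similarity and the list for a vector database, and the LLM call stays unchanged.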
The Benefits of Custom Copilots
Generative AI trained on a particular corpus of data will have knowledge of more specialized information, meaning the AI will be smarter about the domain and task at hand. With more filtering and plugins in the orchestration layer, the LLM can work with more relevant data and avoid faulty outputs. This could make it easier to build copilots around specific verticals such as finance, healthcare, education, or real estate.
In general, opening the copilot stack could enable the platform effect. A platform gives more power to independent software vendors (ISVs), whose combined efforts exceed the sum of their parts. As Microsoft co-founder Bill Gates puts it:
“A platform is when the economic value of everybody that uses it exceeds the value of the company that creates it. Then it’s a platform.”
Chatbots like ChatGPT and Bard have enraptured technologists, and creators increasingly expect AI-driven capabilities as a matter of course. Thus, the move to a platform model for delivering contextually aware chatbots makes sense as a way to port these services into more and more environments and domains.
AI Assisting Development at Large
Microsoft’s investment in generative AI is hardly unique; it fits into a larger trend of AI-assisted tooling coming to market. Other cloud service providers have been quick to integrate OpenAI GPTs into their offerings or to train their own foundational models.
For example, Google’s PaLM 2 is a next-generation language model currently being integrated into many Google products, like Google Workspace and Google Cloud. Google’s Bard also offers coding capabilities on par with ChatGPT. Amazon has similarly been developing its own foundational models (FMs), branded as Titan FMs, and offers Amazon Bedrock, a service that makes it easier to consume external FMs.
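As a sense of what "consuming an FM" through Bedrock looks like in practice, the sketch below builds a request body for a Titan text model. The model ID and body schema follow Bedrock's documented Titan text format but should be treated as assumptions here; the actual `boto3` call is shown commented out since it requires AWS credentials.

```python
import json

MODEL_ID = "amazon.titan-text-express-v1"  # assumed Titan model identifier

def build_titan_request(prompt: str, max_tokens: int = 256) -> str:
    """Serialize a Titan-style text-generation request body."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.5,
        },
    })

body = build_titan_request("Summarize our Q3 results in one sentence.")

# With AWS credentials configured, the invocation would look like:
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(modelId=MODEL_ID, body=body)
```

Because Bedrock exposes many FMs behind one API, swapping models is largely a matter of changing the model ID and request schema rather than rewriting the integration.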
All this advancement in generative AI will undoubtedly aid software development in many ways, from code generation to designing templates, debugging code snippets, clarifying code repositories, recommending modules, suggesting performance fixes, and more. This should lead to increased agility and productivity in the feature release process.
AI Grabs the Spotlight, for Better or Worse
The AI craze is sweeping much of the development landscape. For example, at the 2023 RSA Conference, over a dozen large companies and over 50 startups unveiled AI capabilities within their products. But in this new era of AI-washing, it’s not always possible to discern which products are legitimately supported by true AI.
Furthermore, with all the new LLMs coming to market, consumers are becoming spoiled for choice when choosing a foundational model. AI is rapidly progressing, and the fluctuations in the market are challenging to track. Fractured generative AI adoption within an enterprise could lead to shadow IT, security and privacy issues, and excessive software-as-a-service (SaaS) subscriptions if not governed.
Other potential troubles surround intellectual property rights. Most notably, there are ongoing legal complaints surrounding Copilot. Plaintiffs claim that the model violates the rights of the developers whose open-source code the model was trained upon, equating it to “software piracy on an unprecedented scale.”
With the continued advancement of Copilot integrations, privacy, security, and risk mitigation will be as important as ever. As AI-assisted development gains momentum, businesses must keep in mind both the benefits and possible complexities of AI adoption.