
02/04/2025
GitHub Copilot now supports multiple LLMs
GitHub is bringing more flexibility and choice to Copilot through the integration of multiple large language models (LLMs).
Since its inception, GitHub Copilot has utilised different LLMs for different use cases. The journey began with Codex, an early iteration of OpenAI’s GPT-3 that was fine-tuned specifically for coding tasks. The evolution continued with the launch of Copilot Chat in 2023, which initially used GPT-3.5 and subsequently transitioned to GPT-4. As demands evolved, GitHub adapted, employing models ranging from GPT-3.5 Turbo to the more recent GPT-4o and GPT-4o mini to balance latency and quality.
“The past year has witnessed a surge in high-quality small and large language models, each excelling in different programming tasks,” stated Thomas Dohmke, CEO of GitHub. “It’s clear that the future of AI code generation will be defined not only by multi-model functionality but also by multi-model choice.”