GitHub has rolled back a controversial Copilot feature after developers widely criticized unwanted additions to pull requests (PRs). The change allowed Copilot to automatically insert so-called tips into users' contributions; in practice, these amounted to promotional messages for third-party tools.
As reported by The Register, the issue came to light when a developer noticed that, after a simple fix by Copilot, a recommendation for another application suddenly appeared in their pull request. The addition was not written by the developer but was presented as if it were part of their contribution, causing confusion and irritation, not least because the content strongly resembled advertising.
According to users, this was not an isolated incident: within a short time, thousands of pull requests were found to contain similar messages, inserted automatically whenever Copilot was invoked at any point in the process. This raised questions about how much control developers retain over their own contributions and about the role of AI on collaboration platforms.
GitHub quickly acknowledged internally that the change had gone too far. The functionality had existed in a limited form for some time, but previously applied only to pull requests generated entirely by Copilot itself. The recent expansion, which allowed existing user contributions to be modified as well, proved in hindsight to be a miscalculation.
Microsoft Responds to Criticism
Microsoft, GitHub's parent company, also responded to the situation. Product managers indicated that the original goal was to familiarize developers with Copilot's capabilities. In practice, however, the behavior was perceived as unwanted interference, especially because the changes were made without clear consent or visibility.
The feature has since been disabled entirely. Copilot no longer adds automatic suggestions or similar content to pull requests, regardless of whether the PR was created or merely edited by the AI. According to GitHub, this ensures users will not be confronted with such surprises again.
The incident underscores how sensitive the use of AI in development environments can be. While tools like Copilot are meant to accelerate and support work, a lack of transparency or control can instead breed mistrust. For many developers, it is essential that their code and communication remain entirely under their own control.
By rolling back the feature, GitHub appears to be taking that message seriously. At the same time, the situation shows that experiments with AI functionality can quickly clash with expectations within the developer community.