VS Code inserting 'Co-Authored-by: Copilot' into commits regardless of usage
Microsoft's VS Code stirred up a hornet's nest by automatically appending 'Co-Authored-by: Copilot' to Git commits by default, often without the user's knowledge or Copilot's actual involvement. The aggressive AI promotion was widely seen as a serious breach of trust, compromising commit integrity and raising legal questions about software authorship. Rapid, widespread backlash forced a quick apology and a planned rollback, underscoring deep developer anxieties about corporate control and AI's creeping influence.
The Lowdown
A recent Pull Request (PR) for VS Code introduced a feature that appends 'Co-Authored-by: Copilot' to Git commit messages, and it controversially defaulted to 'on'. The decision, made without sufficient validation, quickly became a major point of contention within the Hacker News developer community.
- The feature was designed to automatically attribute code changes to Copilot, even when the AI tool was disabled or not actually used for the specific modifications.
- Users reported that the 'Co-Authored-by' line was often invisible during the commit staging process, making it impossible to review or remove before the commit was finalized.
- The PR was approved and merged, leading to widespread criticism across social media and developer forums, including Hacker News.
- A Microsoft engineer, Dmitriy Vasyura, publicly apologized on Hacker News for approving the PR, acknowledging the error and promising to revert the default behavior to 'off' in an upcoming update.
- The incident sparked a broader debate about corporate ethics, the integrity of development tools, and the potential legal ramifications of AI 'co-authorship' on intellectual property.
This misstep highlighted the tension between technological advancement and user autonomy, leading to a significant loss of trust among many developers regarding Microsoft's stewardship of VS Code and its aggressive push for AI integration.
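For context, 'Co-authored-by' is a plain-text trailer at the end of a commit message body that GitHub recognizes for attributing co-authors; once the commit is created, the trailer is baked into the commit object itself. A minimal sketch of how the injected line ends up in history, using a throwaway repo and a placeholder email (the exact email VS Code used is not specified here):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
# A second -m paragraph carries the trailer, which is effectively what the
# VS Code feature did on the user's behalf (email is illustrative only):
git -c user.name=dev -c user.email=dev@example.com commit -q --allow-empty \
  -m "Fix parser edge case" \
  -m "Co-authored-by: Copilot <copilot@example.invalid>"
# The trailer is now part of the permanent commit body:
git log -1 --format=%B
```

Because the trailer lives in the commit object, removing it after the fact means rewriting history (e.g. `git commit --amend` for the most recent commit), which is exactly why users wanted it visible and reviewable before the commit was finalized.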
The Gossip
Corporate Credibility Concerns
Many commenters viewed this incident as Microsoft reverting to old, 'embrace, extend, and extinguish' tactics, undermining years of reputation building. There's a strong sentiment that the company prioritizes AI adoption metrics and marketing over user experience and ethical standards, leading to a general feeling of betrayal and a call to switch to alternative tools.
Copyright Conundrums
A significant debate centered on the legal implications of the 'Co-Authored-by: Copilot' tag. Commenters worried that the attribution could jeopardize copyright ownership, since purely AI-generated works are generally not copyrightable, or could let Microsoft claim a stake in user code. While some dismissed it as a mere marketing tactic, others feared it was a strategic move to 'launder' copyright or establish legal precedents for future AI IP claims, especially given the tag's indiscriminate insertion.
Developer Disillusionment
Developers expressed deep frustration over the default-on, hidden nature of the feature, calling it an 'invasion' of their commit logs, which are considered crucial historical and legal records. The lack of transparency and perceived disrespect for developer autonomy led many to declare their intention to switch from VS Code to alternatives like VSCodium or Zed, citing a fundamental breach of trust with a once-favored development tool.
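One defensive measure that stays under the developer's own control is a client-side `commit-msg` hook that strips a tool-injected trailer before the commit is finalized. This is a generic Git mechanism, not an official workaround from Microsoft; a sketch in a throwaway repo:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
# Install a commit-msg hook; Git passes the path of the pending commit
# message file as $1, and the hook may edit it in place.
cat > .git/hooks/commit-msg <<'EOF'
#!/bin/sh
# Drop any line that is a Copilot co-author trailer, keep everything else.
grep -vi '^Co-authored-by: Copilot' "$1" > "$1.tmp" && mv "$1.tmp" "$1"
EOF
chmod +x .git/hooks/commit-msg
# Simulate a tool injecting the trailer at commit time:
git -c user.name=dev -c user.email=dev@example.com commit -q --allow-empty \
  -m "Refactor config loading" \
  -m "Co-authored-by: Copilot <copilot@example.invalid>"
git log -1 --format=%B
```

Hooks are per-clone and not versioned by default, so each developer would have to install this themselves; it filters only the locally created commits, not ones rewritten or created by other tooling that bypasses hooks.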
Apology and Accountability Absurdities
The public apology by Dmitriy Vasyura, the Microsoft engineer who approved the PR, was met with mixed reactions. While some appreciated his directness, many questioned how such a 'mistake' could have passed through internal review processes, suggesting deeper organizational dysfunction or a 'vibe-coding' culture. Commenters also noted the locking of the original GitHub PR discussion as a heavy-handed response to legitimate criticism, further eroding trust.