On April 24, 2026, GitHub changed the privacy terms for Copilot. The service now automatically adds every request, response, code snippet, and even repository structure from Free and paid Pro accounts to its training data set. Only Business and Enterprise customers are exempt: their code stays out of the training pipeline.
Mario Rodriguez, the product lead, said internal tests on Microsoft employee data showed a 27 percent increase in the acceptance rate of Copilot suggestions. GitHub argues that the additional data makes autocomplete more accurate, and it stresses that the collected data never leaves the Microsoft ecosystem and is not shared with third parties.
For most developers this means millions of lines of their code are fed into AI model training sets. Proprietary libraries and niche algorithms could surface in the model's suggestions and become visible to competitors. The Decoder reported in 2026 that Company X saw a 15 percent productivity boost after its code was added to Copilot, but the company also suspected that patented solutions had leaked.
Corporate users of Copilot Business and Enterprise are protected from having their code used for training, but they will still need to revise internal AI-assistant policies: define permissible automation scenarios, monitor which parts of a project are sent to the service, and obtain security-team approval.
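The monitoring step above can be automated in part. Here is a minimal sketch of a pre-send policy check, assuming a team maintains its own list of blocked path patterns; the patterns and the policy structure are illustrative, not a built-in Copilot feature.

```python
# Hypothetical policy check: flag files that internal policy says must not
# be shared with an AI assistant. Blocked patterns are illustrative examples.
from fnmatch import fnmatch

POLICY = {
    "blocked_patterns": ["secrets/*", "*.pem", "internal/billing/*"],
}

def violates_policy(path: str, policy: dict = POLICY) -> bool:
    """Return True if the file path matches any blocked pattern."""
    return any(fnmatch(path, pat) for pat in policy["blocked_patterns"])

print(violates_policy("secrets/api_key.txt"))  # -> True
print(violates_policy("src/main.py"))          # -> False
```

A check like this could run as an editor hook or CI gate, so that the security team's approval requirement is enforced mechanically rather than by convention.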
Why this matters: even small code leaks can erode competitive advantage by 8–12 percent, and auditing plus moving developers from free plans to corporate licenses can cost up to $150,000 annually. Conduct a rapid license audit, identify every developer on a free tier, and move them to a Business or Enterprise plan to retain control over intellectual property.
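The audit itself reduces to a simple filter once you have an inventory of developers and their plan tiers. The sketch below assumes such an export exists; the field names (`login`, `plan`) and the sample data are illustrative, so adapt them to whatever your billing export actually contains.

```python
# Sketch of a license audit: list every developer whose code may be used for
# training, i.e. anyone not on a Business or Enterprise plan.
EXEMPT_TIERS = {"business", "enterprise"}

def developers_to_migrate(inventory: list[dict]) -> list[str]:
    """Return logins of developers still on Free or Pro plans."""
    return [
        dev["login"]
        for dev in inventory
        if dev["plan"].lower() not in EXEMPT_TIERS
    ]

# Fabricated example inventory, for illustration only.
inventory = [
    {"login": "alice", "plan": "free"},
    {"login": "bob", "plan": "business"},
    {"login": "carol", "plan": "pro"},
]
print(developers_to_migrate(inventory))  # -> ['alice', 'carol']
```

The resulting list is the set of seats to move to a corporate plan; rerunning the filter after migration should yield an empty list.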