GitHub Copilot, Amazon Code Whisperer sometimes emit other people's API keys
The Insider News
AI dev assistants can be convinced to spill secrets learned during training
That's convenience!
GitHub's response *really* helps me get the warm and fuzzies:
Quote:
"Because the model powering GitHub Copilot was trained on publicly available code, its training set may contain insecure coding patterns, bugs, or references to outdated APIs or idioms," a GitHub spokesperson said in a statement to The Register after publication. "In some cases, the model may suggest what appears to be personal data, but those suggestions are fictitious information synthesized from patterns in training data. When GitHub Copilot synthesizes code suggestions based on this data, it can also synthesize code that contains these undesirable patterns."