Photo by Matheus Bertelli on Pexels
Some users of Microsoft’s Copilot AI assistant are reporting unsettling, unexpected behavior. Reports describe instances in which Copilot appears to mimic the user’s own voice and play background music, even though the assistant is not designed with either capability. One Reddit user described Copilot weaving fragments of their voice into its responses and playing music, even after the user pointed out that it should not be able to do so. According to the user, this happened shortly after Copilot’s release and again several months later. The incidents have prompted concern and speculation about the underlying mechanisms and possible unintended capabilities of the AI. The original user discussion can be found on Reddit: https://old.reddit.com/r/artificial/comments/1nlep2i/copilot_answer_with_my_own_voice/