AI, yi, yi.
With artificial intelligence becoming omnipresent in every sector of life, privacy has become a growing concern among users, who wonder where the details they share with the machines wind up.
One woman, who recently used ChatGPT to make a grocery list, was shocked to see the bot get its wires crossed — delivering a message she thinks she wasn’t meant to see.
“I’m having a really very scary and concerning moment with ChatGPT right now,” the TikTok user Liz, who goes by @wishmeluckliz, confessed in a viral video detailing the eerie-sounding episode.
Liz claimed that “somebody else’s conversation” infiltrated her thread — and that the trendy tool itself told her as much, though skeptics believe it could be a creepy coincidence.
The Post has reached out to ChatGPT’s parent company OpenAI for comment.
According to the clip, the cybernetic eavesdropping occurred while the content creator was using the AI’s voice mode — wherein users can converse with the service — to help facilitate food shopping.
However, after rattling off her list of needs, Liz forgot to turn the recorder off and left it running even though she remained silent for a “long time” afterward, per the clip.
Despite the lack of input, ChatGPT replied with a seemingly unrelated message that was so jarring that Liz had to double-check the transcription to make sure she wasn’t imagining it.
The message read, per a screenshot: “Hello, Lindsey and Robert, it seems like you’re introducing a presentation or a symposium. Is there something specific you’d like assistance with regarding the content or perhaps help with structuring your talk or slides? Let me know how I can assist.”
Liz found the reply bizarre given that she “never said anything leading up to this.”
After retracing the transcript, she realized that the bot had somehow transcribed her as saying she was a woman named Lindsey May, a self-described vice president of Google, who was giving a symposium with a man named Robert.
Confused, she broached the issue with ChatGPT in voice mode, saying, “I was just randomly sitting here planning groceries, and you asked if Lindsey and Robert needed help with their symposium. I’m not Lindsey and Robert. Am I getting my wires crossed with another account right now?”
The bot replied, “It seems like I mistakenly mixed up the context from a different conversation or account. You’re not Lindsey and Robert and that message was meant for someone else.”
“Thanks for pointing that out and I apologize for the confusion,” it added, seemingly confessing to leaking someone else’s private information.
Shaken by the apparent admission, Liz said that she hoped she was “overreacting and that there’s a simple explanation for this.”
While some TikTok viewers shared her concern about a potential privacy breach, techsperts believe the bot could’ve been hallucinating based on patterns in its training data, which is drawn in part from user input.
“This is spooky — but not unheard of,” assured one AI expert and programmer. “When you leave voice mode on but don’t speak, the model will attempt to extract language from the audio — in the absence of spoken word it will hallucinate.”
They added, “It also isn’t crossing wires, but is oriented towards hallucinating in agreement, so you suggested that wires got crossed and it agreed with you in an attempt to successfully ‘answer your query.’”
On Reddit, AI aficionados cited multiple instances where the bot would reply unprompted. “Why does it keep transcribing ‘Thank you for watching!’ when I use voice recorder but am not saying anything?” asked one.
While seemingly harmless in these cases, hallucinating AI chatbots can offer dangerous disinformation to humans.
Google’s AI Overviews, designed to give quick answers to search queries, has been guilty of multiple slips of the technological tongue, including one instance where it advised adding glue to pizza sauce to help cheese stick better.
In another case, the AI bot billed a fake phrase — “You can’t lick a badger twice” — as a legitimate idiom.