Does an apple a day keep the AI away?

Creators are calling bull on alleged deepfake doctors scamming social media users with unfounded medical advice.

On TikTok, one search yields dozens of videos of women rattling off phrases like, “13 years as a coochie doctor and nobody believes me when I tell them this,” before dishing so-called health secrets for perky breasts, snatched stomachs, chiseled jawlines and balanced pH levels.

But the so-called experts aren’t even real. They’re entirely generated by artificial intelligence.

Some of the so-called “doctors” claim expertise in other fields — diet, plastic surgery, breasts, butts, stomachs and more — and offer advice to cure or remedy viewers’ ailments and health concerns.

One account has posted dozens of clips featuring the same woman, who claimed to have spent 13 years as a “coochie” and “butt” doctor. A different account features the exact same woman also spewing unfounded medical advice under the guise of being an alleged “coochie doctor.”

Media Matters reported that the same gaggle of alleged deepfake characters has also appeared as salespeople for wellness products or claimed connections to Hollywood to dish insider gossip.

The discrepancies are enough to raise a few eyebrows.

Javon Ford, the creator of his namesake beauty brand, recently revealed that the AI-generated personalities can be manipulated on an app called Captions, which bills itself as a tool to generate and edit talking AI videos. The company claims that it has 100,000 daily users of the app, with over 3 million videos produced every month.

But Ford called the service “deeply insidious.”

“You might have noticed a few of these ‘creators’ on your ‘For You’ page. None of them are real,” he warned.

In a TikTok video, he scrolled through an exhaustive list of AI avatars that users can choose from — such as a woman named “Violet,” who can be seen in many of the “coochie doctor” clips — demonstrating how a script can be written and the avatar will regurgitate it.

Aghast users called the technology “very dangerous,” while some weighed the option of ditching social media altogether due to the “scary” reality of realistic deepfakes.

“I’ve seen Violet so many times,” one shocked viewer commented, while another agreed that they’ve seen her “say she’s a dentist and a nurse.”

“So that’s actually scary! Now that you point it out, I can see through it, but w/o the warning, I may have fallen for it!” someone else admitted.

As deepfakes proliferate online, creators have been trying to educate viewers by highlighting ways to determine whether the person on your screen is real or AI-generated.

Ford, for one, called out the “mouth movements,” describing them as “uncanny.” He noticed that the lips did not sync with the audio, which he said is the “first red flag.”

“It’s 2025,” he said in a TikTok. “Nobody should be having audio-video lag issues.”

He added that their claims — that a product or natural remedy works better than whatever is commonly used — should also raise alarms.

Ford also advised looking at the account owner’s profile to see how many videos feature the so-called “doctor,” who has somehow been a gynecologist, proctologist and more over a mere 13 years.

“My, my, they’ve had a productive career,” he joked.

One user named Caleb Kruse, a paid-media expert, pointed out the telltale signs of an AI avatar in a previous TikTok video, using another creator’s content as an example. The woman later confirmed that, while she is, in fact, a real person, the video in question was created with AI by a company that had asked to clone her likeness.

In addition to the unrealistic mouth movements, Kruse highlighted the woman’s eyes, awkward head movements and the overall vibe of the video, a nagging feeling that “it’s not real.”

“The eyes are too big when they shouldn’t be — they’re not always reflecting exactly how a normal person might react when they say things,” he explained.

“Third, is the cadence, how she speaks, how the words between sentences flow,” he continued. “There’s sometimes these weird pauses that you wouldn’t normally say.”

The callouts were a wake-up call for his followers.

“This should be illegal,” one dismayed viewer commented.

“It looks so real it’s horrifying,” another chimed in.
