Experts sound alarm over rising use of 'deepfake' doctors in social media scams
Scammers are increasingly using AI-generated "deepfake" videos of trusted doctors to promote dangerous, unproven treatments on social media.
These videos, appearing on platforms like Facebook and Instagram, feature fabricated endorsements from well-known doctors to advertise dubious "natural" cures for serious health conditions, such as diabetes. Some even falsely claim that life-saving medications like metformin could be fatal.
AI expert Henry Ajder noted that the use of doctor deepfakes "really took off this year."
These videos often target older audiences and manipulate the likeness of popular TV doctors, such as British presenter Michael Mosley, who died earlier this year, to lend credibility to the fake treatments. "People do seem to trust these videos," said British doctor John Cormack, noting that the trust these media doctors have built makes the false claims seem believable.
French TV doctor Michel Cymes has taken legal action against Meta, the parent company of Facebook, over scams using his image.
British doctor Hilary Jones, a familiar face on UK television, has also been targeted, with one deepfake falsely showing him endorsing a sham cure for high blood pressure and promoting cannabis gummies. Despite efforts to remove these videos, Jones lamented, "Even if they're taken down, they just pop up the next day under a different name."
AI advancements have made deepfakes more realistic and difficult to detect, according to French academic Frédéric Jurie. He explained that new AI algorithms can analyze and regenerate images, making the deepfakes increasingly convincing. While detection tools are being developed, Jurie warns that the fight against deepfakes is a "game of cat and mouse."
Even controversial figures in the medical world, such as French researcher Didier Raoult and Australian naturopath Barbara O'Neill, have been targeted.
Videos have falsely depicted O'Neill selling pills that "clean blood vessels," and some even claim she died from using a miracle oil sold online. O'Neill's husband condemned the misuse of her name for scams, emphasizing that she does not endorse the products being promoted in the videos.
Experts are not optimistic that AI detection tools alone can keep pace with the surge in deepfakes. Instead, Jurie suggests technology that can prove content has not been altered, such as digital signatures that certify the integrity of a message.
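To illustrate the kind of mechanism Jurie describes, here is a minimal sketch, not drawn from the article, of digital-signature verification using the Python "cryptography" package and a hypothetical publisher's Ed25519 key pair: the publisher signs the original file, and any later alteration causes verification to fail.

    # Illustrative sketch only: signing content and detecting tampering
    # with the third-party "cryptography" package (Ed25519 keys).
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # The publisher (e.g. a broadcaster) holds the private key and
    # distributes only the public key to viewers and platforms.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    video_bytes = b"original video file contents"   # placeholder content
    signature = private_key.sign(video_bytes)        # shipped alongside the video

    # Verification succeeds only if the bytes are exactly what was signed.
    tampered = video_bytes + b" edited by a scammer"
    for candidate in (video_bytes, tampered):
        try:
            public_key.verify(signature, candidate)
            print("Signature valid: content unchanged")
        except InvalidSignature:
            print("Signature invalid: content altered or not from the claimed source")

In such a scheme, a platform or viewer checking a video attributed to a doctor could reject any copy whose signature does not verify, regardless of how convincing the footage looks.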