AI-generated hoax: Fabricated death of Canadian actor Saint Von Colucci


It became widely known last week that Saint Von Colucci, a 22-year-old Canadian-Portuguese actor who worked in South Korea’s entertainment industry and had undergone surgery to look like BTS’ Jimin, had died of complications from plastic surgery. If an Al Jazeera fact check is to be trusted, the story was false: the actor never existed. He was nothing more than a figment of someone’s imagination, and, according to reports, AI was used to bring the character to life.

According to Al Jazeera, this is one of the first recorded incidents of AI being used to deceive individuals and the media in order to spread misinformation, signaling the advent of a new era of AI-generated fake news. The outlet says newly uncovered evidence shows the story was an elaborate fabrication.

How did it all begin?

It all started when HYPE Public Relations distributed a press release announcing Von Colucci’s death to journalists around the world. The poorly drafted release was riddled with red flags, but dozens of media outlets ignored them.

The story was first reported by the Daily Mail Online and was rapidly picked up by outlets worldwide. On Wednesday, the publisher removed its piece without posting a retraction notice on its website.

How did it become known?

According to Al Jazeera, numerous web links in the press release were broken, and Von Colucci’s alleged Instagram account was inactive. The hospital named in the release was likewise unreachable.

As for the public relations firm that sparked the entire saga, HYPE’s website appears to have been registered only a few weeks before Von Colucci’s reported death and is still incomplete. Calls by Al Jazeera to the phone numbers listed on the site went unanswered, according to the report.

Aside from the press release, there is little to no proof that Von Colucci ever existed. He has no online presence, no family members have come forward, and the images available online are pixelated and do not appear to be real.

The incident has drawn attention to deepfakes and other AI tools being used to spread misinformation. It has demonstrated how convincing AI-generated images can be, and the risks they pose.
