Future Tech

People ‘over-confident’ they can spot deepfake videos, survey finds

Tan KW
Publish date: Sun, 11 Jun 2023, 04:07 PM

SAN FRANCISCO: Joe Biden in drag, drinking Bud Light; Donald Trump as a shady lawyer in Breaking Bad. The two likeliest contenders in the 2024 US presidential election have both been the subject of recent deepfake hoax videos.

Boosted by “generative” AI tools such as ChatGPT, deepfakes “have reached a level of sophistication that prevents detection by the naked eye”, according to the publishers of a recent multi-nation survey, which found people to be too sure of their own ability to spot a fake.

The 2023 Online Identity study, published at the end of May and conducted by Censuswide for Jumio, a California-based online safety business, canvassed more than 8,000 people across Britain, Mexico, Singapore and the US about deepfakes.

Around two-thirds of respondents claimed awareness of the technology, though that figure ranged from just 56% in Britain to almost 90% in Singapore. Over half said they were confident they could tell the difference between an authentic clip and a mock-up.

Perhaps they should not be so sure of themselves. “Deepfakes are getting exponentially better all the time and are becoming increasingly difficult to detect without the aid of AI,” said Stuart Wells, Jumio’s chief technology officer.

And while hoaxes featuring public figures are more likely to be quickly refuted, the same may not always be the case for low-profile scams targeting personal finances or identity.

Fewer than half of those surveyed in Britain and the US were aware that AI could pose a threat when it comes to identity theft and related money-grabbing ruses.

The survey team described those numbers as “concerning”, pointing to data from the financial industry association UK Finance showing that impersonation scams cost £177mil in 2022. In the US, consumers lost US$2.6bil the same year, according to the Federal Trade Commission.

 - dpa
