Future Tech

Microsoft pushes US lawmakers to crack down on deepfakes

Tan KW
Publish date: Tue, 30 Jul 2024, 07:13 PM

Microsoft Corp is calling on Congress to pass a comprehensive law to crack down on images and audio created with artificial intelligence - known as deepfakes - that are used to interfere in elections or maliciously target individuals.

Noting that the tech sector and nonprofit groups have taken steps to address the problem, Microsoft president Brad Smith said on Tuesday, “It has become apparent that our laws will also need to evolve to combat deepfake fraud”. He urged lawmakers to pass a “deepfake fraud statute to prevent cybercriminals from using this technology to steal from everyday Americans”.

The company is also pushing for Congress to require that AI-generated content be labelled as synthetic, and for federal and state laws that penalise the creation and distribution of sexually exploitative deepfakes.

The goal, Smith said, is to safeguard elections, thwart scams and protect women and children from online abuse. Congress is currently mulling several proposed bills that would regulate the distribution of deepfakes.

“Civil society plays an important role in ensuring that both government regulation and voluntary industry action uphold fundamental human rights, including freedom of expression and privacy,” Smith said in a statement. “By fostering transparency and accountability, we can build public trust and confidence in AI technologies.”

Manipulated audio and video have already stirred controversy in this year’s US presidential campaign.

In one recent instance, Elon Musk, owner of the social media platform X, shared an altered campaign video that appeared to show Democratic presidential candidate Vice-President Kamala Harris criticising President Joe Biden and her own abilities. Musk did not disclose that the video had been digitally manipulated, later suggesting it was intended as satire.

 


  - Bloomberg

 
