Microsoft deepfake software combats election propaganda
Ahead of the 2020 United States presidential election, Microsoft released Video Authenticator, an AI-powered tool to help identify deepfake images and videos.
Microsoft unveiled new AI-powered deepfake detection software to combat manipulated images and videos, as experts worry deepfakes could influence the 2020 United States presidential election.
Video Authenticator can help alert users to images and videos that have been manipulated to spread false information.
Automated deepfake detection software
Initially developed by Microsoft Research in collaboration with Microsoft's Responsible AI team and the Microsoft AI, Ethics, and Effects in Engineering and Research Committee, the tool can automatically analyze videos and photos to provide a confidence score that the media has been manipulated.
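The per-frame confidence scoring described above can be sketched as follows. This is a minimal illustration only: the aggregation rule, function names, and scores are assumptions, not Microsoft's actual model or API.

```python
# Hypothetical sketch of video-level scoring from per-frame manipulation
# confidences, as the article describes. The scores below stand in for
# the output of a detector; this is NOT Microsoft's implementation.

def aggregate_confidence(frame_scores):
    """Combine per-frame manipulation scores into one video-level score.

    Uses the maximum frame score, on the assumption that a single
    convincingly manipulated frame is enough to flag the whole video.
    """
    if not frame_scores:
        raise ValueError("no frames scored")
    return max(frame_scores)

# Example: three frames scored by a (hypothetical) frame-level detector.
scores = [0.12, 0.88, 0.34]
video_score = aggregate_confidence(scores)
print(f"manipulation confidence: {video_score:.0%}")  # prints "manipulation confidence: 88%"
```

Taking the maximum rather than the mean is a design choice: averaging would let many clean frames dilute one clearly doctored segment.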
A Microsoft blog post noted that the tech giant trained Video Authenticator on FaceForensics++, an open-source dataset composed of 1,000 video sequences manipulated by four automated techniques. Microsoft tested the tool on the DeepFake Detection Challenge Dataset, which consists of more than 100,000 videos created by Facebook AI in collaboration with other technology vendors. The DeepFake Detection Challenge aimed to spur researchers to develop algorithms that detect deepfake images and videos.
The winning algorithm of the challenge, which ended in June, had an accuracy level of around 65%.
Yet, as Forrester analyst Brandon Purcell pointed out, Microsoft hasn't released any metrics about the efficacy of its new deepfake detection tool, so it's unclear how well it works.
However, if Microsoft's tool performs at a similar level of accuracy, "we can expect many deepfakes to slip through the cracks... and many authentic pieces of content to be falsely labeled as fake," Purcell said.
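Purcell's concern can be made concrete with a back-of-envelope calculation. The volumes below and the assumption of symmetric error rates are illustrative only; Microsoft has published no accuracy figures for Video Authenticator.

```python
# Illustration of what ~65% accuracy could mean in practice.
# Assumes symmetric error rates on fake and authentic media; real
# detectors typically trade false negatives against false positives.

accuracy = 0.65
n_deepfakes = 1000   # hypothetical deepfake videos checked
n_authentic = 1000   # hypothetical authentic videos checked

missed_fakes = round(n_deepfakes * (1 - accuracy))   # slip through undetected
false_alarms = round(n_authentic * (1 - accuracy))   # authentic, flagged as fake

print(missed_fakes, false_alarms)  # prints "350 350"
```

Even at these hypothetical volumes, hundreds of fakes would pass and hundreds of genuine videos would be wrongly flagged, which is the failure mode Purcell describes.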
"For the public to have confidence in this tool, Microsoft will have to reveal more of these details about the model itself," he added.
Microsoft declined to provide data about the accuracy of Video Authenticator. The company also declined to say how the tool can be accessed or what it costs.
Disinformation and politics
Still, the tool could play an important role in the upcoming presidential election. Over the last few years, domestic and foreign political actors have increasingly created deepfakes, as well as manipulated news articles and videos, to sway voters.
Notably, President Donald Trump has frequently shared false and manipulated media on Twitter, prompting the social media platform to flag some of his tweets.
Currently, the biggest use of deepfakes is to create nonconsensual pornography, which is damaging to the people involved but likely won't swing elections, Purcell said.
"That being said, in the political arena, even a deepfake of poor quality can spread quickly if it confirms the biases of one side during these incredibly polarized times," he continued.
Deepfake detection technology can help weed out disinformation and help keep elections fair. Whether Microsoft's Video Authenticator will play a role remains to be seen.