Soft Biometrics Could be Key to Detecting Deepfakes
One of the biggest concerns heading into the 2020 election is the use of false videos, known as deepfakes, to spread misinformation online. Deepfake videos, which are often generated with artificial intelligence, can be a powerful force because they make it appear that a political candidate said or did something they never actually did, often altering how viewers see the politician.
But a team of researchers at the University of California, Berkeley, believes it has developed technology to help detect deepfakes and prevent their harmful spread. The researchers say they have a computer algorithm that can quickly detect a deepfake by analyzing "soft biometrics" — the distinctive verbal and facial tics a person exhibits when speaking. For example, lead researcher Hany Farid explained that Sen. Elizabeth Warren has "a habit of moving her head left and right quite a bit" when she speaks.
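The article does not describe the Berkeley algorithm's internals, but the general idea — comparing a suspect video's mannerism features against a profile learned from authentic footage — can be sketched as follows. Everything here is illustrative: the feature names, the vectors, and the 0.9 similarity threshold are assumptions, not details from the research.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_consistent(reference_profile, video_features, threshold=0.9):
    """Flag a video as consistent with a speaker's known mannerisms
    when its feature vector is similar enough to the reference profile.
    The threshold is an illustrative placeholder, not a published value."""
    return cosine_similarity(reference_profile, video_features) >= threshold

# Hypothetical soft-biometric features extracted per video, e.g.
# [head-yaw amplitude, blink rate, brow-raise frequency, mouth asymmetry]
reference = [0.82, 0.15, 0.64, 0.30]  # profile built from authentic footage
authentic = [0.80, 0.17, 0.60, 0.33]  # new genuine clip: close to profile
deepfake  = [0.10, 0.70, 0.05, 0.90]  # fabricated clip: mannerisms differ
```

In practice the hard part is the feature extraction — tracking head pose and facial action units across hours of video — while the final comparison step can remain as simple as a vector-similarity test like this one.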
The research team fed hours of video into their system to generate the computer algorithm. Farid says the work is essential leading up to the next election. "What we are concerned about is a video of a candidate, 24 to 48 hours before election night, going viral and disrupting an election. You can steal an election," said Farid.