Fake news? Pah! Deepfakes are truly terrifying

June 17, 2019


I'm deeply confused. What are you talking about?  

Deepfakes are hyper-realistic videos that are actually fake. The "deep" bit of the name comes from deep learning (a type of artificial intelligence), and we all know about fakes - the adjective du jour. They have the potential to spark "violent" social unrest in today's political climate. Mostly they are used by creeps to manipulate a celeb's image so they appear to be saying something they haven't said - or doing something they definitely wouldn't do on film.


Srsly! Like what? 

Like having sex - but I'll come back to that. The latest deepfake was posted on Instagram by two British artists for an art exhibition, using AI developed by the Israeli company Canny AI. It had Facebook's Mark Zuckerberg saying “whoever controls the data, controls the future”, just days before the US House Intelligence Committee's hearing on deepfake technology. Capitol Hill is freaking out that voters won't be able to "trust their own eyes or ears" when assessing what they see on their screens.


That ship may already have sailed. But wait - Facebook owns Instagram, and wasn't there a deepfake of Nancy Pelosi sounding drunk that Facebook refused to take down?

Indeed. And another damaging one of Pelosi (Speaker of the US House of Representatives) appeared last week - see here - which Trump immediately thought was real and posted on Twitter. The news channels are also up in arms - this tech can be used to discredit the authenticity of real journalists and activists. Witness, a human rights organisation, says: “As companies release products that enable creation, they should release products that enable detection as well”.


Getting these deepfakes off the internet is surely impossible?  

True. A US law, Section 230 of the Communications Decency Act, protects websites from being sued for what their users upload - so injured parties need to go after the individuals. Taking anything down "could be a long hard struggle" according to Fortune magazine's website - a "whack-a-mole" game, as they put it. Basically, the legal response to this kind of abuse has not yet been developed.


So how abusive can this kind of deepfake AI be? 

Very. A deep-learning neural network can simply swap an adult film star's face for a celeb's - Scarlett Johansson, Taylor Swift and Katy Perry have all been victims. Here is Star Wars actress Daisy Ridley's face on someone else's body. Pretty real, right? The opportunities for revenge, humiliation or political defamation are endless. But it's also being used in comedy - watch Bill Hader scarily and imperceptibly become Arnold Schwarzenegger. It took "Tom" from the Czech Republic a couple of days to create this using free, open-source software called DeepFaceLab.
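
(For the technically curious: tools like DeepFaceLab are usually described as training an autoencoder with one shared encoder and a separate decoder per face, then swapping decoders at playback time. The toy sketch below - a hypothetical PyTorch illustration with made-up layer sizes and names, not DeepFaceLab's actual code - shows just that swap.)

```python
# Minimal sketch of the shared-encoder / two-decoder face-swap idea.
# Hypothetical illustration only; sizes, names and training details are assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuilds a 64x64 face from the latent vector; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),    # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),     # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),   # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder   = Encoder()   # shared: learns pose and expression
decoder_a = Decoder()   # trained only on faces of person A
decoder_b = Decoder()   # trained only on faces of person B

# Training (not shown) reconstructs A's faces with decoder_a and B's with decoder_b.
# The "swap": encode a frame of person A, but decode it with B's decoder,
# so B's face appears with A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)   # stand-in for a real video frame
swapped = decoder_b(encoder(frame_of_a))
print(swapped.shape)                     # torch.Size([1, 3, 64, 64])
```

Run frame by frame over a whole clip and blend the output back into the original footage, and you have the basic recipe behind the Daisy Ridley and Bill Hader videos above.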


Who the heck started this? 

A Stanford graduate, obvs, called Ian Goodfellow (who looks kind of creepy himself) and who in 2014 invented a machine learning technique called a “generative adversarial network”, or GAN, by which deepfakes are made. It's a gift for purveyors of fake news. Danielle Citron, a professor of law at the University of Maryland, gave this scenario to the House Intelligence Committee on Thursday: "The night before a public offering... a deepfake showing a company's CEO... could upend the IPO, the market will respond and fall far faster than we can debunk it," she said.
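
(To make "generative adversarial network" a little less abstract: a GAN pits two networks against each other - a generator that fabricates samples and a discriminator that tries to spot the forgeries. The toy PyTorch sketch below, in which the generator merely learns to mimic a one-dimensional bell curve rather than a face, is a hypothetical illustration with made-up sizes and names, not anything from a real deepfake pipeline.)

```python
# Toy GAN: the generator learns to mimic a 1-D Gaussian; the discriminator learns to spot fakes.
# A minimal, hypothetical illustration of the adversarial idea only.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # generator: noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # discriminator: sample -> "real?"

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.5 + 4.0   # "real" data: bell curve around 4
    fake = G(torch.randn(64, 8))             # the generator's forgeries

    # 1) Train the discriminator to call real "1" and fake "0".
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    loss_d.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator into saying "1".
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()

with torch.no_grad():
    samples = G(torch.randn(1000, 8))
print(f"generated mean ~ {samples.mean().item():.2f}, std ~ {samples.std().item():.2f}")  # drifts toward 4 and 1.5
```

After a couple of thousand steps the forgeries start matching the real data's mean and spread - scale the same contest up to millions of face images and the forgeries become deepfakes.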



That sounds pretty tame compared with deepfakes taking hold of the 2020 US presidential campaign. Or any election, anywhere.

"We are outgunned," admitted Hany Farid, a digital-forensics expert from the University of California last week. "The number of people working on the video-synthesis side, as opposed to the detector side, is 100 to 1." Last year in central Africa, a deepfake video of Gabon's long-unseen President Ali Bongo sparked a coup. And former president Barack Obama told a Canadian audience last month: "People can duplicate me speaking and saying anything... and it's a complete fabrication". But many of these deepfake attacks target women, as this devastating story of journalist Rana Ayyub reveals.  


Come. On. We need that deepfake detector software pronto. Though probably not for David Beckham. Watch him here speaking nine languages.
As if.  


PS

Facebook, without which deepfakes would be worthless, was not scheduled to send anybody to participate in the Intelligence Committee's hearings. Probably too busy peddling their own truths.  
 

 


