While going through pictures of Deepika Padukone on a website, after scrolling down a little too far, I was redirected to a link, ‘www.adultdeepfakes.com’. Intrigued by what it was all about, I was horrified to come across pornographic videos with the actress’s face morphed onto other bodies. It was utterly shocking how realistic the videos seemed, right down to the voice modulation incorporated into them. A closer look revealed that not only celebrities but influential people like politicians have also fallen prey to this.
Come to think of it: you’ve been active online for years, with thousands of followers, and one day you see a video of yourself doing the rounds in which you are making statements or engaging in acts you never did. The video looks real, the voice modulation is exactly how you’d sound when talking, and in the end people believe it simply because it has been created so aptly. The possibilities of today’s technology are eye-popping and endless.
The term ‘deepfake’ is a blend of ‘deep learning’, one of the most fascinating branches of machine learning, and ‘fake’. It was first derived from the malicious use of FakeApp, a face-swapping application that uses Artificial Intelligence (AI) to swap one person’s face with another’s, making it look like they are doing or saying something they never did.
Presently, similar features are built into Snapchat and even into the video effects offered by FB Messenger. The potential for manipulation that a deepfake carries is vast, and the ability of social media to make deepfakes go viral only compounds the issue. The necessary hardware and software are cheap, easily accessible and, surprisingly, capable of producing a deepfake video in barely half a day.
It exploits the very human tendency to believe that seeing is believing; the old adage certainly holds. Deepfakes are built with Generative Adversarial Networks (GANs), in which two machine learning (ML) models are pitted against each other: one, the forger (the generator), creates fakes, while the other (the discriminator) tries to detect them. The forger keeps creating better fakes until the other ML model can no longer detect the forgery.
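The forger-versus-detector loop above can be sketched in a few lines. This is a hypothetical toy, not a real deepfake pipeline: the “forger” is a single parameter that shifts random noise toward the real data, and the “detector” is a tiny logistic-regression classifier; all names and numbers here are illustrative assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

REAL_MEAN = 4.0   # the "real" data: samples from N(4, 1)
theta = 0.0       # forger (generator) parameter: fake = noise + theta
w, b = 0.0, 0.0   # detector (discriminator): D(x) = sigmoid(w*x + b)
lr = 0.05
batch = 64

for step in range(2000):
    real = [random.gauss(REAL_MEAN, 1.0) for _ in range(batch)]
    fake = [random.gauss(0.0, 1.0) + theta for _ in range(batch)]

    # Detector step: push D(real) toward 1 and D(fake) toward 0.
    grad_w = (sum(-(1 - sigmoid(w * x + b)) * x for x in real)
              + sum(sigmoid(w * x + b) * x for x in fake)) / batch
    grad_b = (sum(-(1 - sigmoid(w * x + b)) for x in real)
              + sum(sigmoid(w * x + b) for x in fake)) / batch
    w -= lr * grad_w
    b -= lr * grad_b

    # Forger step: adjust theta so the detector mistakes fakes for real.
    grad_theta = sum(-(1 - sigmoid(w * x + b)) * w for x in fake) / batch
    theta -= lr * grad_theta

# After the arms race, the forger's output distribution sits near the
# real data's mean, so the detector can no longer tell them apart.
print(f"forger mean: {theta:.2f} (real data mean: {REAL_MEAN})")
```

Real deepfake GANs follow exactly this alternating pattern, only with deep neural networks generating full images instead of a single number.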
Originally, creating deepfakes required immensely advanced editing skills, but now the tools are available to everyone. A single image is more than enough to produce a video, and communities such as the Deepfakes Club teach anyone how to make one.
It’s mind-boggling how websites like these provide step-by-step instructions along with explainer videos in case one still doesn’t understand; it’s almost as if they are quietly convincing users to pick up these skills. Deepfakes first caught people’s attention and began to spread after a Reddit user known as “deepfakes” showed how the face of a famous person could be manipulated to give them a starring role in a pornographic video clip.
Could we ever have imagined an era in which audio and video could be morphed so perfectly that the results are difficult to tell from reality? At most, all I had done as a kid was cut out paper pictures of my favorite actor and paste them into my scrapbook, or watch political caricatures of famous people on TV shows.
A year back, journalist Rana Ayyub found that a pornographic video with her face morphed onto it was doing the rounds on social media: just one of the “repercussions” she faced for speaking up about how freedom of speech in newsrooms was being stifled.
One of the most famous deepfakes shows Donald Trump holding a child and asking, “Who do you like the most?” The child, dressed to look like a Trump lookalike, says, “Modi.” Deepfakes pose a serious threat to democracy at a time when countries are continuously fighting misinformation and international conflict. On the face of it, the term “fake news” is taking a literal turn as the technology makes it ever easier to manipulate the faces and audio in videos. Worse, the ease of producing them can provoke people to turn against one another for reasons that could earlier only be imagined.
Deepfakes going viral not only tarnish the credibility of influential politicians, celebrities and brands; they could also harm society by affecting global policy plans and efforts. The journey of a deepfake may start with people creating it for good fun and humor, but more often than not it is like awakening a sleeping giant: it goes beyond goofing around and soon enters the malicious territory of manipulation. And it is not only influential people; we, too, are active online and could fall prey to such havoc.
Today, we internet users are a class of people whose rights and property need protection under strict cyber laws. A person’s online reputation deserves protection more than ever, which is why we have laws against slander and libel.
The sad truth, however, is that the more videos and images of you there are online, the more vulnerable you are to deepfakes; the technology keeps getting better, and a convincing fake can now be made from just one photo of you. But with great technology comes great responsibility. If we are to keep our heads and our free speech, we will need to come up with clever ways to prevent a digital post-truth world.
Deepfakes can be hard to spot. Here are a few signs to look out for:
- Facial movements can be jerky, for instance, the mouth of a fake head often moves robotically.
- Shifts in skin tone and lighting. The video may seem fidgety, much like a video game with bad graphics.
- A weird blending of two faces. This is often visible during complex movements, because the more movement there is, the more convincing the fake has to be.
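The lighting-shift cue above can be turned into a crude automated check. This is a hypothetical sketch under a big simplifying assumption: we already have one mean-brightness value per frame (in real use you would compute these from video frames); real footage changes brightness gradually, while a sloppy fake may flicker frame to frame.

```python
def flag_flicker(frame_brightness, threshold=0.15):
    """Return indices of frames whose mean brightness (values in [0, 1])
    jumps abruptly relative to the previous frame."""
    flagged = []
    for i in range(1, len(frame_brightness)):
        if abs(frame_brightness[i] - frame_brightness[i - 1]) > threshold:
            flagged.append(i)
    return flagged

# Smooth footage with one injected lighting flicker at frame 5:
brightness = [0.50, 0.51, 0.52, 0.52, 0.53, 0.80, 0.53, 0.54]
print(flag_flicker(brightness))  # -> [5, 6]: the jump in, and the jump back out
```

A real detector would of course look at far richer signals (blink rates, face-boundary artifacts, compression noise), but the principle of flagging frame-to-frame inconsistencies is the same.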
No one is safe from the ill effects of AI and machine learning. It is thought-provoking how fast the technology is advancing while we are often left struggling to keep up. It is an arms race, and we all like to think we will be able to keep pace.