
‘Deepfake’ Videos: Clear & Present Danger! – 2



The ‘About’ section of this website states that it aims to be “a community resource to promote and develop deep learning-based face swapping techniques. We focus on education, news, and technical development.” Naturally, the website takes no responsibility for how its tools are used, or rather misused. Yet, more often than not, misuse is exactly what happens.

The Threat
At a time when most people carry the whole world in their smartphones, they also have ample opportunity to misuse it. Like any other technology, ‘Deepfake’ technology can be abused, and there is now a global debate over the possible negative implications of Deepfake videos. Deepfakes allow anyone to create HD-quality videos that blend so seamlessly, and from so many angles, that recognizing the fake is close to impossible.

The bizarre world of ‘Deepfake’ videos, often explicit clips with the subjects’ faces replaced by those of high-profile political figures and celebrities, has law enforcement agencies around the world searching for ways to stop them. If you assumed such videos are low-resolution and difficult to make, you would be surprised: ‘Deepfake’ technology produces highly convincing HD-quality video, and its resolution and technique keep improving.

Logical Conclusion
Deepfake videos are getting better and better, and that is terrifying. An active community trades fake celebrity videos, and manipulated political speeches are fueling the already serious fake news epidemic.

There have been several instances of women’s faces being morphed onto questionable pictures and videos that then circulate on WhatsApp and Facebook. The adverse implications of a growing technology like ‘Deepfake’ videos are enormous. When most social media users believe every fake-news WhatsApp forward and Facebook post without even verifying it through a simple Google search, how are we going to deal with morphed videos circulated using technology that is readily available at our fingertips?

We have already seen countless fake-news messages and pictures circulated on WhatsApp and Facebook. In that climate, if you come across a video of a family member, friend, or relative committing a questionable act, how likely are you to suspect that the malicious video is fake?

The manipulation of images and videos using ‘Deepfake’ technology has already become a dangerous mass phenomenon. Lawmakers and intelligence officials in the United States are now concerned that bogus videos created with ‘Deepfake’ technology could be used to threaten national security or interfere in elections.

With so many negative implications, the million-dollar question is: do we really need ‘Deepfake’ video technology?

