The Evolution of Fake News – The Deepfake
Even though the term fake news has only recently been popularized, the concept is not new. Misinformation, lies and deceit have been around forever. But in recent times, the technology that disseminates or even produces fake news has progressed, bringing with it the means to spread it faster than ever before. Tools such as social media, artificial intelligence and various advertising systems have allowed people to use fake news on a large scale in order to influence others’ behaviour.
Recently, we have seen the rise of a new type of fake news: the deepfake. Deepfakes are fabricated videos that look and sound convincingly like the person they depict. Such videos are created by a neural network, a type of deep machine-learning model that analyses video footage and algorithmically transposes one human face onto the movements of another. When they first started to appear, deepfakes were easy to spot, but they are becoming more realistic and harder to distinguish from the people they mimic every day.
Until now, video footage has been difficult to alter in any substantial way. But with advances in AI technology, creating a deepfake no longer requires significant skill. Practically anyone can create a realistic video and promote their agenda. As it turns out, nowadays even low-tech doctored videos can be an effective form of misinformation.
Anyone could spread fear, uncertainty and doubt, something very familiar to people close to the crypto industry. This new kind of fake news could have severe consequences. People could easily believe lies built on undetectable technology, or they could, on the other hand, become ever more sceptical, eventually distrusting video content entirely.
What are Examples of Deepfakes?
Not long ago, a digitally altered video showing the Democratic politician Nancy Pelosi appearing to slur through a speech was widely shared on Facebook and YouTube. It was supposedly first posted by a supporter of Donald Trump and soon became very popular among his fans. Soon after that, Trump himself shared the video on Twitter. The video was quickly marked as fake, but millions of people had already seen it and the damage was done.
Deepfakes are also being used to create pornography, a practice in which the head of a celebrity, or even an ex-partner, is put on the body of a porn star. All the creator needs are some photographs of the victim and a short video, something easily obtained in today’s world of social media. Sometimes deepfakes aren’t about misleading the population, but about harassment or blackmail, easily achieved with modern technology. Even celebrities, who are used to public criticism, have been shaken by discovering their faces used in pornographic videos.
In the past, the technology used to create deepfakes had nothing to do with politics or the spreading of misinformation. But this is changing: as the means to create deepfakes become more accessible, the range of use cases expands. Who knows how the technology could be used to influence the cryptocurrency market?
How to Detect Deepfakes?
Detecting deepfakes can be a difficult problem. Of course, amateurish videos can easily be spotted by the naked eye, but how do you spot the professional ones? One sign can be blinking. Healthy adult humans blink somewhere between every 2 and 10 seconds, and a single blink takes between one-tenth and four-tenths of a second. This is not what happens in many deepfake videos. Because few photographs of people with their eyes closed are available online, the AI has little training data showing closed eyes. This is why the technology is less likely to produce faces that blink normally, which gives us an opportunity to spot deepfakes ourselves. Another sign can be shadows that look wrong or unnatural, so if you have a hunch a video could be fake, this is also something to look out for.
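The blinking heuristic above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not a production detector: it assumes a per-frame eye "openness" score has already been extracted (for example an eye-aspect-ratio from a facial-landmark model), and the 0.2 threshold and the normal-rate bounds derived from the "one blink every 2 to 10 seconds" figure are illustrative assumptions.

```python
def count_blinks(openness, threshold=0.2):
    """Count completed blinks: a closed-then-reopened eye.

    `openness` is a per-frame eye-openness score (assumed to come
    from an upstream landmark detector); `threshold` is hypothetical.
    """
    blinks = 0
    closed = False
    for value in openness:
        if value < threshold:
            closed = True        # eye currently closed
        elif closed:
            blinks += 1          # eye reopened: one full blink
            closed = False
    return blinks


def looks_suspicious(openness, fps, min_rate=0.1, max_rate=0.5):
    """Flag footage whose blink rate falls outside the normal range.

    One blink every 2-10 seconds corresponds to roughly 0.1-0.5
    blinks per second; rates outside that band are flagged.
    """
    seconds = len(openness) / fps
    rate = count_blinks(openness) / seconds
    return not (min_rate <= rate <= max_rate)
```

For example, ten seconds of 30 fps footage with three blinks (0.3 blinks per second) would pass, while footage in which the subject never blinks would be flagged. A real system would of course need robust landmark tracking and would treat this as only one weak signal among many.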
But as we have said, the technology used to create deepfakes is getting better by the day, and soon we may have to rely on digital forensics to detect altered videos. If we are able to detect them at all.
There will always be a race between the generation and detection of deepfakes, as is the case with fake news in general. Unfortunately, generation tends to stay one step ahead. And until the technology has progressed to the point where we can detect deepfakes with certainty, the best way to protect yourself is by staying informed. Especially in crypto, checking several reliable news sources can protect you from making decisions based on lies and misinformation.
Fortunately, there is a lot of quality blockchain and cryptocurrency related information online. Combining news with knowledge gained from thought leaders and reliable influencers can lead to great success. But taking the time to find all that content can be difficult. That is where BLOCKBIRD comes in! With our platform and our advanced AI algorithms, you can get all the important news, tailored exactly to your needs, in one place. Reading news from all the reliable sources has never been so easy. If you want to learn more, check out our website and become a part of the Blockbird movement.