Did you know your face can now be stolen?
Tired of fake news? It has just been taken to the next level. Deepfake is a technology that lets its user digitally steal someone's face, which can then be used to produce forged videos. This gives rise to both ethical and security issues. Gulf News readers discuss
Detrimental to society
This could harm individuals and society
I believe the use of deepfake technology is detrimental not just to an individual but to society as a whole. There have been instances in the past where people forged videos of influential personalities using deepfakes, making them look extraordinarily realistic and thereby furthering the perpetrator's agenda of ruining the individual's image or changing how people perceive the exploited person.
Conversely, many people may use deepfake technology as a scapegoat, dismissing authentic viral videos or audio on social media as fakes when they are exposed for what they have actually said or done.
This computer-generated manipulation has made people doubt everything they see on the internet, as there is constant uncertainty about what is actually accurate and real and what is, in fact, a hoax.
I suspect that if this technological manipulation is adopted by the general public, it could enable a lot of damaging activity, such as blackmail, ruining a person's career and their personal and professional relationships, and bullying in schools and universities.
From Ms Haniyah Irfan
Finance head in a branding firm based in Dubai
Consider ethics
Systems to detect fake content need to be in place
"Deepfake" was originally the name of a Reddit profile whose fake videos of real people, created using a generative adversarial network (GAN), gained attention, including footage of Barack Obama giving a speech he never delivered. The technology is mostly meant to be used as a video-editing tool for movies and media, but it also opens up issues regarding the authenticity of information in this era of fake news.
OpenAI, an Elon Musk-funded research firm, released a text generator that could even answer comprehension questions on its own. Combine that with deepfakes and the media-focused artificial intelligence platforms emerging from Nvidia and others, and it becomes concerning for many. However, I think such technology needs the attention not just of developers but also of ethicists, governance, cyber security, defence and other fields to steer it towards improvement in the future. One example is the Defense Advanced Research Projects Agency (DARPA) in the US, which is creating a detection system that can recognise deepfakes by spotting their flaws. Systems like these show what defence mechanisms and cyber security combined can do to promote a secure technological future.
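The adversarial setup behind the GANs mentioned above can be caricatured in a few lines of Python. This is a toy numeric sketch, not a real image model: all names and numbers here are illustrative assumptions. A "generator" adjusts its output to fool a "discriminator", while the discriminator adjusts its decision boundary to separate real samples from generated ones.

```python
import random

# Toy illustration of the adversarial idea behind a GAN.
# Numbers stand in for images; no real deepfake model is involved.
# The "generator" produces samples around gen_mean and the
# "discriminator" keeps a threshold separating real from fake.

random.seed(0)

REAL_MEAN = 5.0   # centre of the "real" data
gen_mean = 0.0    # generator starts far from the real data
threshold = 2.5   # discriminator's decision boundary
lr = 0.1          # learning rate for both players

for _ in range(200):
    real = random.gauss(REAL_MEAN, 0.5)
    fake = random.gauss(gen_mean, 0.5)
    # Discriminator: pull the boundary toward the midpoint of the
    # real and fake samples it just saw.
    threshold += lr * ((real + fake) / 2 - threshold)
    # Generator: nudge its output toward the "real" side of the boundary.
    gen_mean += lr * (threshold - fake)

# After this tug-of-war, the generator's samples sit near the real
# data, so the discriminator can no longer separate the two.
```

Detection systems of the kind DARPA is building exploit the flip side of this dynamic: wherever the generator fails to match the real data exactly, those residual flaws can be detected.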
From Mr Mohammad Yaseen
Computer science student based in Abu Dhabi
Awareness needed
People need to be aware of such technology
Deepfake is a technology that enables its user to put the face of one person onto another person captured in a video. It can be used in art, entertainment and other creative endeavours; movies, for example, would no longer have to rely on expensive computer-generated imagery (CGI) to recreate the faces of actors who have aged or passed away. Unfortunately, it also has great potential for abuse. It can be used to ruin reputations or create fake evidence, and it can negatively affect a person's self-image and mental well-being. For people who are in the constant spotlight, or who have money, this may be less of an issue; others, however, will struggle to clear their names and get their lives back to normal. Still, progress cannot be stopped, and rather than trying to impede this technology, we should use it constructively and find proper counter-measures against its misuse. People should be educated not to take videos at face value and to consider whether they are dealing with a deepfake. Finally, education is important: everyone should know what technologies are available and how they are used, both ethically and unethically.
From Mr German Shein
Application developer based in Dubai
Poll results
Do you think you can spot fake news?
Yes 71%
No 29%
Have your say
Do you think deepfake is a technological epidemic?