Rashmika Mandanna’s Deepfake Video Goes Viral, Sparking Calls for Legal Framework

A deepfake video of popular South Indian actress Rashmika Mandanna has gone viral on social media, sparking concerns about the misuse of artificial intelligence (AI) to spread misinformation.

The clip is based on a video originally posted on Instagram by a British-Indian influencer; in the manipulated version, Mandanna appears to enter an elevator in a form-fitting black dress with a plunging neckline. The altered video has been viewed over 18 million times and has been widely shared on Twitter, Facebook, and other social media platforms.

Fact-checkers have since confirmed that the video is a deepfake: AI-generated footage manipulated to make it appear that someone said or did something they never actually said or did. In this case, Mandanna’s face was superimposed onto the body of another woman.

The video has sparked widespread outrage, with many condemning the misuse of AI to create such realistic and convincing fakes and calling for a legal framework to deal with deepfakes.

In India, there is currently no law that specifically targets deepfakes. However, the Information Technology Act, 2000 (IT Act) contains provisions that can be used to prosecute those who create and share them. For example, Section 66D punishes cheating by personation using a computer resource, Section 66E penalises the violation of privacy, and Section 67 covers the publication of obscene material in electronic form.

However, some legal experts have argued that the IT Act is not enough to address the problem. They have called for a new law that specifically addresses deepfakes and provides stricter penalties for those who create and share them.

The deepfake video of Rashmika Mandanna is a reminder of the potential dangers of AI and of the importance of safeguards against its misuse, as well as of the need for greater public awareness of deepfakes and how to identify them.
