Deepfake video controversy involving Rashmika Mandanna: what is it and how can you recognise one?

Certain AI tools are available for free and only make the problem of phoney images, videos, and audio worse.

The phoney video of Rashmika Mandanna has sparked a discussion about AI and deepfakes.

SUMMARY 
  • A video that has been altered with an algorithm to substitute a different person for the original subject is known as a "deepfake."
  • The prevalence of deepfake videos has increased since the introduction of various AI tools.
  • Even though deepfake videos can be very convincing, you can spot them by looking for a few telltale indicators.
In a recent development, the well-known actress Rashmika Mandanna has been drawn into a deepfake video controversy. In the now-viral clip on social media, a woman can be seen entering a lift, but her face has been digitally altered to resemble Mandanna. The incident has upset many people, who are demanding that action be taken. Amitabh Bachchan, the Bollywood legend and Mandanna's co-star in the film Goodbye, voiced his concerns about the deepfake trend and advocated for legal action.

What is a deepfake video?
"Deep learning" and "fake" are combined to form the term "deepfake." It describes a video that has been altered with an algorithm to authentically replace the person in the original clip with a different person, usually a well-known person. Deepfakes create images of fictitious events using a type of artificial intelligence known as deep learning. The prevalence of deepfake videos has increased since the introduction of various AI tools. Certain AI tools are available for free and only serve to make the issue of phoney images, videos, and audio worse.

How can you spot a deepfake video?
Even though deepfake videos can be very convincing, you can spot them by looking for a few telltale signs:
1. Unnatural Eye Movements: Keep an eye out for unusual eye movements, such as erratic or nonexistent blinking.
2. Colour and Lighting Mismatches: Look for mismatches in colour and lighting between the face and the background.
3. Audio Quality: Check whether the audio is properly synchronised with the lip movements.
4. Visual Inconsistencies: Examine the video for irregularities such as odd body posture or movement, contrived facial expressions, abnormal placement of facial features, or an unusual body shape.
5. Reverse Image Search: Run the video, or frames taken from it, through a reverse image search to check whether the person or the footage is genuine (see the sketch after this list).
6. Video Metadata: Inspect the video's metadata to see whether it has been edited or changed (also covered in the sketch below).
7. Deepfake Detection Tools: Make use of tools that can flag suspicious videos, such as browser extensions or online platforms.
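Checks 5 and 6 can be partly automated. The short Python sketch below grabs a frame from a suspect clip (which can then be uploaded to a reverse image search engine) and dumps the container metadata with ffprobe. It assumes the opencv-python package and FFmpeg's ffprobe are installed, and the filename is a placeholder; treat it as an illustration rather than a complete detection workflow.

```python
# Illustrative helpers for checks 5 and 6: extract a frame for a reverse
# image search and inspect the clip's metadata with ffprobe.
import json
import subprocess

import cv2

VIDEO_PATH = "suspect_clip.mp4"  # placeholder filename

# --- Check 5: grab a frame for a reverse image search -------------------
cap = cv2.VideoCapture(VIDEO_PATH)
ok, frame = cap.read()           # the first frame is enough for a quick check
if ok:
    cv2.imwrite("frame_for_search.png", frame)  # upload this image to a
                                                # reverse image search engine
cap.release()

# --- Check 6: dump container/stream metadata with ffprobe ---------------
result = subprocess.run(
    [
        "ffprobe", "-v", "quiet",
        "-print_format", "json",
        "-show_format", "-show_streams",
        VIDEO_PATH,
    ],
    capture_output=True, text=True, check=True,
)
metadata = json.loads(result.stdout)

# Creation time, encoder tags, and signs of re-encoding can hint at editing.
print(metadata.get("format", {}).get("tags", {}))
for stream in metadata.get("streams", []):
    print(stream.get("codec_name"), stream.get("tags", {}))
```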

What is being done to tackle deepfakes?
A number of technologies are being developed to address the problem of deepfake images and videos, including:

1. AI-Based Detection: Many tools employ AI to identify video tampering. Microsoft, for example, has created a tool that evaluates images and videos and provides a confidence score indicating the likelihood that the content was produced artificially. The tool was developed using a publicly available dataset from FaceForensics++ and tested on the Deepfake Detection Challenge Dataset.

2. Browser Plugins: To assist in identifying deepfake content on the internet, the AI Foundation developed a browser plugin called Reality Defender. SurfSafe, another plugin, carries out comparable checks.

3. Startups: A number of startups are developing cutting-edge strategies to combat false information. For instance, OARO provides resources for media, compliance, and digital identity authentication and verification. Sentinel is addressing cyberwarfare.

4. Unfakeable Records: OARO Media generates an unchangeable data trail that businesses, government organisations, and individual users can use to authenticate any image or video (a toy illustration of such a tamper-evident trail follows this list).
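As a deliberately simplified illustration of what an "unchangeable data trail" looks like in practice, the Python sketch below chains SHA-256 fingerprints of media files into an append-only ledger, so tampering with a registered file (or with an earlier record) becomes detectable. This is a generic toy example, not OARO's actual system; the class name, method names, and filenames are invented for illustration.

```python
# Toy, generic illustration of a tamper-evident data trail for media files.
# NOT any vendor's system; it shows how chained SHA-256 hashes make
# later edits detectable.
import hashlib
import json
import time

def file_fingerprint(path: str) -> str:
    """SHA-256 digest of a media file's raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

class MediaLedger:
    """Append-only list of records, each chained to the previous one."""
    def __init__(self):
        self.records = []

    def register(self, path: str) -> dict:
        prev_hash = self.records[-1]["record_hash"] if self.records else "0" * 64
        record = {
            "file": path,
            "file_hash": file_fingerprint(path),
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        # The record's own hash covers everything, including prev_hash,
        # so changing any earlier entry breaks every later record_hash.
        record["record_hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.records.append(record)
        return record

    def verify(self, path: str) -> bool:
        """True if the file's current bytes match a registered fingerprint."""
        current = file_fingerprint(path)
        return any(r["file_hash"] == current for r in self.records)

# Usage (filenames are placeholders):
# ledger = MediaLedger()
# ledger.register("original_lift_video.mp4")
# print(ledger.verify("original_lift_video.mp4"))   # True: bytes unchanged
# print(ledger.verify("deepfaked_copy.mp4"))        # False: no matching record
```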

Legal and regulatory action is urgently needed to stop the spread of content like the Rashmika Mandanna deepfake. Although many of the technologies listed above are being developed quickly, most are either not yet widely and easily accessible or not entirely accurate. For now, the responsibility still rests with users to spot such videos and avoid spreading them.

