
Urgent call for robust regulations & vigilance against deepfakes in the face of alarming accessibility at just Rs 40

Deepfake dangers demand swift action and awareness. Strengthening regulations is crucial, considering the tools behind the threat cost as little as Rs 40.

New Delhi, UPDATED: Dec 4, 2023 18:06 IST

Highlights

  • Tools for creating deepfake content are accessible for as little as Rs 40, leading to widespread manipulation
  • Women face a disproportionate threat, with 96 percent of global deepfake images being explicit, highlighting the urgent need for regulations

In the intricate tapestry of the digital age, the menace of deepfake content has woven itself into the very fabric of our online existence.

Recent revelations have exposed not only the dark underbelly of manipulated media but also a chilling fact: the ability to generate such content is available for as little as Rs 40, placing a formidable weapon in the hands of those who seek to exploit vulnerable individuals.


The inescapable proliferation

Beyond the glitz and glamour of celebrity controversies, the insidious reach of deepfake pornography has permeated the mainstream, finding its way onto platforms like X and Instagram.

Astonishingly, the creators of this manipulative content, fueled by a dangerous blend of accessibility and affordability, operate on a spectrum ranging from tech novices to seasoned professionals.

Deepfake accessibility at a nominal cost

What adds a harrowing layer to this crisis is the revelation that the tools to generate deepfake content are available for as little as Rs 40. 'Deepnude' apps, such as 'undress.app', utilise generative AI models to produce strikingly realistic content.

Such easily available tools and cheap resources have enabled a broader user base to create sophisticated deepfake content that blurs the line between reality and manipulation.


Women in the crosshairs

While the implications of deepfakes are far-reaching, it is women who find themselves disproportionately targeted. A disconcerting 96 percent of global deepfake images are explicit, with an overwhelming 99 percent directed at women.

The ease with which abusers can now create such content, coupled with the lack of effective regulations, exacerbates the vulnerability of women to this sinister trend.

The legal void & regulatory quandaries

A critical impediment in addressing this digital menace lies in the absence of specific laws in India to combat deepfake proliferation. Jurisdictional and regulatory challenges compound the identification of creators, leaving victims with limited legal recourse.

While existing provisions can be invoked, the delayed response from social media platforms amplifies the damage inflicted.

Global perspectives on regulations

By contrast, several other nations have proactively fortified their legal arsenals against the deepfake threat. China's comprehensive law, introduced in 2022, and the UK's Online Safety Bill exemplify concerted efforts to criminalise the creation and dissemination of manipulated explicit content without consent.

The imperative of vigilance

As deepfake technology grows more sophisticated, traditional methods of detection become increasingly insufficient. The Massachusetts Institute of Technology (MIT) suggests scrutinising media more closely, relying on fact-checking and digital forensics for accurate verification.

In conclusion, the ominous accessibility of deepfake content at a nominal cost demands urgent attention and concerted efforts to fortify regulatory frameworks. The fight against this digital peril requires not just legal measures but also heightened awareness to safeguard individuals from the far-reaching implications of AI manipulation, especially when the threat can be unleashed for as little as Rs 40.


Published on: Dec 4, 2023 18:06 IST
Posted by: Minaal