Artificial Intelligence (AI) has become popular among the masses, with celebrities using it to transform themselves into someone else, media channels using it to develop news anchors that could replace human beings in the near future, and much more.
While AI appears to open up a wide array of possibilities for once-unthinkable tasks, it also has negative impacts on our lives, especially for girls, as cybercriminals have now started using the technology to generate nude photos of innocent and unsuspecting girls.
It has been a recurring and concerning issue in different parts of the country, and it has now reached Guwahati, with several girls falling prey to it without any clue that their deepfakes (manipulations of facial appearance through deep generative AI methods) are being sold by sextortionists on Reddit, Discord, and other online platforms.
The fact that Guwahati girls have fallen victim came to the fore after one girl took to social media to inform the public about the incident and how her own school classmate had tried to sell her deepfakes to numerous boys in exchange for money.
Narrating how she got to know about it, she told Pratidin Time Digital, “I received a message from a guy in Delhi saying that someone was selling my deepfake images and asking if I was aware of it. I told him to let me in so I could get a good look at the channel. When I first joined, I was astounded to see people crafting deepfakes of their own mothers or any mother-daughter duos that we occasionally upload.”
In order to track the man selling her deepfakes, she contacted him pretending to be one of his clients. He then shared his bank details, through which she learnt the name of the accused, Ridip Ranjan Deka, and realised he was a classmate from the school where she had studied till Class 10.
She said, “They frequently offer blurry nudes and seek DMs before selling privately, or sometimes they post publicly too. I had links to two such channels, Desi Kotha and Muthal ki duniya. Also you can buy premium to get 100s of nudes daily.”
After becoming aware of the incident, the victim filed a First Information Report (FIR) at the All Women Police Station via the Assam Police Seva Setu app on July 29, stating:
“The accused was found selling fake obscene photos of me, that he generated using Artificial Intelligence on a social media platform named – Discord. He was caught red handed by me, after I contacted him on discord by pretending to be one of his clients. I have been receiving multiple texts from people stating that some people are selling photos of me, since last one year. It has been very traumatizing for me, and it has taken a great toll on my mental health and my social image. I have been placed in many unsafe situations because of this. On 5th July 2023, I along with my family submitted an application for FIR at the Chandmari P.S. but no formal complaint was registered. The police was constantly pursuing us to go for settlement, citing concerns for the accused’s future. In spite of multiple follow ups, no action has been taken against him. Many more girls have fallen victim to his actions and this is completely outrageous. We have filed a Cyber Complaint against him too (Acknowledgement number – 20407230005164). I hereby request you to take cognizance of this matter and take strict actions against the accused and ensure my safety.”
It is established that, before filing the e-FIR, she had reported Ridip to the police by submitting an application for an FIR. The police later detained him on July 15 for questioning and confiscated his phone; however, the victim said she received this information only after her father inquired about the progress of the case on July 25.
The matter has raised serious concerns among citizens that the unsuspecting girls being targeted will be left traumatised for life.
Sharing any photograph on social media has now become difficult for girls, as there is no way of knowing whether the person selling the deepfakes is one of their friends or close ones.