
Even before deepfakes, tech was a tool of abuse and control

Of the many “profound risks to society and humanity” that tech experts fear artificial intelligence (AI) poses, the spread of fake images is one that everyday internet users will already be familiar with. Deepfakes – videos or photographs in which someone’s face or body has been digitally altered so that they appear to be doing something they are not – have already been used to spread political disinformation and fake pornography. These images are typically malicious, created to discredit their subjects.

When it comes to deepfake pornography, the vast majority of victims are women. Generative AI – technology used to create text, images and video – is already making image-based sexual abuse easier to perpetrate. Yet the evolving technological landscape has opened other avenues for abuse within controlling relationships, ones often overshadowed by the attention given to headline threats like deepfake pornography. My research initially explored the impact of smartphones in abusive relationships, revealing that perpetrators of domestic abuse were exploiting these devices to exert power and control over their victims. Mobile phones serve as direct tools of monitoring and control, with GPS tracking, incessant messaging and video calls used to create a panopticon-like environment.

Abusers employ tactics such as excessive texting, tracking the victim’s location, and even contacting their friends to isolate the victim socially. Such isolation is a common characteristic of abusive relationships and underscores the significance of tech-enabled control. Moreover, mobile phones serve as gateways to the broader “internet of things”, enabling abusers to manipulate connected devices in the victim’s environment. This can include adjusting household thermostat settings, a gaslighting technique that makes victims question their own sanity and judgment. The effect is a modern panopticon: victims feel perpetually watched, much as in the 18th-century philosopher Jeremy Bentham’s design for a prison in which a central guard tower could observe any inmate at any moment, without the inmates knowing when.

Mobile phones play the role of the guard tower, making victims believe they are under constant surveillance even in public. This compels survivors to behave in ways intended to placate their abusers, conduct that others often misread as paranoia or a mental health problem. As technology advances, so do the tools and strategies available to abusers, perpetuating the cycle of surveillance, gaslighting and abuse. Addressing this requires tech companies to consider the experiences of domestic abuse survivors and to build safety measures into their product designs. Until they do, abuse will persist, hidden in plain sight, underscoring the urgent need for greater awareness and action to protect vulnerable people in controlling relationships.

TDG Network
