My Mother’s Warning Came True: Women Trapped in the Grip of Algorithms
By Samaya Jaber
“Your smile is beautiful… it turns me on.” That was the creepy message I received at 17 from a man in his thirties on Instagram. It was enough to keep me off the app for weeks, away from the digital world, haunted by my mother’s constant warning: “Never post your pictures online. One day they’ll take them and put your face on another body.”
Eleven years later, my mother’s prophecy has come true. Social media platforms have become arenas not only for harassment, threats, and the restriction of women’s freedom through offensive comments, doctored images, and manipulated conversations, but also for generating AI-made photos and videos used to blackmail women and destroy their reputations in societies that blame women alone for the “scandal.”
• Naira Ashraf – an Egyptian young woman subjected to digital violence even after her death, when someone used AI to generate an image showing her beside her killer.
• Zainab – a Libyan girl whose graduation photos were misused to defame her, while those around her blamed her for it.
• Fidaa Ghaleb – a Yemeni woman who hanged herself from a tree outside her home after being blackmailed by Yusuf Doum with a photo secretly taken by a friend.
• Riham Yaaqub – an Iraqi activist assassinated by gunmen after an online campaign of incitement and threats, triggered by a photo showing her with the U.S. ambassador.
• Sanaa – a Lebanese woman blackmailed by her boyfriend using photos taken secretly and others fabricated for extortion, costing her job, harming her family, and damaging her mental health.
• Karima Al-Tarhouni – a Libyan artist whose deepfake pornographic video remains online despite multiple reports demanding its removal.
• Sara Al-Waraa – a Syrian blogger targeted when followers used AI tools to generate fake images of her, sparking a wave of public shaming and incitement.
These stories, and many others still hidden beneath the weight of “tradition, shame, scandal, and sin,” reveal the recurring suffering of Arab women in a digital space deeply biased by gender. This space is being exploited to blackmail women by fabricating their ages, affiliations, and social identities.
What drives this phenomenon to worsen in our societies? And how can the law protect women from digital violence that threatens their psychological and social well-being?
Women’s Challenges in the Digital Space
According to ChatGPT, one of the most widely used artificial intelligence systems, AI is “an attempt to make machines think or act like humans by learning from data and past experiences, an extension of our human capacities and a reflection of humanity’s ambition to expand its abilities.” Yet, in the context of the Arab digital space, these tools have instead become an extension of the same restrictions imposed on women, their bodies, behavior, and freedom, particularly in conservative “Eastern” societies. They have entrenched new forms of blackmail, threats, and gender-based violence.
Both Ruwaida Al-Arabi, founder of the Hiya Tatahaqaq (“She Verifies”) platform in Libya, and Nada Hamzeh, a media and digital communication expert in Lebanon, attribute these challenges partly to women’s limited awareness of their digital rights, and partly to the lack of knowledge about how to protect themselves online. This ignorance, Hamzeh explained, provides abusers with fertile ground to exploit and target their victims.
Hamzeh noted that the digital sphere mirrors the societies we live in: a space where blackmail, violence, and violations remain major challenges women have long faced. As a result, women are particularly vulnerable to breaches of privacy online, which makes digital harassment easier and more pervasive. That is exactly what happened to Sara (a pseudonym), a young woman from Libya.
Through Snapchat, Sara regularly shared snippets of her daily life with her friends. But this seemingly ordinary act was exploited by one of her followers, who violated her privacy by saving a photo of her wearing a hijab and manipulating it with AI, placing her face onto the body of a woman wearing shorts and a crop top.
While such an image might be considered harmless elsewhere, in Libya it can easily become a cause for defamation or incitement against women. As Ruwaida explained to Sharika Wa Laken, “This cultural misunderstanding makes AI an unjust tool, especially in societies where women are held responsible for the ‘scandal’ more than the perpetrators themselves.”
She continued, “Using AI to generate fake images of women makes blackmail and threats easier and more dangerous, particularly with algorithms that don’t understand the social sensitivities of our countries or respect cultural privacy.”
According to Nada Hamzeh, the blatant technological disregard for women’s privacy in our societies has turned digital spaces for free expression into tools that reinforce gender bias. AI tools, trained on online content to fulfill user requests, behave in the same discriminatory way. “No tool tells you it will use your photos for such purposes; it simply asks for permission, and we give it,” Hamzeh said. “That’s how any woman might one day find her face edited into an image somewhere on the internet.”
The Psychological Impact of Digital Violence
After manipulating her photo, the blackmailer contacted Sara through Snapchat, sending her the fake AI-generated “face swap” version and threatening to publish it. Shock, fear, and anxiety overwhelmed her, but she acted quickly, knowing she had never posted such an image. Many women, however, do not have the same courage, often paralyzed by fear of the perpetrator or feelings of guilt and shame, explained clinical psychologist Rachel Mhanna to Sharika Wa Laken.
With the support of Libyan feminist groups like Hiya Tatahaqaq, Sara chose to confront her blackmailer rather than give in, publicly posting their chat and the original image on her page. But for women living in repressive societies, where social threats and a lack of safety dominate, the psychological defense mechanisms differ. According to Mhanna, victims often suppress anger, turning it into fear or shame to avoid confrontation or retaliation in environments that do not allow open expression of rage.
Mhanna further explained that the deep sense of shame resulting from such violations stems from living in conservative societies where reputation and honor are tied to family identity. “In my view,” she said, “feelings of fear and shame come from a mix of psychological insecurity, social pressure, and the defensive behaviors women learn throughout their lives. Any intervention must start by understanding these three factors, then giving victims a space to express their fear, addressing the shame, and channeling anger into positive energy that helps them regain control of their lives.”
Vague Policies and Missing Legislation
The responsibility to protect Sara, and countless women like her, from blackmailers who exploit digital tools “using the body of an Indian woman” to threaten their existence in online spaces, is one that major tech companies continue to evade. Despite their public pledges to prevent the use of their technologies in producing and sharing child sexual abuse material, women remain excluded from this framework of protection.
Compounding this is the absence of any binding international standard requiring these companies to implement measures that specifically protect women. As a result, they continue to adopt vague, inconsistent policies that allow predators to infiltrate digital spaces and exploit them. With the growing number of cyber blackmail cases in both Libya and Lebanon, especially amid the spread of AI tools, a deep legal and legislative gap has emerged, turning the digital realm into an unsafe space for women.
In an interview with Sharika Wa Laken, Ruwaida Al-Arabi, founder of the Hiya Tatahaqaq platform, noted that “many of the cases filed in Libya stall because judicial and security authorities lack the technical capacity to trace digital evidence, or because they treat cybercrimes as ‘moral issues’ rather than crimes that violate dignity and privacy.” She added, “This directly affects women’s willingness to report. Many fear being turned from victims into suspects, or that their reputations will be smeared.”
When it comes to cases of digital blackmail or the sharing of private images, Al-Arabi explained that “Libyan law remains very limited. There are general provisions under the cybercrime law, but they fail to take gender into account or address women’s specific vulnerabilities. Society itself has a double standard: instead of protecting victims, it often blames them and excuses the perpetrators. That’s why most women prefer silence, out of fear of stigma or social repercussions.”
The situation in Lebanon, according to digital communication expert Nada Hamzeh, is “not much better.” She told Sharika Wa Laken that “we’re still far from having the kind of policies and ethical frameworks needed to regulate technological tools. The lack of training and preparedness among security forces makes women hesitant to seek legal protection when they face digital abuse.”
In conservative societies where traditions and social norms often override civil laws, women develop a deep fear of reporting crimes committed against them, terrified of “scandal” or of their families finding out. Although Lebanon has made significant progress in increasing reports to the Cybercrimes Bureau, Hamzeh urged authorities to “take the initiative to break this barrier and build trust with women so they feel safe enough to report digital abuse.”
Paths to Protection: From Prevention to Recovery
What happened to Sara, and before her to Karima, Sanaa, Riham, Fidaa, and many others whose stories remain untold, demands urgent collective effort. Some were killed in the name of so-called “honor,” others silenced by harassment or isolation disguised as “protection.” These cases show the need to treat artificial intelligence not as a neutral tool, but as a system requiring ethical and legal safeguards that prioritize women’s safety.
As Al-Arabi emphasized, “The solution requires a multilayered approach, beginning with major tech companies developing tools that detect harmful content and understanding the cultural sensitivities of Arab societies as a fundamental necessity.” She added that AI systems must be trained on local data and Arabic dialects “so they can distinguish between criticism and defamation, or between humor and incitement.”
Legally, Al-Arabi underscored the need for clear legislation that criminalizes the publication or use of private images without consent, backed by rapid mechanisms for content removal and prosecution of offenders. In the meantime, as long as survivors are blamed and people fail to recognize that the digital world is an extension of real life, bound by the same moral and human laws, civil society, and feminist platforms in particular, must continue to promote digital safety awareness, document violations, and support survivors.
For now, as women continue to face severe harm in digital spaces, protecting their privacy and online security remains essential. Hamzeh advises women to adopt technical and security measures to protect their data, at the very least, by creating strong passwords, enabling two-factor authentication, avoiding malicious apps, and restricting apps’ access to photos or private data.
While Hamzeh stressed the need to expand such awareness, she also called for a holistic approach that goes beyond digital safety to include psychological and social support. Clinical therapist Rachel Mhanna outlined steps that help survivors recover from trauma and rebuild a sense of safety and confidence: securing accounts and preserving evidence, seeking a trusted support person, maintaining emotional stability, reducing stress, gaining family and community support, and pursuing legal and technical intervention, all leading to the gradual restoration of identity and self-trust.
After all this, what we women endure, regardless of our backgrounds, status, or strength, from multifaceted digital violence leaves me with one question: What if we were still living in my mother’s time, would we have been safe from the machines that now serve the whims of a “patriarchal man”? Then I realize that this “man hiding behind his false masculinity, thirsty for control under the guise of honor,” would still find a way to exercise his power, even through something as cold and mechanical as a calculator.