Gender-based violence is a global issue, but studies consistently show the Pacific has among the highest rates in the world. Up to 79% of women in the region experience some form of abuse over the course of their lives.
An emerging concern is violence through technology. This is where digital technologies are used to abuse, harass, coerce and exploit another person.
Most often, these harms are disproportionately experienced by women and girls.
Yet there is limited research on gender-based violence in the Pacific, and even fewer academic studies examine the role of technology.
Our recent study aims to fill that gap. We surveyed victim-survivor support practitioners from nine Pacific Island nations. We found smartphones, Facebook and AI-generated sexualised deepfakes are being used to control and harm women and girls.
Coercive controlling behaviours
In our study, recently published in the journal Violence Against Women, we surveyed 19 practitioners and interviewed five across Fiji, Kiribati, Micronesia, Tonga, Samoa, Vanuatu, Tuvalu, Papua New Guinea and the Solomon Islands, all of whom work with victim-survivors.
We asked about the ways technologies are being used to abuse. We also asked about any challenges in supporting victim-survivors who experience tech-based violence.
We found common types of abuse included:
- controlling access to devices
- sharing or threatening to share intimate images without permission (often with the person’s family and religious or faith-based networks)
- monitoring another person’s location using trackers or publicly available online information
- creating or threatening to create AI-generated sexualised deepfake videos or images to extort money.
Practitioners reported they were supporting increasing numbers of victim-survivors with experiences of technology-facilitated violence.
The abuse was also happening in the context of other forms of intimate partner violence. This included financial, physical and psychological harm, further compounding the abuse.
One device per household
One finding particular to the Pacific is the shared-device problem.
Practitioners reported that many families share a single phone. This means standard digital safety advice, such as “change your password” or “use a different device”, often does not apply.
Practitioner Mere said partners sharing access to one digital device can facilitate controlling and abusive behaviours. She explained:
married couples having the same Facebook account, then the other partner sees messages coming in directly to the wife […] and monitoring where the other partner is going.
Sexualised image-based abuse
Another common form of abuse reported was coerced sexual acts and image-based sexual abuse. According to 36% of participants, victim-survivors are commonly forced into sexual activities via digital means.
Other forms of image-based sexual abuse practitioners reported as very common included:
- the taking of sexual images or videos without permission
- the sharing of sexual images without permission
- the threat to share sexual images without permission.
Practitioner Kiana said victim-survivors report image-based sexual abuse happening in both relationship breakdowns and as a way to force them to stay in an abusive relationship:
the partner would threaten or even send nude photos of their partners […] to group chats [or] threaten to send the photos to his partner’s family members.
An emerging issue in the Pacific, and one being experienced globally, is sexualised deepfake abuse. This is where sexualised imagery is created with AI or other digital technologies, such as Photoshop. Of the participants in our study, 26% reported this as occurring “often”.
With the rapid development of AI technologies that easily create sexualised deepfake abuse content, these trends are likely to increase.
Challenges in supporting victim-survivors
The study found a range of challenges and barriers for practitioners in supporting victim-survivors in the Pacific.
One of the prominent barriers was cultural practices and norms. Practitioners said these norms are shaped by traditional communal values, family honour, kinship systems, faith, ideals of modesty and respect for hierarchy.
Sexual and cultural taboos, strongly ingrained within traditional Pacific value systems, were also seen to discourage women from seeking help.
Participants said perpetrators’ control of phones and the shared-device problem were also restricting women’s opportunities to connect with support networks, to identify their situation as abusive and to seek help.
Another major barrier identified by 37% of practitioners was the poor handling of cases by police. Cases are simply not taken seriously by authorities, according to 32% of participants. In this context, practitioners observed perpetrators were rarely held accountable, leaving victim-survivors without justice or protection.
Where to next?
While the findings in our study are similar to those in other countries, they highlight the importance of social and cultural contexts in addressing these issues. These contexts should inform how technology-facilitated violence in the Pacific is prevented, and how victim-survivors are supported.
There are a range of things governments, technology providers, police and the legal sector can do to address the problem. One would be to fund and prioritise practitioner and police training to better understand and respond to technology-facilitated violence.
Another is to develop culturally sensitive community education initiatives that stop victim-survivors from being silenced.
And finally, religious and faith-based organisations should be brought on board to help prevent and respond to technology-facilitated violence.
The authors would like to thank Siân Human from the Australian Research Council Centre of Excellence for the Elimination of Violence Against Women for her insights and support during the process of writing this piece.
Emma Quilty receives funding from the Australian Research Council, including the Australian Research Council Centre of Excellence for the Elimination of Violence Against Women and eSafety.
Asher Flynn receives funding from the Australian Research Council and eSafety.
Tarannum Baigh receives funding from the Private Enterprise Development in Low-Income Countries (PEDL) programme and the International Growth Centre (IGC).


