Warning: this blog article mentions instances of sexual violence which could be triggering or disturbing for some readers.
In 2019, Mrs. V. contacted the Institute for the Equality of Women and Men (hereinafter ‘Institute’) looking for help. She had been raped years earlier by her former life partner, who filmed the assault. One day, Mrs. V. learned that a short clip of the film was being spread all over the internet, in Flanders as well as in the Netherlands. People produced compilations of the clip, memes and even merchandise referring to it.
At the time, distributing or showing intimate or sexual images without the consent of the depicted person was already criminalized under Belgian legislation. However, the legislator at the time viewed non-consensual distribution not as a crime in its own right, but rather as an aggravating circumstance of voyeurism: it was criminalized by adding a paragraph to the article criminalizing voyeurism.
Due to this construction, non-consensual distribution as an offence was not well known among the police and public prosecutors, who did not always handle complaints about non-consensual distribution correctly.
Mrs. V. was a victim of this lack of knowledge. When she went to file a complaint with the police, an investigation was opened by the police and the public prosecutor into the rape, but not into the distribution of her images.
Around the same time, at the end of 2019, the Belgian legislator was working on a new legislative proposal in order to combat the non-consensual distribution of intimate images. This new legislation had several purposes:
After an intervention from our equality body, the Institute was charged with being the organization supporting victim-survivors. After all, non-consensual distribution is deemed to be a form of gender-based violence threatening gender equality (see, for instance, here). While male victim-survivors can experience the same consequences (and will be equally helped by the Institute), 80-90% of victim-survivors are women. These women suffer greatly and in the same manner as victim-survivors of physical assaults. Victim-survivors suffer from feelings of shame, anxiety or even suicidal thoughts. They are very regularly harassed, both online and offline; they often lose their jobs, are forced to stop their education, move away or even change their legal name.
Additionally, victim-survivors adapt their behavior online. They censor themselves or even quit social media, temporarily or permanently, in order to avoid further victimization. In other words, these women are hindered from participating in online debate and online social interaction, which is an unacceptable limitation of their freedom of speech and of their participation in modern society.
As of the first of July 2020, the Institute can, by law, support victim-survivors of non-consensual distribution and voyeurism. One of the tools at our disposal is taking legal action.
However, the first concern of victim-survivors is often not seeking legal remedies, but the removal of the images. The Institute has therefore drafted a manual on how to request removal from social media, which empowers victim-survivors to take action themselves, without the intervention of the Institute. Of course, if victim-survivors face difficulties or are not able to contact the social media companies themselves for any given reason, the Institute can and will act on their behalf. The Institute also collaborates with a specialized service of the Belgian police, which investigates where the images have been disclosed online so that the Institute can request their removal.
In order to make everything run as smoothly as possible, the Institute also meets with both public partners, such as the Public Prosecutor and judges, and private partners, such as search engines, social media companies and adult pornographic sites.
Since the new law became effective on the first of July 2020, the Institute received about 70 cases in 2020. The number rose especially after September, when intimate images of local celebrities were leaked, drawing increased attention to this phenomenon and to the Institute's competence.
To date, the Institute has been involved in three criminal judicial cases of non-consensual distribution:
From the moment the new legislation was implemented, on the first of July 2020, the Institute started writing to several social media platforms, as well as adult pornographic sites, demanding the removal of the video fragment. Among others, the fragment could be found on Twitter. The Institute, mandated by Mrs. V., therefore contacted Twitter several times on the basis of the new legislation, requesting that several posts be taken down. Whereas the Institute had rather successful experiences with other sites, it encountered issues with Twitter.
Under the e-Commerce Directive, Twitter, like all other social media platforms, is obliged to put in place a procedure that people can use to request the removal of harmful content. The idea is that this procedure should be user-friendly and should certainly not discourage people from reporting harmful content. Twitter did implement such a procedure, but users have to overcome several obstacles, making it difficult and far from user-friendly for victim-survivors, who are often already traumatized.
In the case of Mrs. V., Twitter asked for proof that the Institute was acting on her behalf, even though this is part of the Institute's legal mission and mandate. Additionally, a copy of her passport was requested. The first issue could be overcome thanks to the mandate Mrs. V. gave to the Institute, but providing a copy of a passport can discourage victim-survivors whose name is not yet linked to the images and who want to remain anonymous at all costs during the procedure. It also sometimes took Twitter months to remove the images after they were reported.
At the beginning of November 2020, the Institute once again contacted Twitter with the purpose of removing the video fragment, which had appeared on a new account. However, Twitter did not cooperate, not even after the lawyer twice reminded it of its legal obligation. By mid-December, Twitter had simply closed the reports requesting the removal of the video extract, without removing the images. This can be interpreted as an implicit refusal to remove the images, even though Twitter had removed the same images on other occasions. A check-up in mid-January showed that this specific post was still online.
These structural obstacles, in combination with the refusal to remove the images, led the Institute to decide to take legal action. The Institute is convinced that Twitter is knowingly allowing users to show intimate images without the consent of the depicted person and is thus itself a perpetrator of non-consensual distribution. Moreover, Twitter operates a business model designed to generate money: creating controversy attracts more users, which increases the prices of advertising on the Twitter platform. In other words, Twitter is not only a perpetrator of non-consensual distribution, it acts with economic motives in mind, meaning the aggravating circumstance applies.
The Institute prefers to work together with all actors instead of taking legal action against them, but Twitter has allowed the sexual integrity of women to be violated on a daily basis for its own gain. This practice is intolerable and needs to be put to an end at all costs. The investigation is ongoing, but the Institute hopes that Twitter will (be made to) change its policy and internal procedures so that people who have already been victimized by having their images non-consensually spread will at the very least have a possibility to end the violation of their sexual integrity.
If you have been a victim of non-consensual distribution, or of other forms of image-based sexual abuse or digital violence, contact your national Equality Body to seek support. Always file a complaint with the police so that the offender can be identified and, if possible, tried in court. Additionally, filing a complaint is necessary to gain a better idea of the number of victims, so that the national Equality Body or other organizations can advocate for the implementation of an adequate policy.
If you are only concerned with removing the images, know that all social media platforms have help centers or safety centers explaining how to prevent the abuse of your images and what to do if they have been abused. You can find them easily through search engines. The Institute has a webpage on non-consensual distribution, including a manual for victims on how to remove their images themselves. (Dutch: https://igvm-iefh.belgium.be/nl/activiteiten/geweld/wraakporno; French: https://igvm-iefh.belgium.be/fr/activites/violence/revenge_porn). Helpwanted.nl has also published guidance for each social media platform, albeit only in Dutch.
The Institute organized a panel on sexual integrity in the digital era, titled “Is ‘no’ still ‘no’ in an online world? Discussing non-consensual distribution of intimate images and deepfakes” with several international experts. You can rewatch this panel here: https://igvm-iefh.belgium.be/nl/nieuws/instituut_organiseerde_panelgesprek_over_seksuele_autonomie_in_de_digitale_wereld
If you have any questions related to the role of the Institute for the Equality of Women and Men in relation to non-consensual distribution, other forms of image-based sexual abuse or non-criminal discrimination, you can contact us (in English, French and Dutch) by e-mail (equality.womenmen@iefh.belgique.be), by phone (+32 233 44 00) or by post (Institute for the Equality of Women and Men – Rue Ernest Blerot 1, 1070 Brussels).
The views on this blog are always the authors’ and they do not necessarily reflect Equinet’s position.