Facebook’s Proactive Approach to Addressing Nonconsensual Distribution of Intimate Images

It’s well-known that technology has made sharing sexually intimate content easier. While many people share intimate images without any problems, there’s a growing issue with nonconsensual distribution of intimate images (NCII[1]), often referred to as “revenge porn.” Perpetrators often share, or threaten to share, intimate images in an effort to control, intimidate, coerce, shame, or humiliate others. A survivor threatened or already victimized by someone who has shared their intimate images not only deserves the opportunity to hold the perpetrator accountable, but should also have better options for removing the content or keeping it from being posted in the first place.

Recently, Facebook announced a new pilot project aimed at stopping NCII before it can be uploaded to its platforms. The process gives people who wish to participate the option of submitting intimate images or videos they’re concerned someone will share without their permission to a small, select group of specially trained professionals within Facebook. Once submitted, each image is assigned a “hash value” and the actual image is deleted. “Hashing” means that the image is turned into a digital code that serves as a unique identifier, similar to a fingerprint. Once the image has been hashed and deleted, all that remains is the code. That code is then used by Facebook to identify when someone attempts to upload the image and to prevent it from being posted on Facebook, Messenger, and Instagram.
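To make the “fingerprint” idea concrete, here is a minimal sketch using an exact-match cryptographic hash (SHA-256 from Python’s standard library). This is an illustration of the concept only: Facebook has not published the details of its matcher, and a production system would presumably use a perceptual hash that tolerates resizing and re-encoding, which an exact hash does not.

```python
import hashlib

def hash_image(image_bytes: bytes) -> str:
    """Return a SHA-256 hex digest: a fixed-length 'fingerprint' of the input."""
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder bytes standing in for a submitted image.
original = b"raw bytes of the submitted image"
fingerprint = hash_image(original)

# The fingerprint is all that needs to be stored; the image itself can be deleted.
print(len(fingerprint))  # 64 hex characters, regardless of image size

# The same input always produces the same fingerprint...
assert hash_image(original) == fingerprint
# ...but even a tiny change produces a completely different one, which is why
# real matching systems use perceptual hashing (an assumption about Facebook's
# pipeline, not something described in this announcement).
assert hash_image(original + b"!") != fingerprint
```

The key property for this pilot is that hashing is one-way: the code can be compared against future uploads, but the original image cannot be reconstructed from it.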

Facebook’s new pilot project may not be something everyone feels comfortable using, but for some it may bring much peace of mind. For those who believe it may help in their situation, we’ve outlined detailed information about how the process works:

  1. Victims work with a trusted partner. Individuals who believe they’re at risk of NCII and wish to have their images hashed should first contact one of Facebook’s trusted partners: the Cyber Civil Rights Initiative, YWCA Canada, UK Revenge Porn Hotline, and the eSafety Commissioner in Australia. These partners will help them through the process and identify other assistance that may be useful to them.
  2. Partner organizations help ensure appropriate use. The partner organization will carefully discuss the individual’s situation with them before helping them start the hashing process. This helps ensure that individuals are seeking to protect their own image and not trying to misuse the feature against another person. It’s important to note that the feature is meant for adults and not for images of people under 18. If the images are of someone under 18, they will be reported to the National Center for Missing and Exploited Children. Partner organizations will help to explain the reporting process so that individuals can make appropriate decisions for their own case.
  3. The image will be reviewed by trained staff at Facebook. If the images meet Facebook’s definitions of NCII, a one-time link is sent to the individual’s email. The link takes the individual to a portal where they can directly upload the images. All submissions are then added to a secure review queue, where they are reviewed by a small team specifically trained in reviewing content related to NCII abuse.
  4. NCII will be hashed and deleted. All images that are reviewed and found to meet Facebook’s definition of NCII will be translated into a set of numerical values to create a code called a “hash.” The actual image will then be deleted. If an image is reviewed and Facebook determines it does not match their definition of NCII, the individual will receive an email letting them know (so it’s critical that they use an email account that cannot be accessed by anyone else). Even if the content submitted does not meet Facebook’s definition of NCII, the individual may still have other options; for example, they may be able to report an image for a violation of Facebook’s Community Standards.
  5. Hashed images will be blocked. If someone tries to upload a copy of the original image that was hashed, Facebook will block the upload and display a pop-up message notifying the person that the attempted upload violates Facebook’s policies.
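The matching logic behind steps 4 and 5 can be sketched as a simple lookup against a set of stored fingerprints. This is a toy illustration under stated assumptions: the function names and data structures here are hypothetical, the comparison uses an exact cryptographic hash for simplicity, and Facebook’s actual infrastructure is not public.

```python
import hashlib

# Hypothetical store of fingerprints from images reviewed and found to be NCII.
# Only these codes are kept; the images themselves are deleted after hashing.
blocked_hashes: set[str] = set()

def register_ncii(image_bytes: bytes) -> None:
    """Step 4 (sketch): hash the reviewed image and keep only the code."""
    blocked_hashes.add(hashlib.sha256(image_bytes).hexdigest())

def try_upload(image_bytes: bytes) -> str:
    """Step 5 (sketch): block any upload whose hash matches a stored fingerprint."""
    if hashlib.sha256(image_bytes).hexdigest() in blocked_hashes:
        return "blocked: this upload violates our policies"
    return "accepted"

register_ncii(b"submitted-image-bytes")
print(try_upload(b"submitted-image-bytes"))  # blocked
print(try_upload(b"some-other-image"))       # accepted
```

Because the check compares only codes, the platform never needs to retain the original image to recognize copies of it at upload time.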

This proactive approach has been requested by many victims, and may be appropriate on a case-by-case basis. People who believe they’re at risk of exposure and are considering this process as an option should carefully discuss their situation with one of Facebook’s partner organizations. This will help them make sure they’re fully informed about the process so that they can feel empowered to decide if this is something that’s appropriate for their unique circumstances.  

For more information about how survivors can increase their privacy and safety on Facebook, check out our Facebook Privacy & Safety Guide for Survivors of Abuse.


 

[1] NCII refers to private, sexual content that a perpetrator shares publicly or sends to other individuals without the consent of the victim. How we discuss an issue is essential to resolving it. The term “revenge porn” is misleading, because it suggests that a person shared the intimate images as a reaction to a victim’s behavior.

Protecting Victim Privacy While Increasing Law Enforcement Transparency: Finding the Balance with Police Data Initiatives

One of the hallmark efforts of the outgoing Obama administration has been the Police Data Initiative, launched to improve the relationship between law enforcement agencies and the communities they serve. The Police Data Initiative encourages local law enforcement agencies to publicly share information about 911 calls, stops, arrests, and other police activities so that community members can look both at individual cases, as in some high-profile events covered by the media, and at trends that might reveal disproportionate response over time.

It has been more than two decades since the Violence Against Women Act was first passed, and we have seen significant improvements in the criminal justice system’s response to domestic violence, sexual assault, and stalking. This success is due in great part to the efforts of victim advocates and law enforcement officials working together to improve systems. But as we celebrate these successes, we know there is still much work to be done to improve police response, particularly within marginalized communities.

As we work with law enforcement to improve responses to victims and communities, we must ensure that the privacy and safety of victims who interact with law enforcement is a fundamental cornerstone of those efforts. Police data released to the public has the potential to reveal victims’ identities and consequently put them at risk of further harm, harassment, or damage to their reputation. These concerns can also significantly impact a survivor’s decision on whether they even contact law enforcement for help in an emergency.

For more than a year, Safety Net has explored the issue of how to maintain victim privacy and safety while simultaneously supporting the overall intention behind the Police Data Initiative. These efforts have been made possible by the support of the Office on Violence Against Women (U.S. Department of Justice) and Harvard University’s Berkman Center for Internet & Society, and in partnership with the White House, the Police Foundation, the International Association of Chiefs of Police, the Sunlight Foundation, the National Institute of Standards and Technology, the Vera Institute of Justice, and others.

Today, we are pleased to announce the release of “How Law Enforcement Agencies Releasing Open Data Can Protect Victim Privacy & Safety,” a guide authored collaboratively with the Police Foundation that outlines the results of these efforts. The guide describes the need for victim privacy to be a central consideration in efforts to share data with the public, and provides specific recommendations to assist local law enforcement agencies in ensuring victim privacy while increasing transparency.

In the coming weeks, we will be releasing a similar guide written for advocates, as well as an issue summary that describes how the Police Data Initiative intersects with our work to ensure the safety and privacy of survivors.

 

Revenge Porn and the Distribution of Sexually Explicit Images: What’s consent have to do with it?

In February, a New York court dismissed a case against a man who posted nude images of his ex-girlfriend online, sharing them on his Twitter account and emailing them to her employer and family. While his actions were reprehensible, he faces no punishment because, unfortunately, laws in New York, and in many other states, do not currently criminalize what he did. But that is changing.

When sexually explicit images are uploaded online and distributed without the consent of the person in the image, it’s often done as a tactic of abuse meant to cause humiliation and harm to that person. Many of these images may have been taken or originally shared with someone else under the expectation of privacy and within a trusting relationship. Some images may have been captured without the victim’s knowledge. In either case, it is an unacceptable violation of trust and privacy. This abuse has been coined “revenge porn,” a term that has been getting a lot of media attention lately.

Whether the victim willingly took or originally shared the image is irrelevant. Sharing a picture with one person does not mean consent was given for mass, public distribution of that image, and it is definitely not a green light for the recipient to do whatever they please with it. We make many decisions that could have severe consequences if someone we trusted abused that trust. I can give my neighbors a key to my house and still have a personal and legal expectation that they will not steal from me when I’m not home. I can give a store employee my credit card and expect that they will use the information only to complete the purchase I have requested. If they misuse it, I am legally protected.

We must stop blaming the victim in these cases and start holding accountable the person who shared the images with the intent to harm, injure, humiliate, and abuse. When we focus on the victim’s actions and question why the victim shared the picture in the first place, as Mary Anne Franks, a law professor at the University of Miami, put it, "…what we're really saying is if you're sexual with one person, society is entitled to treat you as sexual for all purposes…"

Fortunately, the perception of this behavior is changing, as is the legal landscape around it. Due to the strength and determination of many survivors, states have begun drafting and enacting legislation to address this issue.

Read our new handout on Images, Consent, & Abuse for more detailed information on this issue and tips for survivors. Additional resources can also be found at withoutmyconsent.org. This issue has gained momentum and attention recently as people speak up and speak out. Learn more at the above links and share to continue the conversation.