Victims of revenge porn are stuck between recalcitrant Silicon Valley technology service providers and the comparatively glacial pace of courts and legislators.

They have few rights even under laws that treat the public, nonconsensual distribution of intimate images as a crime against privacy or public decency, and they are hobbled by laws that protect websites from liability.

To victims, the repeated sharing of their images—even if the acts in the images were consensual at the time the pictures were taken—often feels like a form of ongoing sexual assault.

Many victims report outcomes like job loss and a sense of shame in their communities. Witness Katie Hill, who resigned her seat in the U.S. House of Representatives after her intimate images were nonconsensually shared.

In more extreme cases, victims have faced family disownment, physical abuse, and even honor killings.

It’s a problem tech can’t necessarily solve.

Tech companies like Facebook have begun to shift from relying on users to report images—unlikely to happen in closed groups created for sharing purposes—to developing “digital fingerprints,” or hashes, which a single changed pixel could throw off.
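The fragility of exact fingerprinting can be shown in a few lines. This is a minimal, illustrative sketch using a toy byte string rather than a real image file, not any company's actual matching system: a cryptographic hash such as SHA-256 changes completely when even one byte of input changes, which is why a single altered pixel defeats exact matching.

```python
import hashlib

# Illustrative only: model an "image" as a short run of pixel byte values.
original = bytes([10, 20, 30, 40, 50, 60])

altered = bytearray(original)
altered[0] ^= 1  # flip a single bit in one "pixel"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(altered)).hexdigest()

# The two fingerprints share nothing recognizable, so an exact-match
# database lookup on h2 would never find h1.
print(h1 != h2)  # True
```

This is why platforms have moved toward perceptual hashes (Microsoft's PhotoDNA is one example), which are designed so that small edits produce similar, rather than completely different, fingerprints.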

The kinds of artificial-intelligence solutions that identify child sexual abuse material could also identify other forms of image-based abuse, but aren’t without their own problems.

Differentiating consensual from nonconsensual material across both Western and non-Western cultures, where nudity may not be a factor, is one such challenge. Another is that an algorithm trained on data lacking images of people of a given race or ethnicity may fail to proactively identify nonconsensual images of them.

Other issues include:

      • Reliance on contextual language. In an interview with The Crime Report, David Bateman, a partner in the law firm K&L Gates and a co-founder of the Cyber Civil Rights Legal Project, said these kinds of posts are “not deeply contextual” compared with, say, grooming conversations.
      • Self-limiting deployment. The New York Times reported that Apple doesn’t scan files saved on its cloud servers, while Dropbox, Google and Microsoft scan only files that are shared, not everything uploaded. Video content is a blind spot for virtually all major tech companies.
      • Lack of cross-platform information sharing, also as reported in the Times.

Enablers Have Legal Protection

As tech companies struggle to respond, the burden of removing images often falls to the victims—a protracted, “whack-a-mole” process that requires victim-survivors to rely on themselves or trusted friends to seek out their own abuse online.

Creative solutions, such as using the Digital Millennium Copyright Act (DMCA) to file takedown notices against hosts, are limited to cases where the original images belong to the victim.

In part, that’s why New York State’s new statute, “Unlawful dissemination or publication of an intimate image,” requires websites to remove the material as soon as they are notified of its existence and its nonconsensual nature. If a site refuses, it can be held liable.

On the other hand, this could run afoul of Section 230 of the 1996 Communications Decency Act (CDA), which protects websites, internet service providers, and others from liability for what their users post.

Indeed, in 2018, Google blocked the passage of New York’s law based on that single requirement. More recently, the New York Times reported, IBM, Disney, and Marriott have all taken up an anti-CDA mantle, albeit for different reasons.

It isn’t that the law can’t adapt.

Two anti-human-trafficking laws passed in 2018, the Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA), created exceptions to the CDA, making online providers liable for third-party ads that promote prostitution.

On the criminal side, United Kingdom law professor Clare McGlynn has proposed classifying image-based offenses as sex crimes. This would make consent, not intent, the crucial factor. It could help to ensure more investigative resources are devoted to the crimes.

It would also afford victims more rights, such as protection against having their sexual history introduced at trial, or against being called to the stand as witnesses to face their abuser.

However, the requirement to register as a sex offender may do more harm than good.

In an interview, Robert Peters, senior attorney with the Zero Abuse Project, said that, for example, placing juvenile first-time offenders—who may not have been aware of the consequences of their actions—on a registry alongside convicted adult offenders could effectively create a second victim and “overburden” an already strained system.

There is a third way.

Many women surveyed across the U.K., Australia, and New Zealand discussed the need for “some kind of punishment, but not prison,” according to a report, Shattering Lives and Myths, published in the U.K. in 2019.

Taking Responsibility

A restorative-justice approach could answer that need, helping offenders understand the impact of their actions and take responsibility for them.

“Traditional cases place the government in one big bubble and the defendant in another,” Peters explained, “where the victim does not have equal bargaining power.”

By mapping out the rights and responsibilities of both plaintiff and defendant, Peters argued, restorative justice could rebalance the power differential, reducing the government’s role in favor of the victim’s.

In other words: No longer would a victim have to serve as a witness in the government’s attempt to prove an invasion of privacy or offense to public decency beyond a reasonable doubt. Instead, they would be able to describe in their own words the impact that nonconsensual intimate image sharing had on their life and livelihood. They would be able to ask questions of their own.

And they could have a say in what “restoration” might look like for them.

Restorative justice has already been used successfully.

Peters acknowledges that in order to work, restorative justice requires the right systems and structures.

“We have to ensure we meaningfully elevate the victim’s perspective in the criminal justice process, free of coercive dynamics that result in manipulation of victims to the offender’s benefit,” he explained.

For example, less experienced prosecutors may be assigned revenge-porn or domestic-violence cases, which are typically viewed as lower profile and, often, more frustrating to deal with. Disempowered victims may try to drop cases, while overextended prosecutors may offer plea bargains to defendants.

But frequently, said Peters, victims don’t fully understand the legal ramifications of a plea bargain, taking the prosecutor’s word that going to trial would result in undue stress, even if the plea deal also means a lesser sentence for the offender.

“Restoration can’t happen when we don’t have an accurate picture of what that looks like from the victim’s genuine perspective, free of undue influence,” Peters added.

Properly trained facilitators are one way to ensure this accuracy. While they needn’t be court-appointed, Peters said, a court-sanctioned model exists in the nonprofit organization Court Appointed Special Advocates. In the United Kingdom, facilitators enable dialogue to take place either in person or remotely.

The downside, reported NPR in 2017, is that improperly trained facilitators can do further damage. Other factors that can derail restorative justice include the failure to protect offenders from legal jeopardy under the Fifth Amendment.

Christa Miller

In some cases, restorative justice could even be misused when a traditional approach would be more appropriate—such as when a repeat offender is shown to prey on victims. In those cases, a traditional punitive approach and perhaps registered sex offender status would be called for.

However, these are solvable problems as long as legislators, the courts and policy-makers are willing to look closely at all the nuances, listen and learn from victims’ needs, and commit to finding solutions that are in step with everyone’s goals.
