
Tech Companies Fail to Adequately Respond to Revenge Porn Problem

Revenge porn is the sharing of intimate images of someone without their consent. The practice gets its name because it is often carried out by a scorned former partner seeking to damage the subject’s public image. It is an abhorrent practice that is currently illegal in 46 states and Washington, D.C.

Of course, it is easy to vilify this practice at a theoretical level, but carrying out real-world punishment has proved challenging. Many of the tech platforms that perpetrators rely on to post revenge porn images (e.g., Facebook) have depended on user reporting. This becomes problematic when revenge porn creators and distributors form online groups for the sole purpose of sharing such harmful images. No one subscribed to such a group is going to report it.

AI is another option for tech companies, but the software is not yet sophisticated enough to understand context. The difference between an underwear ad and revenge porn may be subtle to an algorithm, but it is anything but subtle to a victim. Tech companies are spinning their wheels looking for a better way to handle this new type of sexual assault, and in the meantime, the subjects of the photos are continually re-victimized while their abusers have free rein.

This is not a new problem within the tech industry. In November 2019, the New York Times published a piece reporting that many tech companies know explicit images of children are shared on their platforms and intentionally look the other way. The article calls tech companies to task, asserting that “approaches by tech companies are inconsistent, largely unilateral and pursued in secret, often leaving pedophiles and other criminals who traffic in the material with the upper hand. The companies have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of the material. Yet the industry does not take full advantage of the tools.”

It is time for tech companies to step up and take responsibility for the abusive images shared on their sites. Both child and adult victims deserve to be made a priority when it comes to their pictures being shared without consent. 

If someone has shared your private information and images publicly without your consent, we are here to help you take back your power. Call Mandana Jafarinejad, Esquire, at 855.316.3001; she has helped dozens of victims reclaim their voices and their control through legal action.
