Tech Company Was Complicit in Sex Trafficking

Overview

Sex trafficking can be facilitated by tech companies in several ways.

In the case of an entity such as Backpage.com, advertisements for the sale of in-person sexual encounters with sex trafficking victims were posted on, and thereby facilitated by, a digital platform. This continues on sites that have cropped up since Backpage’s demise, and even on social media platforms like Twitter.

Another example is pornography websites that profit from videos depicting criminal activity and abuse, including sexual assault, sex trafficking victims, and minors. This can include videos or photos made as a result of threats of force, fraud, or coercion.

Similarly, sex traffickers can profit from the sale of their victims for digital sexual encounters such as strip shows and sexual acts performed via live video stream on platforms such as Chaturbate or OnlyFans (known as “cam sites”). These cam-site profiles are often advertised using free videos and clips on more mainstream social media platforms like Twitter and Snapchat.

In still other instances, insufficient safeguards for child users online have allowed perpetrators to find and groom children for sex trafficking via social media platforms or video game messaging functions. This sex trafficking includes not only in-person sexual encounters but also the creation and sharing of child sexual abuse material (CSAM, often referred to colloquially as “child pornography”) via social media and pornography websites. Predators groom and coerce children into creating CSAM, many times by posing as peers initially, and then share and even monetize this content online.

In all of these instances—and others like them—it is possible that technology companies can be found liable and held accountable for facilitating and benefitting from sex trafficking.

Defining Terms and Issues Related to Tech Companies and Sex Trafficking

Digital platforms are increasingly used to facilitate online sex trafficking. Predators are skilled at identifying vulnerable targets like children posting about their insecurities, problems in their relationships, or someone who is simply young and inexperienced. It’s easy for predators to pretend to be a friend and supporter while hiding behind a keyboard.

When these interactions occur on a familiar or commonly used platform, it is easy to feel like the situation is safe, especially if the trafficker has invested some time into building a relationship. Many people these days have friends they only know from social media, so it’s understandable how we can drop our guard in these circumstances.

In some cases, recruiting for prostitution and trafficking occurs on dating apps where users are supposed to be able to “safely” connect with new people. Sexual violence related to dating app connections is very real and sexual harassment on dating apps is common.

Other tech companies that cause concern are websites that promote and financially profit from “sugar dating.” Sugar dating is when a man, who is usually older and financially well off (called a “sugar daddy”), offers to pay a woman, who is usually younger and socially or financially disadvantaged (called a “sugar baby”), to date him. Many sites claim these arrangements are fun and harmless and do not require sex. On the contrary, sex is almost always expected, and many young women have found themselves in situations they did not choose or expect. This becomes a form of forced prostitution. Sugaring has also led to sexual assault.

Tech companies have a responsibility to maintain safeguards against such abuses. If they fail to do so or if they knowingly profit from these abuses, then they may be legally liable.

Sexual Exploitation via Twitter

Twitter openly allows pornographic content on its platform. At the very least, Twitter turns a blind eye to advertisements for commercial sex, which are rampant on the site as well. This has led to an increase in traffickers using the platform to advertise their victims for commercial sex, even creating and controlling their accounts for this purpose. Furthermore, children as young as 13 are allowed to create an account, which makes them vulnerable to grooming and exploitation by predators. Twitter’s policy of allowing hardcore pornographic content also opens the door for image-based sexual abuse (non-consensually shared pornography, aka “revenge porn”) and CSAM (child sexual abuse material, aka “child porn”) to be distributed on its platform.

Twitter has a duty to maintain safeguards against such abuses, especially for the children on its platform. If it fails to do so or if it knowingly profits from such abuse, then it may be legally liable.

Could I Qualify for a Lawsuit?

Confidentiality: The National Center on Sexual Exploitation (NCOSE) may use or maintain the information provided here to contact you for further details regarding your experience. Information from this form is strictly confidential. Access to confidential information is permitted only on a need-to-know basis and limited to the minimum amount of confidential information necessary to accomplish the intended purpose of the use, disclosure, or request. By submitting this information, you authorize representatives of NCOSE to contact you for further information, and confirm your understanding that the provision of information on this form does not establish a client-attorney relationship between you and NCOSE.

Disclaimer: The Law Center cannot guarantee legal representation to every injured person with a potential legal claim, and the submission of this questionnaire does not create an attorney-client relationship. Except in situations directly related to litigation, the Law Center generally cannot assist with record expungement requests.

Did someone upload sexually explicit images of you as a minor, or without your knowledge or consent?

Were you ever forced, coerced, pressured, or tricked into performing paid sex acts?

Did someone solicit or upload sexually explicit images of your child over the Internet?