Twitter Needs To Do Something About Its Child Porn Problem

Last Thursday, Canadian officials announced the bust of an enormous child pornography operation that has led to the arrest of 300 people, 76 of them in the United States. Among those arrested in the investigation, nicknamed “Project Spade,” were doctors, school teachers, priests, and actual parents. It’s terrifying, and it makes your soul hurt, the very idea of child pornography; I think the idea of any child being victimized and hurt in this way makes almost every human feel a combination of extreme anger, intense sadness, and revulsion. I know it does me.

It’s great when cases like this happen; we like to think that maybe the world is just a little bit safer for kids, and according to an article on RT.com, 386 children around the world who were identified as being at risk were rescued. That’s good news. But it’s not nearly enough, because child pornography doesn’t just exist in the shadowy recesses of the internet; it’s also available in plain sight to anyone with a computer and a Twitter account.

Twitter has come under attack recently because it contributes very little money to combating child pornography. According to The Independent:

As Twitter prepares for an estimated $1.1 billion flotation on the stock market, the company is dragging its feet over raising its contribution to combatting online child abuse and pornography from just £5,000.

The company is a member of the Internet Watch Foundation (IWF), an organisation backed by Prime Minister David Cameron to work in collaboration with internet service providers to clear up the murky and chaotic world of online abuse.

And while some other tech companies have pledged up to £1 million in support to the industry-funded charity, Twitter insists it shouldn’t have to pay any more than the tiny sum it already does because it isn’t turning a profit.

According to an article on NPR, Twitter uses a Microsoft system called PhotoDNA to scan every photograph uploaded to its site and match it against previously identified images of child abuse, each of which is given a unique signature before being entered into a database. If one of these photographs reappears on the Internet, it is instantly flagged and removed.
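
To make that mechanism a little more concrete: systems like PhotoDNA compute a robust “fingerprint” of every uploaded image and compare it against a database of signatures taken from material that has already been identified and reviewed. PhotoDNA itself is proprietary, so the sketch below is only a rough illustration of that general idea, using the open-source imagehash library’s perceptual hash in place of a real signature; the KNOWN_SIGNATURES value and the flag_and_remove() step are hypothetical stand-ins, not anything Twitter or Microsoft actually exposes.

# Conceptual sketch only -- NOT the real PhotoDNA algorithm, which is proprietary.
# Assumes the open-source Pillow and imagehash packages; KNOWN_SIGNATURES and
# flag_and_remove() are hypothetical stand-ins for the real signature database
# and the takedown/reporting step.
from PIL import Image
import imagehash

# Signatures of previously reviewed, known abuse images (hypothetical example value).
KNOWN_SIGNATURES = {
    imagehash.hex_to_hash("f0e4c2d8a1b39657"),
}

MAX_DISTANCE = 5  # how many differing bits still count as the "same" image


def flag_and_remove(path: str) -> None:
    """Hypothetical stand-in for flagging, removing, and reporting an upload."""
    print(f"Flagged for removal and report: {path}")


def is_known_image(path: str) -> bool:
    """Fingerprint an uploaded image and compare it against known signatures."""
    uploaded = imagehash.phash(Image.open(path))
    # Perceptual hashes of near-identical images differ by only a few bits,
    # so a small Hamming distance counts as a match even after resizing or minor edits.
    return any(uploaded - known <= MAX_DISTANCE for known in KNOWN_SIGNATURES)


def scan_upload(path: str) -> None:
    """Scan one uploaded photo against the database, then act on any match."""
    if is_known_image(path):
        flag_and_remove(path)

The point is simply that matching a known image is cheap and automatic once a signature exists; the hard, human part is getting an image reviewed and into the database in the first place, which is what the next paragraph is about.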

In order for a photograph to be in that database in the first place, it has to be reviewed by an actual human. The people who perform this sort of work are content moderators employed by tech companies, and I’m pretty positive it must be one of the most horrific careers someone can go into. But it’s not enough.

Yesterday I was contacted by someone who asked me to look into why there are so many current child abuse images available on Twitter. You don’t have to follow or be followed by the Twitter user sharing these photographs; if you know their account name, all you have to do is click on their profile and all of their images are readily available. Some of these accounts have thousands of followers and some have very few, but what they all have in common is that they are hosting child abuse images on their accounts, re-tweeting images sent to them, and requesting that others share more. I clicked on one screen name sent to me to verify these claims, and I briefly saw images of girls, very young girls, well below the age of even the usual exploitative moniker of “Barely Legal” – topless, in provocative poses.

From Twitter user @Snaisy:

Twitter’s complete reliance on user reports leaves a locked account 100% safe from detection. These images tend to be tweeted and swapped without hashtags to keep them relatively off-radar, also there were a few images of pretty clearly underage girls tweeted with “18+” in the tweet, I reported a few and none were removed. I assume there was no verification process for this. Tweeting about underage fantasies or having them in your bio is not grounds for suspension.

And then an update I received from her:

At least 2 of the accounts I reported haven’t been suspended, despite having full blown, sexualised images of obviously young (as in YOUNG) girls. I am going to link to the Twitter accounts, and I am also going to suggest that you don’t look unless you absolutely, positively, definitely have to confirm that this is Twitter really not doing their fucking job.

According to Twitter, here is their stance on child pornography:

Child sexual exploitation

We do not tolerate child sexual exploitation on Twitter. When we are made aware of links to images promoting child sexual exploitation they will be removed from Twitter without further notice and reported to The National Center for Missing & Exploited Children (“NCMEC”). We permanently suspend accounts promoting or containing updates with links to child sexual exploitation.

If you encounter a Twitter account distributing or promoting child sexual exploitation, please send an email to cp@twitter.com to let us know. Include a link to the profile and links to the relevant tweets. To find the URLs of individual Tweets, see our help page.

NOTE: Please do NOT tweet, retweet or repost child sexual exploitation for any reason. Report it to us immediately by sending an email to cp@twitter.com and we will take steps to remove it.

For more information about Twitter’s child sexual exploitation policy, click here.

From what I have seen, reporting these accounts isn’t enough because as of this morning, many remain.

Last night an online petition was started here to urge Twitter to do more to combat child pornography. You can also follow the accounts of @Snaisy and @ReportAPedo, who have been tirelessly contacting Twitter with messages using the hashtag #TwitterCP.

[Screenshots of tweets using the #TwitterCP hashtag]

As far as I’m aware, both accounts will be listing the names of accounts that need to be reported to Twitter, and anyone with a Twitter account can do this. At the very least, awareness has to be raised about this.

We (and I mean this in the collective) care a lot about women’s issues. We care about rape, we care about violence against women, and when we see instances of injustice against women we respond: we have Twitter storms, we sign petitions, we write articles, we rally and protest. Which we should do. That’s all important. But it’s almost as if the idea of children being victimized in this way is so awful to even contemplate that we don’t rage as hard or bring enough attention to cases like these. This is unacceptable. The fact that I can sign on to my Twitter account and see these images is unacceptable.

This isn’t a free speech issue. This isn’t taking away anyone’s freedom of thought or opinion. This is stopping the spread of child pornography on the internet. Facebook is usually all too happy to flag or delete images they deem “indecent” – including photos of breastfeeding moms in lactation group pages. It’s time a website like Twitter took a stronger stance in combating the sharing and spreading of photographs of child abuse.

(Image: Getty Images)
