Reddit matures, and apologises

The site's general manager has apologised for its conduct during the Boston crisis.

Reddit's general manager, Erik Martin, has apologised for the site's role in creating and spreading misinformation related to the Boston Marathon bombings:

Though started with noble intentions, some of the activity on reddit fueled online witch hunts and dangerous speculation which spiraled into very negative consequences for innocent parties. The reddit staff and the millions of people on reddit around the world deeply regret that this happened. We have apologized privately to the family of missing college student Sunil Tripathi, as have various users and moderators. We want to take this opportunity to apologize publicly for the pain they have had to endure. We hope that this painful event will be channeled into something positive and the increased awareness will lead to Sunil's quick and safe return home. We encourage everyone to join and show your support to the Tripathi family and their search.

The apology is interesting because it reflects how the rest of the world views Reddit far more than how the community views itself. The decentralised nature of the site means that almost everything Martin is apologising for is actually the fault of its users, rather than of the company which runs Reddit and which Martin heads. The subreddit r/findbostonbombers was set up and moderated by ordinary users; it was Reddit's users who posted personal information, and Reddit's users who led the witch hunts. Viewed from that angle, blaming "Reddit" for this tragedy seems like blaming "Twitter" for naming rape victims: a useful shorthand, but not something we'd expect the head of the company to apologise for.

But the Reddit community is still centralised in a way that Twitter isn't, and that has repercussions. Go to the front page of Reddit without being logged in, and you'll see the same list of content as everyone else - and the same list that many logged-in users see, as well. Hit up Twitter, on the other hand, and the site doesn't show you a thing until you've told it who you want to follow.

In other words, Twitter is a communications medium through and through, but Reddit – while not a publication in a traditional sense – has elements that we recognise from more conventional news sites. That means the site walks a fine line between trying to enable as much freedom for its users as possible, and having to deal with their mistakes as though someone on a salary made them.

Previously, the administration has been pretty unambiguous in declaring that it is not responsible for its users' actions, beyond the site's "park rules":

A small number of cases that we, the admins, reserve for stepping in and taking immediate action against posts, subreddits, and users. We don’t like to have to do it, but we’re also responsible for overseeing the park. Internally, we’ve followed the same set of guidelines for a long time, and none of these should be any surprise to anyone…

  1. Don’t spam
  2. Don’t vote cheat (it doesn’t work, anyway)
  3. Don’t post personal information
  4. Don’t post sexually suggestive content featuring minors
  5. Don’t break the site or interfere with normal usage of the site for anyone else

Those rules are not particularly restrictive, and #4 was only strengthened from the incredibly laissez-faire "no child pornography" last February. Beyond that, the admins have tended to stay silent in the face of what would seem to be noteworthy controversies, like the outing of Violentacrez by Gawker's Adrian Chen and the subsequent widespread banning of Gawker Media links from the site.

So it would have been easy for Reddit to respond to this latest problem in much the same way. Blame its users, point out that it has rules to prevent the worst of it and that it is deliberately laissez-faire about the rest, and wash its hands of the whole deal.

That it hasn't is a sign of maturity from the administrative team. But it also means that there are going to be a lot more controversies on which they'll be expected to have a view in future, unless the Reddit community matures at the same time. The chances of that happening soon remain slim.


Alex Hern is a technology reporter for the Guardian. He was formerly staff writer at the New Statesman. You should follow Alex on Twitter.


Age verification rules won't just affect porn sites – they'll harm our ability to discuss sex

Relying on censorship to avoid talking about sex lets children down.

The British have a long history of censoring sex. In 1580, politician William Lambarde drafted the first bill to ban "licentious" and "hurtful... books, pamphlets, ditties, songs, and other works that promote the art of lascivious ungodly love". Last week, the UK government decided to have another crack at censorship, formally announcing that age verification for all online pornographic content will be mandatory from April 2018.

It is unclear at this point what this mandatory check will entail, but it's expected that you will need to submit your credit card details to a site before being allowed to access adult content (credit cards can’t be issued to under-18s).

The appointed regulator will almost certainly be the British Board of Film Classification (BBFC), which will have the authority to levy fines of up to £250,000 on sites that do not comply, or to shut them down. These measures are being directly linked to research conducted by the NSPCC, the Children's Commissioner and the University of Middlesex in 2016, which surveyed more than 1,000 11-to-16-year-olds about viewing online pornography and found that over half had accessed it.

Digital minister Matt Hancock said age verification "means that while we can enjoy the freedom of the web, the UK will have the most robust internet child protection measures of any country in the world". And who can argue with that? No sane adult would think that it’s a good idea for children to watch hardcore pornography. And because we all agree kids should be watching Peppa Pig rather than The Poonies, the act has been waved through virtually unchallenged.

So, let's put the issue of hardcore pornography to one side, because surely we are all in agreement. I'm asking you to look at the bigger picture. It's not just children who will be censored, and it's not just Pornhub and Redtube which will be forced to age-check UK viewers. This act will potentially censor any UK site that carries adult content, which the BBFC broadly defines as content "produced solely or principally for the purposes of sexual arousal".

I am a UK academic who researches the history of sexuality. I curate the online research project www.thewhoresofyore.com, where academics, activists, artists and sex workers contribute articles on all aspects of sexuality in the hope of joining up conversations around sex that affect everyone. The site also archives many historical images, from the erotic brothel frescoes of Pompeii to early Victorian daguerreotypes of couples having sex. And yet, I do not consider myself to be a porn baron. These are fascinating and important historical documents that can teach us a great deal about our own attitudes to sex and beauty.

The site clearly signposts the content and asks viewers to click to confirm they are over 18, but under the Digital Economy Act this will not be enough. Although the site is not for profit and educational in purpose, some of the historical artefacts fit the definition of "pornographic" and are thereby liable to fall foul of the new laws.

And I'm not the only one: erotic artists, photographers, nude models, writers, sex shops, sex education sites, burlesque sites, BDSM sites, archivists of vintage erotica, and (of course) anyone in the adult industry who markets their business with a website can all be termed pornographic and forced to buy expensive software to screen their users, or risk being shut down or fined. I have contacted the BBFC to ask if my research will be criminalised and blocked, but was told that "work in this area has not yet begun and so we are not in a position to advice [sic] you on your website". No one is able to tell me what software I would need to purchase to collect viewers' credit card details, how I would keep them safe, or how much this would all cost. The BBFC suggested I contact my MP for further details. But she doesn't know either.

Before we even get into the ethical issues around adults having to enter their credit card details into a government database in order to look at legal content, we need to ask: will this work? Will blocking research projects like mine make children any safer? Well, no. The laws will have no power over social media sites such as Twitter, Snapchat and Periscope, which allow users to share pornographic images. Messenger apps will still allow users to sext, as well as to stream, send and receive pornographic images and videos. Any tech-savvy teenager knows that Virtual Private Network (VPN) software will circumvent UK age verification restrictions, and the less tech-savvy can always steal their parents' credit card details.

The proposed censorship is unworkable, and many sites containing nudity will be caught in the crossfire. If we want to keep our children "safe" from online pornography, we need to do something we British aren't very good at doing: we need to talk openly and honestly about sex and porn. This is a conversation I hope projects like mine can help facilitate. Last year, Pornhub (the biggest porn site in the world) revealed ten years of user data. In 2016, Brits visited Pornhub over 111 million times, and 20 per cent of those UK viewers were women. We are watching porn and we need to be open about this. We need to talk to each other and we need to talk to our kids. If you're relying on government censorship to get you out of that tricky conversation, you are letting your children down.

The NSPCC report into children watching online pornography directly asked the participants about the effectiveness of age verification, and said the children "pointed out its limitations". When asked what intervention would most benefit them, this was the overwhelming response: "Whether provided in the classroom, or digitally, young people wanted to be able to find out about sex and relationships and about pornography in ways that were safe, private and credible." I suggest we listen to the very people we are trying to protect and educate, rather than eliminate. 

Dr Kate Lister researches the history of sexuality at Leeds Trinity University