Twitter goes full douchebag

Twitter is to block most third-party apps that don't comply with its strict rules on access.

We've written at length about Twitter's attempts to safeguard the profitability of its network against all-comers, so this shouldn't come as any surprise: the company has confirmed that, from March 2013, it will begin enforcing a de facto ban on third-party apps.

The ban is revealed in two passages in a post to developers by Twitter's director of consumer product, Michael Sippey. The first tells developers that the company's "display guidelines" will become "display requirements", while the second explains that, from now on, any service with more than one million users will need special permission from Twitter to continue growing.

The display requirements are an incredibly strict set of rules which not only hit their intended target, third-party consumer clients like Tweetbot, Echofon or UberSocial, but also a huge number of unintended ones – Jason Kottke says that his aggregation site Stellar meets just four of the 16 requirements, while Marco Arment, developer of the popular Instapaper reading app, thinks that his "liked by friends" feature will have to be pulled, or at least rewritten, to comply.

Other rules look likely to hit services like Flipboard (which breaks 5.a., "tweets that are grouped together into a timeline should not be rendered with non-Twitter content. e.g. comments, updates from other networks") and Storify and Favstar (which break 3.b., "no other social or 3rd party actions may be attached to a Tweet"). Or they would, had Twitter not clarified that those latter two are actually the "good" apps. Ryan Sarver, the company's director of platform, tweeted that they are what the company wants in the ecosystem.

This ought to be good news - two of the most useful third-party apps are safe - but in fact, it's even more upsetting. It shows that, from the off, Twitter's rules all contain an implicit "...but you can ignore these if we like you." If that is the case, it's not hard to imagine that they also contain an implicit "...and no matter how well you follow these, if we don't like you, you're off the service." Everything using the network does so at the capricious whim of its overlords.

The million-user limit is even more indiscriminately applied. Any application, no matter what it does or how well it complies with the published rules, needs to "work with [Twitter] directly" to get more users than that. It is, essentially, a rule that gives the company carte blanche to pick and choose whether any service that gets too big is allowed to keep growing.

Most companies try to keep customers by keeping them happy. Twitter is clear in its intentions: it wants to keep customers by making it extraordinarily difficult for them to leave. It is holding its network hostage; you can go, but you can't take your friends with you.

In July, when Twitter first acted on its intention to block clients which "mimic or reproduce the mainstream Twitter consumer client experience", I wrote:

That is bad enough for the company, but up to now, the users of those apps are a minority on the service. The vast majority of twitterers use the website itself, or one of the official clients on mobile devices.

But with these changes, Twitter hasn't just hit the apps used by a small (nerdy) minority of users. There are going to be very few Twitter users who aren't affected in one way or another by this attempt to turn the site into a Facebook-style walled garden.

Ben Brooks, author of the Brooks Review, sums up the news:

We like to make analogies to Apple in tech blogging circles, so here goes: this is the moment in Twitter’s life where they kicked Steve Jobs out of the company and told Sculley to run it.

Alex Hern is a technology reporter for the Guardian. He was formerly staff writer at the New Statesman. You should follow Alex on Twitter.


Age verification rules won't just affect porn sites – they'll harm our ability to discuss sex

Relying on censorship to avoid talking about sex lets children down.

The British have a long history of censoring sex. In 1580, politician William Lambarde drafted the first bill to ban "licentious" and "hurtful... books, pamphlets, ditties, songs, and other works that promote the art of lascivious ungodly love". Last week, the UK government decided to have another crack at censorship, formally announcing that age verification for all online pornographic content will be mandatory from April 2018.

It is unclear at this point what this mandatory check will entail, but it's expected that you will need to submit your credit card details to a site before being allowed to access adult content (credit cards can’t be issued to under-18s).

The appointed regulator will almost certainly be the British Board of Film Classification, which will have the authority to levy fines of up to £250,000 or shut down sites that do not comply. These measures are being directly linked to research conducted by the NSPCC, the Children’s Commissioner and Middlesex University in 2016, which surveyed more than 1,000 11- to 16-year-olds about viewing online pornography and found that over half had accessed it.

Digital minister Matt Hancock said age verification "means that while we can enjoy the freedom of the web, the UK will have the most robust internet child protection measures of any country in the world". And who can argue with that? No sane adult would think that it’s a good idea for children to watch hardcore pornography. And because we all agree kids should be watching Peppa Pig rather than The Poonies, the act has been waved through virtually unchallenged.

So, let’s put the issue of hardcore pornography to one side, because surely we are all in agreement. I’m asking you to look at the bigger picture. It’s not just children who will be censored, and it’s not just Pornhub and Redtube which will be forced to age-check UK viewers. This act will potentially censor any UK site that carries adult content, which the BBFC broadly defines as material "produced solely or principally for the purposes of sexual arousal".

I am a UK academic and research the history of sexuality. I curate the online research project www.thewhoresofyore.com, where academics, activists, artists and sex workers contribute articles on all aspects of sexuality in the hope of joining up conversations around sex that affect everyone. The site also archives many historical images, from the erotic brothel frescoes of Pompeii to early Victorian daguerreotypes of couples having sex. And yet, I do not consider myself to be a porn baron. These are fascinating and important historical documents that can teach us a great deal about our own attitudes to sex and beauty.

The site clearly signposts the content and asks viewers to click to confirm they are over 18, but under the Digital Economy Act this will not be enough. Although the site is not-for-profit and educational in purpose, some of the historical artefacts fit the definition of "pornographic" and are therefore liable to fall foul of the new laws.

And I’m not the only one: erotic artists, photographers, nude models, writers, sex shops, sex education sites, burlesque sites, BDSM sites, archivists of vintage erotica, and (of course) anyone in the adult industry who markets their business with a website can all be termed pornographic and forced to buy expensive software to screen their users, or risk being shut down or fined. I have contacted the BBFC to ask if my research will be criminalised and blocked, but was told "work in this area has not yet begun and so we are not in a position to advice [sic] you on your website". No one is able to tell me what software I would need to buy to collect viewers' credit card details, how I would keep those details safe, or how much this would all cost. The BBFC suggested I contact my MP for further details. But she doesn’t know either.

Before we even get into the ethical issues around adults having to enter their credit card details into a government database in order to look at legal content, we need to ask: will this work? Will blocking research projects like mine make children any safer? Well, no. The laws will have no power over social media sites such as Twitter, Snapchat and Periscope, which allow users to share pornographic images. Messenger apps will still allow users to sext, as well as stream, send and receive pornographic images and videos. Any tech-savvy teenager knows that Virtual Private Network (VPN) software will circumvent UK age verification restrictions, and the less tech-savvy can always steal their parents' credit card details.

The proposed censorship is unworkable and many sites containing nudity will be caught in the crossfire. If we want to keep our children "safe" from online pornography, we need to do something we British aren’t very good at doing: we need to talk openly and honestly about sex and porn. This is a conversation I hope projects like mine can help facilitate. Last year, Pornhub (the biggest porn site in the world) revealed ten years of user data. In 2016, Brits visited Pornhub over 111 million times, and 20 per cent of those UK viewers were women. We are watching porn and we need to be open about this. We need to talk to each other and we need to talk to our kids. If you’re relying on government censorship to get you out of that tricky conversation, you are letting your children down.

The NSPCC report into children watching online pornography directly asked the participants about the effectiveness of age verification, and said the children "pointed out its limitations". When asked what intervention would most benefit them, this was the overwhelming response: "Whether provided in the classroom, or digitally, young people wanted to be able to find out about sex and relationships and about pornography in ways that were safe, private and credible." I suggest we listen to the very people we are trying to protect and educate, rather than eliminate. 

Dr Kate Lister researches the history of sexuality at Leeds Trinity University