Science & Tech
22 July 2013, updated 22 Oct 2020 3:55pm

Would the Daily Mail website fall foul of the online porn filters it has championed?

Ban this sick filth. No, not THIS sick filth, obviously.

By Alex Hern

David Cameron wants to block online porn, the Daily Mail reported approvingly this morning.
Now comes the big question: where should the line be drawn? If you’re too lax, things slip through the filter; if you’re too strict, non-pornographic images or sites get caught in the net. How do you tell what’s pornographic and what isn’t? “I know it when I see it,” said Justice Potter Stewart in 1964. But it isn’t that simple.

Despite the best efforts of programmers everywhere, you can’t just tell a computer “block any page with an image or video of a female nipple or male or female genitalia” (a rule which, itself, would be hopelessly over-strict; farewell Titian! So long, Leonardo!). Instead, most blocking software uses contextual clues on the page to work out whether the site itself is problematic. That can be obvious: there are few safe-for-work sites which use the phrase “double penetration”, for instance. Except this site, now – which explains part of the issue.


Existing filters show that if you want to make the blocking comprehensive, some of the contextual clues used have to be broad enough to make collateral damage certain. Blocking anatomical terms like “vagina” or “anus”, for instance, frequently leads to sites discussing sexual health or feminist topics being caught up.
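To see why that collateral damage is so hard to avoid, consider a minimal sketch of the kind of keyword-based filter described above. The blocklist, threshold, and example text here are illustrative assumptions, not any real filter’s configuration:

```python
# A toy keyword filter of the sort discussed above: block a page if it
# contains enough terms from a blocklist. (Illustrative only.)

BLOCKED_TERMS = {"vagina", "anus", "double penetration"}

def is_blocked(page_text: str, threshold: int = 1) -> bool:
    """Return True if the page contains at least `threshold` blocked terms."""
    text = page_text.lower()
    hits = sum(1 for term in BLOCKED_TERMS if term in text)
    return hits >= threshold

# Collateral damage: a sexual-health page trips the same rule porn does.
health_page = "Cervical screening examines cells from the vagina and cervix."
print(is_blocked(health_page))   # True – a legitimate medical page is blocked
print(is_blocked("A review of Titian's mythological paintings."))  # False
```

Raising the threshold lets the health page through, but also lets through any pornographic page sparing in its vocabulary, which is exactly the lax-versus-strict trade-off described above.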

Meanwhile, humans – being cleverer than filters – learn to use terms which can’t be blocked because they also have innocuous uses. (Take, at the extreme end, the use of “Lolita” to denote images of child abuse.)

The thing is, even if all the technological quirks were worked out, drawing the line is still hard, just in terms of choosing how prudish we as a nation are. So where do we start blocking?

Pictures of women in their underwear?

Sexual, nude but non-explicit photos?

Pictures of women with clearly visible breasts?

Topless pictures of prostitutes in 1940s Paris?

Playboy-style photo-shoots?

Non-explicit pictures of people having sex?
Well, would you want your children seeing that kind of material? Ban this sick filth.
