
1 November 2017 (updated 27 November 2017)

Apple is cataloguing photos of people in bras – is it justified to freak out?

The iPhone's “bra” photo album exposes a serious privacy flaw – but it's not the one you think.   

By Amelia Tait

You can test it out for yourself. Pick up your iPhone, go to Photos, and click the little magnifying glass. Type “Brassiere” – if you have any, you’ll soon realise that Apple has been cataloguing pictures of you/your loved ones/your mates in their bras.

Apple has been doing this for over a year, but most people only realised this week. A viral tweet on Monday and a follow-up on Tuesday mean thousands of people are now aware of the feature. But as celebrity model Chrissy Teigen asked when she first exposed the feature to her 8.25 million Twitter followers: why? Why does it exist?

Image categorisation has been a feature on iPhones since the introduction of iOS 10 in June last year. Searching your photos will reveal the thousands of categories Apple caters for, from the ordinary (cocktails, puppies, birthday cake) to the slightly more bizarre (dance palace, firearm, clock towers). There are nearly 4,500 categories in total, a number announced by developer Kenny Yin over a year ago.


Is this creepy? Those who just found out about the feature seem to think so – but Apple reassures its users that its image recognition AI works entirely on your device, maintaining the privacy of your pictures. Only you can see your brassiere shots, and no humans have been involved in picking them out and plonking them in an album for you. It’s also worth noting that the “brassiere” album is about as saucy as it gets – Apple doesn’t categorise pictures with labels like “underwear”, “nipples”, “nude” or “naked”.

Despite this, brassiere-gate has still exposed a serious privacy flaw.

It’s nothing to do with Apple, and it’s nothing to do with any bosoms that may or may not be in your camera roll. The privacy flaw exposed by this episode is simple: the flaw is the way we think about privacy itself.

It’s easy to freak out about how technology invades our private lives when nip pics are on the line. But similar image recognition to Apple’s is being used by other companies for far more nefarious ends. Do you want to live in a world where the police use facial recognition software to identify “troublemakers” at Notting Hill carnival? Where Facebook can automatically tag pictures of you? Where the iPhone X’s facial recognition could be used by brands to check you really are watching their ads?

Even these aren’t the real privacy issues we face today. It’s much easier to view tech as dystopian when it’s literally looking at you and/or your underwear pics, but in reality our privacy is invaded in far more boring (yet still disturbing) ways. Your personal data is being collected, bought and sold every day when you use Google, Amazon, Apple, Facebook, Instagram and the rest.

The fact Apple’s image categorisation has existed for over a year and is only hitting headlines now illustrates how little we’re tuned in to the decisions companies make about our lives.

Yet again this week, headlines were made when people suspected Facebook was using microphones to listen to their conversations and then target ads at them. Facebook said it wasn’t – but why do we repeatedly freak out about this and not about the fact that the social network uses practically everything but the microphone to serve you ads? Listening in is spooky, sure, but so is monitoring your emotional state, keeping track of your hobbies, handing over your personal details to the British government, and using your phonebook to recommend you friends.

It’s even creepier that we often don’t know how Facebook has figured out stuff about us. I turned vegetarian around four months ago, and a search of Facebook’s Activity Log shows I haven’t used the word “vegetarian” on the social network since 16 March 2013. So why has the site’s ad preferences toolbar suddenly started listing “vegetarianism” and “vegetarian cuisine” under my interests?


The fact that Apple’s “bra” album has made people think twice about privacy is still a good thing. Hopefully it will open up conversations about what we’re consenting to when we buy and use tech. But in the end, Apple’s AI hasn’t exposed anyone’s breasts – it has exposed our ignorance. 
