"Instagram act" under fire for treatment of copyrighted works

Is the Government handing your photos to media giants?

The Government's Enterprise and Regulatory Reform Act, which became law last week with the end of the 2012/2013 parliamentary session, has come under attack over its treatment of so-called "orphan works".

The act aims to create a legal route for publishers to use copyrighted material whose author is unknown or cannot be traced. In the past, orphan works were typically older media, such as out-of-print books, with little to no contact information available. Those works still cause problems, and are covered by the Enterprise and Regulatory Reform Act, which ought to aid plans to catalogue them, such as Google's audacious attempt to scan every book in America.

But the reason why orphan works are kicking up such a fuss now is that more and more works are being orphaned shortly after creation, thanks to the internet. You can see it all the time online: a photo is tweeted, someone cross-posts it to Facebook, someone else reposts it to Twitter from there, it makes it over to Tumblr, and then is incorporated into a Storify which a media organisation reports on. In such circumstances, it can very quickly become nearly impossible to track down the original image. That's why the law has been nicknamed the "Instagram act".

As a result, the act comes up with a sticking-plaster solution: any publisher that performs a "diligent search" and fails to identify the creator of an orphaned work can use it without fear of a copyright infringement suit. The scheme is envisaged to be similar to that administered by the PRS (the Performing Right Society), which collects money from establishments which play recorded music and distributes it to artists; but since the details will be filled in by secondary legislation, we don't know exactly how similar.

There are therefore reservations about how well the system will work in practice. For instance, the Register's Andrew Orlowski writes:

For the first time anywhere in the world, the Act will permit the widespread commercial exploitation of unidentified work - the user only needs to perform a "diligent search". But since this is likely to come up with a blank, they can proceed with impunity. The Act states that a user of a work can act as if they are the owner of the work (ie, you) if they're given permission to do so by the Secretary of State, acting as a regulated body. The Act also fails to prohibit sub-licensing, meaning that once somebody has your work, they can wholesale it. This gives the green light to a new content scraping industry, an industry which doesn't have to pay the originator a penny. Such is the consequence of "rebalancing copyright," in reality.

Much depends on the definition of a "diligent search"; if, as Orlowski suggests, it is merely a formality for any image which isn't obviously attributed, then real problems could occur. Already, it is relatively standard practice at many high-turnover outlets to crop out watermarks on images and republish them credited to "Twitter" or "Facebook" – a copyright notice which has no legal backing – so it would not be surprising to see similar publications try to get away with woefully substandard searches.

But without some shady dealings (admittedly, discounting shady dealings might be a fool's game), it's hard to see how the act will lead to a situation where "most digital images on the internet" become exploitable. Although metadata – embedded information about an image's provenance – is frequently stripped out on upload, unless the image goes through a tortuous chain of reposts like the one described above, a diligent search would still find the original uploader.

Nonetheless, the balance of power does appear to have shifted firmly towards publishers and away from artists. That could wind up being ripe for abuse, but it could also fix the system we have now, where artists ostensibly have the power but have very little ability to use it. We will have to wait and see which is the case.


Alex Hern is a technology reporter for the Guardian. He was formerly staff writer at the New Statesman. You should follow Alex on Twitter.


The problems with ending encryption to fight terrorism

Forcing tech firms to create a "backdoor" to access messages would be a gift to cyber-hackers.

The UK has endured its worst terrorist atrocity since 7 July 2005 and the threat level has been raised to "critical" for the first time in a decade. Though election campaigning has been suspended, the debate over potential new powers has already begun.

Today's Sun reports that the Conservatives will seek to force technology companies to hand over encrypted messages to the police and security services. The new Technical Capability Notices were proposed by Amber Rudd following the Westminster terrorist attack and a month-long consultation closed last week. A Tory minister told the Sun: "We will do this as soon as we can after the election, as long as we get back in. The level of threat clearly proves there is no more time to waste now. The social media companies have been laughing in our faces for too long."

Put that way, the plan sounds reasonable (orders would be approved by the home secretary and a senior judge). But there are serious problems. Encryption means tech firms such as WhatsApp and Apple can't simply "hand over" suspect messages - they can't access them at all. The technology is designed precisely so that conversations are genuinely private (unless a suspect's device is obtained or hacked into). Were companies to create an encryption "backdoor", as the government proposes, they would also create new opportunities for criminals and cyber-hackers (as in the case of the recent NHS attack).
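The structural point is worth spelling out: in an end-to-end design, only the two endpoints hold the key, so the service in the middle relays ciphertext it cannot read. A deliberately simplified sketch illustrates this – it uses a toy XOR keystream for brevity, not a real cipher, and the names are illustrative, not any actual messaging protocol:

```python
# Toy illustration (NOT real cryptography) of why an end-to-end
# encrypted service has nothing intelligible to "hand over":
# the relay server never holds the key, only the ciphertext.
import hashlib
import secrets


def keystream(key: bytes, length: int) -> bytes:
    """Expand a shared key into a deterministic pseudo-random keystream."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; XOR twice restores it."""
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))


decrypt = encrypt  # XOR is its own inverse

# The two endpoints share a key; the server in the middle does not.
shared_key = secrets.token_bytes(32)
message = b"meet at noon"
ciphertext = encrypt(shared_key, message)

# All the provider can see, store, or surrender is the ciphertext:
assert ciphertext != message
# Only a holder of the shared key can recover the plaintext:
assert decrypt(shared_key, ciphertext) == message
```

A mandated "backdoor" would mean a second key, or a weakened keystream, that someone other than the endpoints holds – and anyone who steals it gains the same access.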

Ian Levy, the technical director of the National Cyber Security Centre, told the New Statesman's Will Dunn earlier this year: "Nobody in this organisation or our parent organisation will ever ask for a 'back door' in a large-scale encryption system, because it's dumb."

But there is a more profound problem: once created, a technology cannot be uninvented. Should large tech firms end encryption, terrorists will merely turn to other, lesser-known platforms. The only means of barring UK citizens from using such services would be a Chinese-style "great firewall", cutting Britain off from the rest of the internet. In 2015, before entering the cabinet, Brexit Secretary David Davis warned of the consequences of ending encryption: "Such a move would have had devastating consequences for all financial transactions and online commerce, not to mention the security of all personal data. Its consequences for the City do not bear thinking about."

Labour's manifesto pledged to "provide our security agencies with the resources and the powers they need to protect our country and keep us all safe", but added: "We will also ensure that such powers do not weaken our individual rights or civil liberties." The Liberal Democrats have vowed to "oppose Conservative attempts to undermine encryption".

But with a large Conservative majority inevitable, according to polls, ministers will be confident of winning parliamentary support for the plan. Only a rebellion led by Davis-esque liberals is likely to stop them.

George Eaton is political editor of the New Statesman.
