Credit cards are obsolete. Is consumer debt heading the same way?

The technological history of credit.

Slate's Matt Yglesias, in a post about the effect higher bank capital requirements could have on the real economy, gives a brief overview of the changing nature of credit in America:

Once upon a time credit overwhelmingly meant business credit, which then expanded into the personal sphere primarily in the special case of houses and what you might call household investment goods (cars, large appliances). That then metastasised into the all-in culture of consumer debt and credit that we know from the past 25 years.

Yglesias' point is that high capital ratios will reverse that trend, boosting the price of consumer debt while making corporate debt cheaper. This, he adds, might not be a bad thing, "disproportionately encouraging business borrowing to finance investment while discouraging consumer borrowing to enhance consumption".

But what I find interesting is how that "metastasisation" of a relatively small field of debt into the widespread credit economy we now have was born. It was, broadly, a technological imperative, as the Financial Times' Isabella Kaminska points out:

The credit component in credit cards came into play because in the “old days” extending credit was the easiest way to transact remotely without the use of physical cash.

Any alternative back then would have involved waiting hours (if not days) for the merchant to call your bank, who would then verify who you were, who would then make a deduction from your account, who would then send an instruction to the merchant’s bank, whose bank would make a corresponding credit, who would both use different parties to clear and confirm the transaction. Sometimes by post.

It was basically much easier (from a velocity point of view) for a bank to guarantee to the merchant that you were good for the money by means of a piece of plastic. The transaction would take place and you would then owe the bank, whilst all the settlement processes continued on in the background. If you didn’t pay, it was between you and the underwriter bank. The merchant was covered. You were probably black-listed.

Initially, then, the fact that credit cards enabled people to freely and easily spend beyond their means wasn't deliberate — it was a by-product of the real aim, which was just to let people pay for things. It wasn't quite a bug in the system, because card issuers were always more than happy to let people pay off their credit card bills in instalments, racking up healthy interest payments in the process. But it was hugely important in getting the concept of borrowing to pay normal daily bills into people's heads.

Nowadays, of course, that technological imperative is nonexistent. Although they will take every possible opportunity to delay payments, squeezing marginal gains from the extra interest, banks are capable of transferring money instantly. At the very least, the fact that debit cards are now possible renders the initial rationale for credit cards obsolete.

Of course, if this apotheosis of the credit economy is something worth pushing back against, as Yglesias suggests, then doing so simply by raising interest rates is about the most damaging way possible. People have got used to boosting their standard of living with easy credit, and until they can achieve the same standard without resorting to credit, making it more expensive to borrow could backfire heavily.


Alex Hern is a technology reporter for the Guardian. He was formerly a staff writer at the New Statesman. You should follow Alex on Twitter.


There's nothing Luddite about banning zero-hours contracts

The TUC general secretary responds to the Taylor Review. 

Unions have been criticised over the past week for our lukewarm response to the Taylor Review. According to the report’s author we were wrong to expect “quick fixes”, when “gradual change” is the order of the day. “Why aren’t you celebrating the new ‘flexibility’ the gig economy has unleashed?” others have complained.

Our response to these arguments is clear. Unions are not Luddites, and we recognise that the world of work is changing. But to understand these changes, we need to recognise that we’ve seen shifts in the balance of power in the workplace that go well beyond the replacement of a paper schedule with an app.

Years of attacks on trade unions have reduced workers’ bargaining power. This is key to understanding today’s world of work. Economic theory says that near-full employment should enable workers to ask for higher pay – but we’re still in the middle of the longest pay squeeze for 150 years.

And while fears of mass unemployment didn’t materialise after the economic crisis, we saw working people increasingly forced to accept jobs with less security, be it zero-hours contracts, agency work, or low-paid self-employment.

The key test for us is not whether new laws respond to new technology. It’s whether they harness it to make the world of work better, and give working people the confidence they need to negotiate better rights.

Don’t get me wrong. Matthew Taylor’s review is not without merit. We support his call for the abolition of the Swedish Derogation – a loophole that has allowed employers to get away with paying agency workers less, even when they are doing the same job as their permanent colleagues.

Guaranteeing all workers the right to sick pay would make a real difference, as would asking employers to pay a higher rate for non-contracted hours. Payment for when shifts are cancelled at the last minute, as is now increasingly the case in the United States, was a key ask in our submission to the review.

But where the report falls short is in not taking power seriously.

The proposed new "dependent contractor status" carries real risks of downgrading people’s ability to receive a fair day’s pay for a fair day’s work. Here new technology isn’t creating new risks – it’s exacerbating old ones that we have fought to eradicate.

It’s no surprise that we are nervous about the return of "piece rates" or payment for tasks completed, rather than hours worked. Our experience of these has been in sectors like contract cleaning and hotels, where they’re used to set unreasonable targets, and drive down pay. Forgive us for being sceptical about Uber’s record of following the letter of the law.

Taylor’s proposals on zero-hours contracts also miss the point. Those on zero-hours contracts – working in low-paid sectors like hospitality, caring, and retail – are dependent on their boss for the hours they need to pay their bills. A "right to request" guaranteed hours from an exploitative boss is no right at all for many workers. Those in insecure jobs are in constant fear of having their hours cut if they speak up at work. Will the "right to request" really change this?

Tilting the balance of power back towards workers is what the trade union movement exists for. But it’s also vital to delivering the better productivity and growth Britain so sorely needs.

There is plenty of evidence from across the UK and the wider world that workplaces with good terms and conditions, pay and worker voice are more productive. That’s why the OECD (hardly a left-wing mouthpiece) has called for a new debate about how collective bargaining can deliver more equality, more inclusion and better jobs all round.

We know as a union movement that we have to up our game. And part of that thinking must include how trade unions can take advantage of new technologies to organise workers.

We are ready for this challenge. Our role isn’t to stop changes in technology. It’s to make sure technology is used to make working people’s lives better, and to make sure any gains are fairly shared.

Frances O'Grady is the General Secretary of the TUC.