Just over a century ago, an up-and-coming minister in his mid-30s stood up in the House of Commons and delivered a withering indictment of the outcome of a free market in pay. He argued that the state should now intervene in defence of living wages:
It is a serious national evil that any class of His Majesty’s subjects should receive less than a living wage in return for their utmost exertions.
The speaker was an unlikely young firebrand, the grandson of a duke. He was one Winston Churchill, then a Liberal, introducing the Trade Boards Bill of 1909, which established minimum wages in selected trades at a national level for the first time in Britain. Churchill’s words came at a time when income inequalities were close to their 20th-century peak, and the cosy establishment consensus that capitalism worked was facing unprecedented moral and political challenges, not least from the rise of the labour movement. Hence Churchill’s indictment:
It was formerly supposed that the working of the laws of supply and demand would naturally regulate or eliminate that evil and ultimately produce a fair price …
But where you have what we call sweated trades, you have no organisation, no parity of bargaining, the good employer is undercut by the bad, and the bad employer is undercut by the worst. Where those conditions prevail you have not a condition of progress, but a condition of progressive degeneration.
A century on, the campaign for a living wage is enjoying remarkable success. As in 1909, the living wage is not merely a campaigner’s pipe dream, but has re-entered the political mainstream and featured prominently in the party manifestos at the 2017 general election.
But is the resurgence of this term merely a passing fashion, or might it signal a sustainable shift in our approach to wages that could make serious inroads into Britain’s low pay culture? In researching my new book on the subject with Laura Valadez, I was struck by the degree to which the story of the past century sheds light on the role that the living wage is playing in today’s political economy.
An early crisis of capitalism
The breakdown of the Victorian political and economic equilibrium in the first years of the 20th century, memorably described by George Dangerfield’s classic The Strange Death of Liberal England, reflected a growing challenge to laissez-faire capitalism and its failure to prevent hunger and squalor. The economist Adam Smith had asserted that freely operating markets would assure decent living standards for workers, because otherwise capable labour would become scarce, and the price would be bid up.
This was belied by the appalling conditions of the sweatshops of the industrial revolution, publicised by reformers such as Beatrice Webb. She catalogued “earnings barely sufficient to sustain existence; hours of labour such as to make the lives of the workers periods of almost ceaseless toil, hard and unlovely to the last degree”.
Such injustices fed both a growing movement for reform and an assertion of labour power through trade unions. On the reform side, Churchill’s introduction of a basic minimum wage in only the worst-paid occupations was too modest in its scope to have much impact, in contrast to more comprehensive minimum wage legislation in countries such as Australia and later the United States, which introduced nationwide minimum wages in 1907 and 1938 respectively.
As Dangerfield pointed out, British Liberals tended to be strong on social outrage but mild on social regulation. This helped explain the party’s demise and replacement with a party representing organised labour. For much of the 20th century, Britain’s strong labour movement was able to negotiate higher pay selectively through collective bargaining, rather than through a general minimum wage.
The tax credit compromise
By the end of the century, a weakening of collective bargaining associated with global competition and a decline in union power created a stronger case for government action to increase low wages. Against strong opposition from the Conservative Party, a statutory minimum wage was introduced in 1999 by the Labour government.
It was set at an overly cautious level of £3.60 in order to reduce the risk that low-skill workers would be priced out of jobs. Even so, there remained fears among free-market, neoliberal economists that any wage set above the rate determined by the market would reduce employment levels.
Its introduction was a flagship policy of the Labour Party’s 1997 campaign, and complemented a new role of the state in helping workers and their families to make ends meet through means-tested top-ups of working incomes. But wage supplementation has in many ways been an alternative to intervening in pay. The fact that the first two versions of these top-ups (family income supplement and family credit) were introduced by the Conservative governments of Edward Heath and Margaret Thatcher was no coincidence.
Thatcher in particular wanted to maintain incentives for former manufacturing workers to take up new kinds of jobs, such as in the low-paid service sector. It worked with the Conservatives’ laissez-faire approach to government by not interfering with business and giving employers cheap (and competitive) labour. To make work worthwhile, the state topped up family incomes.
Continuing in this spirit, the then-chancellor Gordon Brown’s tax credits featured far more prominently than his national minimum wage in efforts to boost low working incomes. Indeed, by his 2002 budget speech he was boasting that a £4.20 an hour minimum wage was really £6 an hour for working families once tax credits were taken into account.
Tax credits are better at addressing low household income than minimum hourly wages, since they take account of how many hours a family works and how many mouths it has to feed. So families with more children will get more of a top-up from the state.
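The contrast between a flat hourly minimum and a means-tested top-up can be sketched with a toy calculation. All figures below (the per-child credit and the taper rate) are entirely hypothetical, chosen only to illustrate the mechanism, not to reflect any actual UK tax credit scheme:

```python
# Schematic illustration of why a means-tested top-up responds to household
# need while a flat hourly minimum does not. All rates are hypothetical.

HOURLY_MINIMUM = 4.20    # hypothetical minimum wage, pounds per hour
TOPUP_PER_CHILD = 30.0   # hypothetical weekly credit per child, pounds
INCOME_TAPER = 0.40      # hypothetical rate at which the credit is withdrawn

def weekly_income(hours_worked: float, children: int) -> float:
    """Wage income plus a means-tested top-up that tapers with earnings."""
    wages = hours_worked * HOURLY_MINIMUM
    credit = max(0.0, children * TOPUP_PER_CHILD - INCOME_TAPER * wages)
    return wages + credit

# Two households on the same hourly rate receive very different support:
# a lone parent with three children working 20 hours a week, versus a
# childless full-timer on 40 hours.
print(weekly_income(20, 3))  # wages 84.00 + credit 56.40 = 140.40
print(weekly_income(40, 0))  # wages 168.00, no credit
```

The point of the sketch is that the wage floor alone gives the lone parent half as much as the full-timer, while the top-up closes part of the gap by reflecting family size, which is exactly why governments favouring cheap labour found top-ups attractive.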
Yet relying on them excessively to correct for low pay and sporadic working hours has increased the cost to taxpayers (about £30 billion is spent on tax credits overall). Meanwhile, it has not significantly reduced the risk of poverty for children whose parents work, because the earnings of low-paid workers have fallen so far behind living costs.
Today, the debate has shifted from the idea of a minimum wage for workers to a “living wage”, which conveys the idea that wages should be enough to provide a decent standard of living. This idea originated with a grassroots movement involving cleaners and other workers protesting against wage levels insufficient to allow full-time workers even to support themselves, let alone a family, at an adequate level. It grew to a national movement to urge employers to sign up to a living wage calculated to meet minimum acceptable living standards, based on research I’ve led at Loughborough University.
The campaign’s success in recruiting thousands of living wage employers, combined with growing discontent over stagnant incomes following the 2008 financial crisis, helps explain a dramatic about-turn by the Conservative government in its July 2015 budget. Declaring that “Britain needs a pay rise”, George Osborne boldly reversed the principle of favouring state top-ups in order to keep wages competitive.
Osborne’s new mantra was to move the country from a “low-wage, high-tax, high-welfare society to a higher-wage, lower-tax, lower-welfare economy”. More tangibly, he introduced a compulsory National Living Wage for over-25s, set to rise to around £9 an hour from 2020 (compared to the £6.50 minimum wage in 2015) and pegged to increases in average earnings thereafter.
This new position has been seen partly as a smokescreen for welfare cuts, yet its implications go far wider. Osborne’s unexpected radicalism echoed that of Churchill in holding employers to account for poor wages. Both did so at a time when disparities between “fat cat” bosses and poorly paid workers were under fire.
A crucial feature of both eras has been the recognition that a “free” labour market does not work as predicted – either by Adam Smith or by neoliberal economists. Churchill pointed to the absence of a “parity of bargaining” in many sectors, in a period before unions had grown strong. Today, following the decline of organised labour, the evidence suggests that employers are once again powerful enough to pay workers less than they could afford to. In both the UK and the United States, research has overwhelmingly concluded that state legislation to raise minimum wages has not cost jobs.
Given such evidence, Osborne’s gamble, and that of both the Labour and Conservative parties in promising higher minimum wages in this election, is that employers can afford to make a significantly bigger contribution to raising working families’ living standards through better pay, rather than the cost falling mainly on the taxpayer.
And it is a gamble because a minimum at the high levels being proposed is uncharted territory – so the possible effects on jobs are unknown. Up to now, each modest hike in the minimum wage has been followed by a careful assessment by the Low Pay Commission to check it is having no adverse impact. Unqualified political commitments to raise the minimum to a given level relegate this economic advice to the background.
Is a true living wage sustainable?
Election promises on the trajectory of the minimum wage look generous. But over the longer term, can voluntary and compulsory living wages remain true to their aim of covering minimum living costs while remaining affordable to employers? Here, the 20th-century precedents are mixed.
In the US, Franklin Roosevelt introduced the federal minimum wage and called it a “living wage”, but there was no basis for maintaining its level. Persistent decline in its real and relative value since the 1960s has made the federal minimum effectively a poverty wage. By contrast, in Australia, influenced by the landmark Harvester judgement of 1907, which ruled that wages needed to be sufficient to support “a human being in a civilised community”, the minimum wage system was more clearly established to support a living wage principle. Today, the national minimum wage in Australia is nearly double that of the US.
So embedding clear principles for setting a living wage can have long-term benefits. But the 21st-century context also brings its unique challenges, related to changes both in households and in work.
The composition of households and their working patterns have changed greatly. The traditional “male breadwinner” model fits only about one in four working families with children. Almost as many are supported by a lone parent, while just over half have two working parents, though for varying numbers of hours. This means that any one wage rate has very different results for the living standards of different families.
This does not make wages irrelevant, but neither can they fully replace help from the state in addressing working poverty. I have shown how wages and tax credits can work together to achieve this objective, rather than being seen as alternatives. Indeed, the calculation of a living wage capable of supporting adequate living standards needs to take account of how much help families get from tax credits and other subsidies.
The statutory National Living Wage does not directly meet this criterion. Its level will be pegged to average pay rather than to living costs and will not automatically rise if tax credits (or their successor, Universal Credit) fall. This could be justified for a compulsory wage rate by the need to provide a stable minimum which does not rise more quickly than the labour market can stand.
Changing world of work
The continuation of the voluntary living wage rate, calculated with reference to minimum living costs and adopted by employers committed to ethical pay, will provide a benchmark to which the statutory level can be compared. If the two rates start to diverge, the state could give greater non-wage help to families, again adjusting the balance between wages and top-ups.
But will wage rates even be relevant a few years from now, if we move towards a “gig economy” in which everyone is a self-employed free agent? The death of the employee may have been exaggerated: more than five workers in every six are still in employment. Yet governments would be foolish to address only pay. They must not neglect the terms on which workers are hired or their chance to access stable employment, which greatly affects their ability to translate a decent wage into an acceptable living standard.
On the surface, the political parties are committed to supporting workers’ rights in a changing labour market. The inclusion even in the Conservative manifesto of a section on workers’ rights is a sign of the times. It promises no specifics, but pledges to pay attention to the results of the forthcoming Taylor Review on this subject.
Just as Churchill’s Liberals in the early 20th century condemned sweatshops, so today’s politicians have this contemporary form of worker exploitation in their sights. We will have to wait and see whether this results in decisive action or only limited measures – like the ones in 1909 that failed to stem the decline of a reforming party and its replacement with more radical political forces.