Imagine being allowed to take as much paid holiday from work as you liked. All the time in the world, or at least as much as your guilty conscience will allow. A friend has just been offered this juicy-sounding perk by her company, and we mused over just how far it was reasonable to push it. A whole August off sounds tempting, but maybe it would be smarter to spin things out over a succession of long weekends. Or even to keep it as a get-out-of-jail-free card, deployed in case of burnout or rainy Mondays when you just can’t face getting out of bed.

Except none of that will happen, of course. My conscientious friend won’t take a day over what she took before, if that; and the same is true for most people at companies where unlimited holiday has been pioneered (Netflix and Virgin were early adopters). If anything, people often end up taking less time off, not more. Nobody wants to be singled out as the office slacker, so people try to take roughly what everyone else in their team seems to be taking — only the average gets dragged down, by people hungry for promotion, or lacking anyone to cover for them, or otherwise unable to drag themselves away.

In other words, peer pressure does exactly the same job that strict holiday policies used to do, except this way everyone feels slightly better about it. Sometimes, just the knowledge that they could skive if they wanted is enough to stop people wanting to skive at all. It’s nice to feel trusted, treated like a grownup.

And that, increasingly, is the dividing line in modern workplaces: trust versus the lack of it; autonomy versus micro-management; being treated like a human being or programmed like a machine. Human jobs give the people who do them chances to exercise their own judgment, even if it’s only deciding what radio station to have on in the background, or setting their own pace. Machine jobs offer at best a petty, box-ticking mentality with no scope for individual discretion, and at worst the ever-present threat of being tracked, timed and stalked by technology — a practice reaching its nadir among gig economy platforms controlling a resentful army of supposedly self-employed workers.

James Bloodworth’s new book, Hired: Six Months Undercover In Low-Wage Britain, about his time working for Amazon, Uber and a series of zero-hours employers, vividly describes the pressure heaped on warehouse staff by systems that electronically monitor the speed of their picking and packing. Managers would berate anyone seen as slow. There was barely time to bolt down lunch, let alone scope for having a bad day or for getting older, or for any of the other variables to which actual humans are prone.

The mark of human jobs is an increasing understanding that you don’t have to know where your employees are and what they’re doing every second of the day to ensure the work gets done; that people can be just as productive, say, working from home, or switching their hours around so that they are working in the evening. Machine jobs offer all the insecurity of working for yourself without any of the freedom.

There have always been crummy jobs, and badly paid ones. Not everyone gets to follow their dream or discover a vocation — and for some people, work will only ever be a means of paying the rent. But the saving grace of crummy jobs was often that there was at least some leeway for goofing around; for taking a fag break, gossiping with your equally bored workmates, or chatting a bit longer than necessary to lonely customers.

In machine jobs, those moments that used to make life bearable are being managed out in favour of ruthless efficiency. Sneaky Facebook breaks are a thing of the past in those offices where computer keystrokes are automatically monitored to check people are continuously working. If an Uber driver turns down a job, the app will know.

The other great dividing line between human and machine jobs, meanwhile, is a more subtle one — and that’s whether or not the opportunity exists to exercise judgment. The blizzard of targets, tests, new statutory duties and demands for data imposed on public sector workers over the last two decades has had its upsides, helping to reduce the scope for human error and prejudice, and drive changes in schools or hospitals that people wanted to see.

But these measures are also horribly blunt instruments, applied to good nurses or teachers as well as bad, and if taken too far they leave the good ones feeling insulted and deskilled. What’s the point of having decades of experience if you’re no longer allowed to use it to decide how best to help this individual child or patient? Doesn’t professional judgment count for anything any more? You might as well get a robot to do it.

And the wider social consequences of all this are worrying. For work isn’t just work, a set of daily tasks to grind through. It’s a form of human relationship, something we do with and for each other that helps reinforce ideas of mutual responsibility and belonging in wider society — or it does, so long as people feel their efforts and experiences are appreciated, that their employer actually cares about how they feel, that they are not just another cog in the machine.

The debate about whether robots will soon be coming for everyone’s jobs is real. But it shouldn’t blind us to the risk right under our noses: not so much of people being automated out of jobs, as automated while still in them.

— Guardian News & Media Ltd

Gaby Hinsliff is a Guardian columnist.