By Robert Hickson 07/04/2021


Here are a few short, interesting developments or discussions I’ve seen recently, loosely bundled together under a theme of “values.”

Irregular labour

Is the private sector the best provider and facilitator of “gig work”? That’s challenged in a New Yorker profile of Wingham Rowan, an English social entrepreneur. For many years he has been trying to get governments to develop a platform to help people find gig work (or “irregular labour”).

Rowan views gig work as being better run as a public utility rather than as a private company. This is because the utility model would focus on finding work for people that matches their skills across a range of employers, and in ways that are more likely to provide them with good working conditions and liveable incomes.

Current private sector gig work, typified by Uber, DoorDash and similar companies, is largely about people getting relatively unskilled work, with no benefits, in one particular task.

Persuading governments (national and local) has proven difficult, for political, ideological and technical reasons. Governments focus on full-time work, and they may lack (or believe they lack) the technical skills to develop good online platforms. Rowan thinks that governments can improve employment if they focus more on improving the nature and benefits of gig work.

This has elements of the “sharing economy,” but is broader and larger in scale.

The article notes some recent promising developments by some local governments in the US and UK.

The pandemic is in part helping that shift, by challenging the notion that the private sector is always superior (a myth long challenged by Mariana Mazzucato). There are societal changes too, with people’s preferences for when and how they work changing.

Traditional full-time employment may not be the norm in the future, so government perceptions need to change.

“Twenty-somethings talk about having a job like having a fax machine.” Wingham Rowan

 

No Robo-pocalypse?

A recently published paper in the Journal of Information Technology suggests that the “artificial intelligence” story has been hijacked by hype, becoming “too good to be false.” What is often called artificial intelligence today, the authors suggest, is just “statistics on data steroids.”

What we are seeing at the moment in terms of “AI” is physical robots, robotic process automation and cognitive automation (such as image identification).

Several influential studies have predicted that automation leads to job displacement, assuming that there will be little job creation, that organisations and societies will be poorly prepared for the rapid spread of automation, and that human capabilities have little role to play in the future of work.

The paper links to recent research that challenges these predictions and assumptions, and forecasts lower impacts on job losses. It suggests that too little consideration has been given to human qualities that are not easily automated or replaced, especially in combination, and that are likely to remain vital at work.

The paper concludes that a “Robo-pocalypse” may happen not because of AI, but because we fail to deal with the massive shift in skills required over the next one or two decades.

The Economist has also just highlighted that earlier predictions of automation-induced job losses haven’t yet come to pass.

Fragility

Matt Stoller notes that the now unstuck Ever Given cargo ship has highlighted not just the risks of concentration of global trade supply chains, but the problem of too much focus on efficiency in many areas.

“What is new isn’t the vulnerability of the Suez Canal as a chokepoint, it’s that we’ve intentionally created lots of other artificial chokepoints.” Matt Stoller

As Andrew Curry points out, two of Stoller’s suggestions for re-designing globalisation seem too simplistic, because they undercut things that do make globalisation function.

“Just in time” efficiency works until it doesn’t, exposing the importance of building in robustness (or that nebulous term “resilience”), or what Nassim Taleb calls being “antifragile.”

The pandemic refocused attention on some of New Zealand’s already recognised fragilities – being at the end of long, complex trade routes, and the heavy reliance on international tourists, workers, and students.

At the end of his piece Stoller asked for other examples of disasters-in-waiting. One suggestion was a looming shortage of phosphate, largely dependent on supplies from the Western Sahara (and that’s a discussion NZ has had before).

Another suggestion was the risk of a large cargo ship sinking at the entrance to a critical harbour like Rotterdam. That would take much more than a week to solve, and would disrupt the port and its land-based transport networks (not to mention the environmental impacts).

Addressing these fragilities isn’t quick or simple, and some are issues over which NZ has little control. But that doesn’t mean we shouldn’t be putting energy and effort into tackling them.

Fixes vs Values

In a long essay on sustainability in the Boston Review, Duncan Kelly suggests that a focus on technological fixes, constitutional amendments, and some political arrangements (like forms of constitutional dictatorship) can keep us in the status quo rather than opening up new futures. This is because, he suggests, they pre-commit us to contemporary values (though not all contemporary values should be discarded).

This risk of “colonising the future” is being recognised in discussions about futures thinking and foresight.

Quite a lot of futures thinking can be about reviving older models of growth, economics and governance and reframing them in new forms.

So, ideas like Universal Basic Income can be seen as just a tweak to existing social welfare systems (though with fewer strings attached).

In contrast, Rowan’s ideas about new public utilities, which connect broader questions of value to political and economic thinking, can open up new avenues that benefit individuals, communities and the state.

 

Featured photo by Nathan Powers on Unsplash