I was recently shown an AI analysis of long-term trends in the public’s attitude to government. The AI had been designed to look at changing attitudes to brands, but its creator had been curious to see what it revealed if the brand in question was The State. It was remarkably insightful.
‘New behavioural data reveals a structural shift: The State has moved from episodic authority to ambient friction.’ And ‘modern governments are, on the whole, stable – but increasingly so for structural rather than relational reasons. Stability is maintained through procedure… and compulsion, not persuasion… Authority now operates in an environment that never turns off – and never waits’.
You might call this the speed-camera approach to regulation: continuous, pervasive and unresponsive. In the old days, you could drive into London with the confident expectation that, provided you avoided committing some egregious offence, you could avoid any contact with the plod. Today, driving involves permanent low-level anxiety: you must be constantly alert lest you unwittingly commit some small infraction and are fined with no allowance for context or extenuating circumstances. Challenging or reversing a bad decision is almost impossible. I have even invented a new scientific unit – the Letby – which measures the difficulty of reversing any questionable decision: by this measure, challenging a parking fine perhaps scores 50 milliletbys.
The trend arises as a product of the dangerous feedback loop between technology and bureaucracy. The cost-cutting promise of technology can only be delivered if decision-making is codified and made formulaic. A bureaucracy establishes power and defends its legitimacy by pretending that all decisions can be made through adherence to context-blind, formal procedures of its own devising. Tech and bureaucracy therefore feed each other. This results in a world where compliance and consistency are prioritised over quality of outcome.
Just in case I sound a bit paranoid, I should explain that, even in my most tinfoil-hatted moments, I do not think we are ruled by lizard people. The distinction I would make is Alfred Hitchcock’s: ‘I was misquoted. I didn’t say actors were like cattle. I said they should be treated like cattle.’
What I have learned from the work of Dan Davies is that any procedural, formalised mode of decision-making is effectively an alien intelligence and should be seen as such. As Dan explains, artificial intelligence is not new, since any bureaucracy is a form of artificial intelligence. So treating any bureaucracy like an alien space lizard is not wholly ridiculous.
In any institutional setting, otherwise sane human beings are capable of behaviour which is internally consistent but which produces an inhuman outcome that defies common sense. Good intentions mean nothing. In the words of Dan’s beloved Stafford Beer: ‘The purpose of a system is what it does.’
If legalistic behaviour is an artificial intelligence, a jury is, at the simplest level, a second line of defence or, to use the language of AI, a ‘kill switch’. And a jury possesses two great strengths in decision-making which are denied to officialdom. First, it is wholly unaffected by the consequences of any decision (a controversial decision does not affect your ‘career’ as a juror). Second, and most important of all, a jury is not required in any way to explain its reasoning. At first glance, this seems like a weakness: when you think about it more deeply, it is invaluable. The first requirement of any good kill switch is that you shouldn’t have to explain why you pressed the kill switch.
In a world where decision-making is increasingly automated, we should perhaps be adding juries to the system, not removing them.