The phrase ‘computer says no’ now has its own Wikipedia page. The first recorded use dates back to a Stasi-era 1970s East German film segment titled ‘Der Computer Sagt: Nein’. However, its idiomatic use arose in 2004 via a series of sketches in Little Britain, each illustrating an example of technology-enabled bureaucratic intransigence, typically flying in the face of common-sense human judgment. It is perhaps the 21st-century equivalent of ‘jobsworth’.
To behavioural scientists, the phrase illustrates something known as ‘defensive decision-making’, whereby the primary motivation for a decision is not the likely quality of the outcome but the decision-maker’s often unconscious urge to use any available means to offload accountability for his actions. Unlike many behavioural science findings, this is not a trivial bias: many serious commentators believe that defensive decision-making, with its massive expansion of legalistic rules into every realm of human activity, lies behind the loss of economic dynamism in western economies. As I remarked to a KC friend: ‘The trouble with Britain is that we were a country of really good pirates who then sent their children to law school.’
Defensive decision-making was evident when the appointment of Peter Mandelson degenerated into a debate over ‘process’. The use of the word ‘process’ is generally a bellwether for defensive decision-making. Following ‘due process’ is hardly proof against making dumb decisions: it simply means the people involved can be absolved of blame. The other problem with ‘process’ is that once such processes are complete, reversing the outcome becomes almost impossible, even when countervailing information emerges.
But the fact that ‘computer says no’ is so widely used should perhaps serve as a warning to us all. For it shows how, in the hands of what Lord Glasman calls ‘the lanyard class’ (who are by no means confined to the public sector), any technology has the power to lend a dystopian quality to everyday life by extending legal formalism into facets of it where it has no place, at the expense of common sense and intuitive judgment. It leads to what I call ‘the airport effect’: that constant low-level anxiety you experience at airports from knowing that any deviation from imposed rules, from an overweight cabin bag to a misheard announcement, can spiral into a nightmare. The larger and more dehumanised the airport, the worse the anxiety.
But there is a reason for airports to act defensively. The problems emerge when technology allows this rule-making tendency to extend everywhere else, with a stringency and intransigence we would not accept from a normal human actor.
Few traffic policemen would fine a taxi driver £100 for driving at 25mph on a major road at 2 a.m. A speed camera is allowed to do this. The net result is that the poorest in society are being driven off the roads not only by the costs of car ownership but by the heightened risk of fines. Given that owning a car might boost your chances of getting a job more than having a degree, it seems astonishing that a government supposedly preoccupied with social justice is blind to this.
One interesting observation: people are less irked by average-speed cameras than by the fixed kind, because averaging is more consistent with our instinctive idea of fairness. Within set parameters, you can exercise discretionary judgment about your speed, briefly breaking the limit to change lanes, for instance, provided your overall driving speed is safe. This illustrates the difference between a wide-context and a narrow-context rule. Perhaps there is something we can learn from this before we surrender control to our AI robot overlords.