Tuesday, 17 April 2018
Technology ... Don't Run Me Over
Technology is a tool, but when it comes to problems, accidents or mistakes we always "blame the workman, never the tool" ... and god help us if we should get in the way.
When the Liverpool and Manchester Railway opened in 1830, William Huskisson MP, understandably unfamiliar with railway travel, stepped onto the tracks and was struck and fatally injured by Stephenson's Rocket. Commercial interests prevailed: the directors and engineers of the company were explicitly absolved of all blame, and the huge amount of publicity greatly increased public awareness of the railway's potential. The same story repeats throughout the history of technology development, and today it plays out with automated vehicles or driverless cars ... "Police chief said Uber victim “came from the shadows”—don’t believe it" and "Tesla issues strongest statement yet blaming driver for deadly crash".
Technology developers may have grand visions, but to get things done they need capital to build them, and capital is of course driven by business and commercial interests. Technological optimism and powerful commercial self-interest blinker technology development to side issues and unintended consequences, turning it into a steamroller that crushes anything in its path.
It's easy to be a tech fanboy and jump on the tech bandwagon, or at least go along for the ride. But criticise technology development, try to draw attention to the side issues and unintended consequences, or try to stop a technology, and you will feel the wrath of tech zealots and risk being run over by them ... it's heresy to go against determined technologists.
"Move fast and break things" goes the mantra in the gold rush to the next big thing, and standards are dropped to promote innovation and push technology forward. But it's only in hindsight that we see the wake of unintended consequences and side issues we never saw through the blinkers looking forward. "Lessons will be learned", we hear time and time again, and only then are the wider issues acknowledged, the safeguards and regulations applied and care taken ... because the technology is forced to.
Technology as a tool is one thing, but when a tool becomes a machine, is automated and can even "think" for itself, it can become dangerous. Within a factory we can be protected by putting machines in cages, but what about machines that are "in the wild" and among us?
Technologists are determined and want us to design our lives, work and planet around their technology ... the technology solution puts people in technology cages. In the fight for our public space, one narrative of "networked urbanism" envisages a city driven by data analytics and networks controlled in part by machines. In this "smart city", technological solutionism is rampant, with everything connected and automated. This is Googleville: a posthuman urban laboratory.
Technologists think in terms of systems and are, naturally enough, systems thinkers; they even see people as systems ... if not machines. The problem is that the real world and people aren't machines, and blinkered technology will always be blindsided by something it cannot predict, determine or even see ahead. Systems thinking is appropriate for the innards of tech, but when it comes to its application in the real world we need more inclusive, diverse and holistic thinking ... we need to put humans not just in the loop but at the centre of our design thinking if we want any chance of averting the major social and environmental problems technology could cause in the future ... move over, Stem: why the world needs humanities graduates.