In 2017, a study of employees revealed that one-third felt wearable devices would help them be more productive while reducing stress and assisting people with health issues. In the wake of Covid lockdowns and the rise of home-working, one wonders whether that survey would yield the same result. People working from home may feel suspicious of devices which, while no doubt helping workers keep track of their progress, could also be used by employers to monitor their activity.
Even in the pre-Covid 2017 study, 67 per cent of office workers feared that wearables might create an unnecessary surveillance culture.
Surveillance is, of course, essential in the struggle to contain illegal migration. Governments are right to investigate the potential of new technologies to help in this effort, in ways that are both humane and just. However, technology is often more of a blunt instrument than a surgical scalpel. There are, for example, proven biases inherent in some artificial intelligence algorithms. The simple reason is that they are programmed by human beings and "learn" by crunching data provided by human beings.
We all have cognitive biases, which our brains use to simplify the huge amount of information they process. Programmers may unconsciously build their own cognitive biases into their creations, or may supply training data sets that are tainted in the same way.
What's more, if AI is exposed to incomplete data sets - for example, with study samples that are not truly representative - it will inevitably operate on wrong assumptions.
The proposed use of smartwatch tracking is not, of course, the Home Office's first foray into sophisticated surveillance. It recently began deploying drones built in Portugal and collectively worth £1 billion to track the movements of small boats carrying migrants across the Channel.
Drones may be good at spotting small craft from afar, but how effective will their cameras be in helping us identify the actual traffickers? Unscrupulous criminals have often found ways of sidestepping technology. There are reports that boat owners often appoint a pilot from within the ranks of paying customers, to mislead the authorities. Drones or not, photographic technology may lead us no closer to singling out the actual profiteers.
The price we pay for using smart wearables and drones as civil surveillance tools may be greater than the benefits they bring, especially when we consider technology creep. Technologies introduced and sold to the public for one narrowly defined purpose may eventually be used in ways that the public is neither aware of nor willing to sanction.
CCTV cameras, introduced in cities like London in the 1990s, were not mounted to track parking infringements or match facial images against central databases. CCTV was ostensibly introduced to help reduce high rates of car theft. Yet it does so much more today, with very little by way of public debate on the matter.
Using drones for anything more than mapping the location and progress of migrant boats creates a possibility that the machines will later be used in domestic environments within the UK. Police attempts to use drones to identify people taking part in lawful protests have been met with howls of protest. Rightly so. The larger the amount of data authorities collect, the greater the potential for its use in ways the public might not approve of.
There is another, even more potent argument for limiting government reliance on surveillance technology. It has to do with the big-picture impacts of engaging with technology in ways that threaten our basic humanity.
The word techism, coined in 1994, describes a philosophy of industry that tries to humanise our engagement with technology. We could do with a surge of interest in techism today.