Surveillance for smarter cities – or an Orwellian nightmare?
If you could, would you star in your own reality TV show? What if it turned out that you already are? CCTV is marketed as a protective measure, installed to keep you safe in the event of a crime, or to prevent break-ins and theft. Well, maybe, except researchers haven’t found evidence to suggest that cameras actually prevent crime.
In the UK, 96% of surveillance cameras are privately owned. In-home security systems; businesses monitoring who keeps putting empty crates in their bins; a live feed of newborn kittens — cameras are everywhere. Self-declared surveillance is ominous enough, but are you also showing up on other people’s home security footage? Scarier still, are you being watched by your own technology?
Shodan, a search engine designed to map and gather information about internet-connected devices and systems, makes it easy to watch a constant live stream of private cameras. The IoT search engine enables what has been referred to as “idle surveillance”. The problem is that, as with CCTV used for policing, there is no official ruling on how the footage can and can’t be used.
Does digital surveillance prevent crime?
In Memphis, taxpayers have paid US$10 million over the past decade for SkyCop, a city-wide CCTV network of 2,100 cameras broadcasting to police command centers. Crime rates in Memphis have gone up. In fact, per the Washington Post, when police released footage of the murder of Tyre Nichols, much of it had been filmed by SkyCop.
Andrew Guthrie Ferguson, a professor at American University Washington College of Law, points out that “surveillance doesn’t prevent crime, even police crime.” Spending cuts in both America and the UK have pushed industries to turn to technology to cut costs or mitigate staffing shortages. G4S’s Digital Guard is one such ‘solution’.
According to the company website, Digital Guard “screens for pre-defined situations which could indicate someone breaking the law or violating house rules. This could, for example, include shouting, loitering, glass being broken or an ATM shutter being tampered with.”
Supposedly, CCTV cameras are placed in high-risk, busy, or secluded public spaces such as town centers, train stations, shops, and car parks. As reported by IFSEC Global, in 2020 there were around 5.2 million CCTV cameras in operation in the UK — that’s the equivalent of one camera for every 13 people.
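The “one camera for every 13 people” figure follows from the camera count and the UK’s population. A rough sanity check, assuming a 2020 UK population of about 67 million (a figure not given in the article):

```python
# Rough check of the "one CCTV camera per 13 people" claim.
# Assumption: UK population in 2020 was about 67 million.
cameras = 5_200_000
population = 67_000_000

people_per_camera = population / cameras
print(round(people_per_camera))  # prints 13
```

The ratio comes out at roughly 12.9, which rounds to the reported one camera per 13 people.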
Police-owned cameras in America could count as unreasonable search: US Circuit Judge O. Rogeriee Thompson asked, “Are we just going to put these cameras in front of everybody’s house and monitor them and see if anybody’s up to anything?” But what of the cameras that aren’t state-owned?
It comes down to an issue of transparency. Contestants on reality TV shows like Big Brother know that their every move may be watched. It’s the same with policing: tools like SkyCop do little to prevent crime, but they generate endless footage of unwitting civilians going about their lives.
In 2020 Mississippi police conducted a 45-day pilot to livestream the security cameras of participating residents, including Amazon Ring doorbell cameras. While people buy Ring cameras and put them on their front doors to keep their packages safe, police use them to build comprehensive CCTV networks blanketing whole neighborhoods.
So-called ‘Smart Cities’ would use digital technology to improve quality of life. Theoretically, as with Digital Guard, this would include security and policing. Yet not only is there little evidence of a positive effect; technology employed in policing tends to have an actively negative impact.
In 2022 the UK Government rejected a House of Lords report calling for the introduction of regulations and mandatory training to counter the negative impact that the current deployment of surveillance technologies has on human rights and the rule of law.
The London Metropolitan Police use Live Facial Recognition (LFR) technology across London, scanning faces in real time against photos from the Police National Database (PND). The photos on the PND are taken on arrest and stored as images of ‘digital suspects’, even if the subject was later cleared.
On 28 January 2022, one day after the UK Government relaxed mask-wearing requirements, the Met deployed LFR with a watchlist of 9,756 people. Four people were arrested, including one who was misidentified and another who was flagged on outdated information.
We aren’t going as far as some conspiracists might – pigeons probably aren’t government drones tracking our every move – but as AI makes recognition software increasingly capable, how will we be kept aware of our images being stored? If Smart Cities are to incorporate this type of surveillance, how can the population meaningfully consent?
20 March 2023