MP Philp backs calls for ‘Orwellian’ surveillance cameras

Orwellian: Big Brother Watch says that police use of facial recognition technology puts Britain on a par with surveillance states such as Putin’s Russia and Communist China

If it’s Thursday, it must be Facial Recognition day in Croydon town centre.

The Met Police deployed its Big Brother-style CCTV cameras on North End again last week – for the third Thursday in succession – and now Croydon South Tory MP Chris Philp wants to roll out the technology across the country to help track down people wanted for “priority offences” such as rape, violence, robbery, theft and drug-related crime.

But civil liberties groups have warned against the potential for misuse of the technology, and scientists who have studied the cameras’ work have found that at some settings, they can misidentify black people up to 11 times more often than white people.

But with Croydon shoppers used as unwitting guinea pigs in a massive social experiment, the camera trials have found that “they can catch criminals in just a fraction of the time it would otherwise take”, according to reports this week.

‘What do you mean, they are following me on secret cameras?’: Chris Philp is backing facial recognition

Philp is the policing minister, and he has urged other police forces to adopt the technology. He is citing the results in Croydon to back up the move.

Live facial-recognition cameras were deployed in Croydon for half a day amid the Christmas shoppers on December 7 and 14, catching 17 suspects in the space of a few hours (figures for last Thursday’s dragnet operation have not been released).

On December 14, the cameras identified 22 people who were on the Metropolitan Police’s wanted list. Of those, 10 people were arrested for offences including threats to kill, domestic abuse offences, theft, bank fraud and knife crime. One detainee was found to be carrying a crossbow.

Seven arrests were made as a result of the deployment on December 7, including a man suspected of rape and burglary, a suspected fraudster and a man wanted for grievous bodily harm.

The technology uses a CCTV feed from a police van to monitor people walking past; the feed is linked to facial recognition software. The police upload photos of wanted criminals, and the software sets off an alert when a biometric match is found. A police officer then reviews the match to confirm its accuracy.

The details of anyone who is not a match are immediately and automatically deleted, according to the police.
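The reported workflow – score each passer-by against the watchlist, raise an alert only above a confidence threshold, and discard everything else – can be sketched in a few lines of Python. The names, scores and threshold value below are illustrative assumptions, not details of the Met’s actual system or its software:

```python
# A minimal sketch of the match-and-delete workflow described above.
# Watchlist entries and similarity scores are invented for illustration;
# in a real system the scores come from the face recognition model.
WATCHLIST_SCORES = {"suspect_a": 0.92, "suspect_b": 0.41}

def check_passerby(scores, threshold):
    """Return the best watchlist match at or above the threshold, else None.

    An alert fires only on a biometric match, which an officer then
    reviews; a non-match means the passer-by's details are discarded.
    """
    best_name, best_score = max(scores.items(), key=lambda kv: kv[1])
    if best_score >= threshold:
        return best_name  # alert raised: an officer reviews before any arrest
    return None           # no match: image deleted, nothing retained

print(check_passerby(WATCHLIST_SCORES, 0.6))   # suspect_a
print(check_passerby(WATCHLIST_SCORES, 0.95))  # None
```

The threshold is the tuning knob at issue later in this article: set it high and matches are rare but confident; set it low and the system flags more people, which is where the reported bias appears.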

Before deploying facial recognition cameras, police forces must notify the local community via social media, leaflets and clear signage in the places where the cameras will be used – as they did again last week, including the usual tweet from the Croydon police account.

Formal notice: the police have to give a warning that they are using the cameras

Police chiefs have said the technology will cut the amount of time spent trying to identify an offender from days and months to “just minutes”.

Civil liberty groups have likened the technology to “Orwellian surveillance” and warned that it was racially biased and inaccurate.

Minister Philp said the latest tests had addressed those concerns. “This technology enables wanted criminals who would otherwise go free to get caught,” Philp told The Times.

“This has a huge potential to be scaled up to catch thousands of wanted dangerous criminals. Anyone not on the watchlist has their image deleted immediately so there are no legitimate concerns about privacy.

“This technology has been tested by the National Physical Laboratory to ensure accuracy and no bias,” Philp said.

But not for the first time, what Philp says is not exactly true.

“If the system is run at low and easy thresholds, the system starts showing a bias against black males and females combined,” Dr Tony Mansfield, the author of the National Physical Laboratory report, told MPs on a Commons committee earlier this year.

Dire warnings: civil liberties groups say the use of facial recognition is Orwellian

Dr Mansfield’s study found that the Neoface technology used by the Met and South Wales police was highly accurate and unbiased if the algorithm used to trigger a match was set at a sufficiently high face-match confidence score.

But the report found that racial bias occurs when the software is set at lower thresholds. At some settings it is 11 times more likely to misidentify black women than white women, for example.

Big Brother Watch, the civil liberties campaign group, said the routine use of live facial recognition would put Britain “more in step with the likes of China and Russia” than other European democracies.

“Live facial-recognition turns the streets of Britain into AI-powered police line-ups, with innocent members of the public being subject to biometric identity checks as they go about their business,” Big Brother Watch’s Madeleine Stone said.

“Everyone wants dangerous criminals off the street, but papering over the cracks of a creaking policing system with intrusive and Orwellian surveillance technology is not the solution.

“Police have written their own extraordinarily permissive guidance on how this dystopian technology can be used, allowing victims and witnesses of crimes to be placed on watchlists, as well as peaceful protesters and people with mental health conditions.

“The UK’s reckless approach to face surveillance makes us a total outlier in the democratic world… This Orwellian tech has no place in Britain and must be banned.”



  • If you have a news story about life in or around Croydon, or want to publicise your residents’ association or business, or if you have a local event to promote, please email us with full details at inside.croydon@btinternet.com
  • As featured on Google News Showcase
  • Our comments section on every report provides all readers with an immediate “right of reply” on all our content
  • ROTTEN BOROUGH AWARDS: Croydon was named among the country’s rottenest boroughs for a SIXTH successive year in 2022 in the annual round-up of civic cock-ups in Private Eye magazine

About insidecroydon

News, views and analysis about the people of Croydon, their lives and political times in the diverse and most-populated borough in London. Based in Croydon and edited by Steven Downes. To contact us, please email inside.croydon@btinternet.com
This entry was posted in Business, Chris Philp MP, Crime, Croydon South, Policing.

7 Responses to MP Philp backs calls for ‘Orwellian’ surveillance cameras

  1. JohnG says:

    At some settings on all mobile phones, some people will be unable to hear calls – which shows a bias against the elderly. One setting discriminates against all people: when the phone is switched off or the battery is flat. This is surely discriminatory, but I believe it is no reason not to use mobile phone technology.

  2. Gary says:

    What’s it really for? Numerous anti-social and criminal gangs frequently roam Croydon, all wearing hoodies and balaclavas – not much use there! I think there may be a little more to this, and it seems to have very little to do with crime.

  3. Andrew Pelling says:

    Croydon has already seen the abuse of technology that Labour got involved in when it associated itself with the hacking of Inside Croydon.

    The Conservatives will regret giving such Big Brother powers to a possible Labour government, which could use this technology to quickly arrest those it felt had expressed dissent from Labour views. Labour’s reasoning would be to detain in the interest of public order.

    Labour have a nasty and dangerous authoritarian streak.

  4. Annabel Smith says:

    Certain more right-wing members of my family say “if you’re not guilty of a crime, why would you object to police surveillance?”, but with the concrete stats in this article on the error rates of this tech, we can easily shut down such simplistic arguments. Unless the police can guarantee that the level of sensitivity being used in FRC is above the threshold that leads to bias, then this is a racist technology and should not be used!

    • Carl Lucas says:

      I don’t see this as a left or right argument, because I see many people who identify on the left as supporting this technology. It’s more a liberal v authoritarian argument. Mind not to get the two confused, though – on many levels there’s not a lot of difference between communism and fascism.

  5. Philp is very vocal on using technology to keep us plebs in line. He’s silent on using technology to recover WhatsApp messages deleted by Boris Johnson and Rishi Sunak.

  6. Jack Griffin says:

    “But civil liberties groups have warned against the potential for misuse of the technology…”

    You name it – Live Facial Recognition, ULEZ, LEZ, traffic light cameras, speeding cameras, any ANPR, all CCTV etc etc – it has potential for misuse.

    Yet most people indulge in exceptionalism and focus only on the immediate context: in favour of ULEZ (as currently applied/enforced): cameras good; against racist Live Facial Recognition technology: cameras bad.

    There’s a commentator on the RingGo post concerned about the use/retention of their location data, yet seemingly oblivious to the fact that, in Greater London at least, the state potentially snaps their car every 800 metres (even if it does nothing with that – for now).

    However incel, reddit, QAnon batshit bonkers it sounds, sadly, if you’re in favour of one sort of camera for one thing, you’re effectively in favour of all of them as we don’t get to choose when and why the camera that currently works for us is turned upon us.
