It appears that London is ready to roll out the gift of surveillance to its citizens this holiday season. The Met has announced that several areas around central London will be part of a field trial to be conducted today and tomorrow, despite previous reports that the tech is “staggeringly inaccurate.”
In September 2017 I reported on results from a previous test of public facial recognition conducted in Britain’s Notting Hill area. The results were horrendous. My emphasis added:
The controversial trial of facial recognition equipment at Notting Hill Carnival resulted in roughly 35 false matches and an ‘erroneous arrest’ …
The system only produced a single accurate match during the course of Carnival, but the individual had already been processed by the justice system between the time police compiled the suspect database and deployed it.
That test was “controversial” because it was apparently conducted without first notifying the public. Moreover, the Sky News investigation cited above revealed that the government had already built a secret database (one that includes innocent people) that was simply waiting for the implementation of facial recognition systems.
To remedy the secrecy that surrounded the earlier program, the Met has at least tried to roll out this new trial in the full light of public scrutiny. A report by the BBC indicates that not only will people be notified, but those who decline to participate will not be treated as suspicious:
The Met said “clear uniformed” officers would carry out the trials and distribute information leaflets to the public for about eight hours on both days.
It added anyone who declined to be scanned during the deployment would “not be viewed as suspicious”.
Nevertheless, the UK’s main watchdog on this issue, Big Brother Watch, is not comforted even by the supposed openness of the trial. Rather than seeing improvements to a system that previously failed in dramatic fashion, the group says it has only gone from bad to worse:
The campaign group obtained statistics using Freedom of Information requests in May that exposed the Met’s facial recognition ‘matches’ as 98% inaccurate.
But new police figures obtained by Big Brother Watch reveal that 100% of the Met’s facial recognition matches since May have incorrectly matched innocent members of the public with people on police watch lists.
Their photos are stored on police databases for one month nonetheless.
(Source: Big Brother Watch)
After a series of these technical mishaps, as well as the exposure of previous secrecy surrounding deployment, Big Brother Watch has launched a “crowdfunded legal challenge against the Metropolitan Police and the Home Secretary’s use of facial recognition surveillance.”
It’s somewhat fortunate that the UK public is learning about this technology at such a formative stage, while it can still be resisted. If it is not stopped now, it is guaranteed to get worse as other nations roll out social credit systems, pre-crime algorithms, and a host of other applications for this dubious technology, which poses a far bigger threat to freedom than the criminals and terrorists it purports to protect against.
Image credit: Kaspersky Lab
Nicholas West writes for Activist Post. Support us at Patreon for as little as $1 per month. Follow us on Minds, Steemit, SoMee, BitChute, Facebook and Twitter. Ready for solutions? Subscribe to our premium newsletter Counter Markets.