Doofus Of The Day #1,009

Today’s award goes to the South Wales Police in formerly great Britain.

South Wales Police has been testing an automated facial recognition system since June 2017 and has used it in the real world at more than ten events. In the majority of cases, the system has produced more incorrect matches than correct identifications of a potential suspect or offender.

. . .

During the UEFA Champions League Final week in Wales last June, when the facial recognition cameras were used for the first time, there were 2,470 alerts of possible matches from the automated system. Of these, 2,297 turned out to be false positives and 173 were correctly identified – 93 per cent of matches were incorrect. A spokesperson for the force blamed the low quality of images in its database and the fact that it was the first time the system had been used.

. . .

“These figures show that not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool,” says Silkie Carlo, the director of rights group Big Brother Watch. The group is planning to launch a campaign against facial recognition tech in parliament later this month. “South Wales’ statistics show that the tech misidentifies innocent members of the public at a terrifying rate, whilst there are only a handful of occasions where it has supported a genuine policing purpose,” Carlo adds.

. . .

South Wales Police, in its privacy assessment of the technology, says it is a “significant advantage” that no “co-operation” is required from a person.

. . .

… Martin Evison[,] a forensic science professor who has researched police recognition techniques at Northumbria University [says] “If you get a false positive match, you automatically make a suspect of somebody that is perfectly innocent.”

There’s more at the link.

Only in cloud cuckoo land could a “success” rate of just 7% (173 correct matches out of 2,470 alerts) be regarded as acceptable! The police force argues that every match found by the system is investigated by an officer, who determines whether or not it’s accurate: but that still means an awful lot of innocent citizens are going to be pestered by police. What happened to the right to privacy, or, as it might be better termed, “the right to be left alone”?
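For readers who’d like to check the arithmetic themselves, here’s a minimal sketch in Python. It uses only the figures quoted above; the variable names are mine, not the force’s.

```python
# Figures reported for the UEFA Champions League Final deployment.
true_positives = 173     # alerts that correctly identified someone
false_positives = 2297   # alerts that flagged an innocent person
total_alerts = true_positives + false_positives  # 2,470 alerts in all

# Precision: of all alerts raised, what fraction were actually correct?
precision = true_positives / total_alerts

print(f"Correct matches: {precision:.1%} of alerts")        # ~7.0%
print(f"Incorrect matches: {1 - precision:.1%} of alerts")  # ~93.0%
```

In statistical terms, that 7% is the system’s precision; it says nothing about how many genuine suspects it missed entirely – a point taken up in the comments below.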

The Fourth Amendment forms the basis of a “right to privacy,” the right to be left alone, as Justice Louis Brandeis put it. The enjoyment of financial and personal privacy is fundamental to a free and civil society. True liberty is to be able to walk down the street, cash a check, buy goods, talk on the telephone, or take a trip without being hassled, hounded, followed, or interrogated by government agents. People should be able to get away from the madding crowds without being followed or asked stupid questions.

Again, more at the link.

Sadly, the South Wales Police are anything but alone in introducing facial recognition software. It may already be too late to implement civil rights safeguards against its misuse. Far too few people care about their privacy to be bothered with it. Those of us who do are dismissed as old fogeys, dinosaurs who’ve failed to keep up with the times. Frankly, given many of the times I encounter, I’ll gladly flaunt that label – and I’ll label as Doofi all those who are prepared to invade others’ privacy willy-nilly, and all those who are prepared to be treated like that!

Peter

9 comments

  1. Might be worth asking what the current 'solve rate' is on low-priority crimes like tagging, mugging or destruction of property – in my area, at least, the cops essentially ignore these reports because they're busy and success rates are very low unless the perpetrator is caught in the act. Similarly, 7% might well approximate the current 'suspect pool' for lots of other offences – it'd certainly be more specific than "Caucasian male, mid twenties, medium build, dark hair, wearing jeans and a Lakers jacket".

  2. The interesting question is: "How many false negatives were generated?"

    A 7% – about 1 in 14 – chance of a positive ID being accurate is pretty bad. But what are the chances of a real culprit being ignored by the system?

    If the pool of possible suspects is 100 people, bringing it down to 14 might help you concentrate your efforts – if you can be reasonably certain that the real suspect wasn't ignored.

    If not, why not put photos of all the possible suspects on the wall and throw darts – it seems just as likely to ID the right person.
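To put rough numbers on the commenter's trade-off, here's an illustrative sketch. Every figure in it (the pool of 100, the ~7% precision, and especially the assumed 90% recall) is hypothetical – the trial's real false negative rate was not published.

```python
# Illustrative only: pool size and precision come from the comment above;
# the 90% recall (chance the real suspect is flagged at all) is ASSUMED.
pool_size = 100      # possible suspects, one of whom is the real culprit
precision = 1 / 14   # ~7%: chance that any flagged person is the culprit
recall = 0.90        # assumed: chance the culprit gets flagged at all

# If the one real culprit is flagged with probability `recall`, and only
# ~7% of flags are correct, the expected number of flagged people is:
expected_flags = recall / precision   # ~12.6 people to follow up on

print(f"People flagged for follow-up: ~{expected_flags:.0f} of {pool_size}")
print(f"Chance the culprit was never flagged: {1 - recall:.0%}")
```

On those assumptions the system does narrow the field, roughly from 100 to 13 – but one time in ten the real culprit isn't in the narrowed pool at all, which is the commenter's point.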

  3. Three years ago I was involved in testing facial recognition for a major government documentation system. Even though the system had images from previously issued documents to match against, the match rate peaked at a mere 3%. I understand testing is still ongoing with no improvement. Make of that what you will.

  4. Ten years ago there was a flurry of articles about facial recognition software. Some people were 'viewing with alarm' while others were hailing the coming dawn. I thought they were all nuts.

    Sure, some day soon somebody is going to produce facial recognition software that they can claim is 90% accurate. And it will be – in good conditions, when run by the people who came up with the algorithm.

    Run in day-to-day conditions by half-trained GS-4s? It will generate so many false arrests, so fast, with the attendant lawsuits, that the government won't touch it again for twenty years, if ever.

  5. While I admit that the 7% success rate should make the system undeployable, how is facial recognition a violation of privacy? It is no different from the cop on the corner remembering your picture from a wanted poster.

  6. How about the police time wasted investigating the other 93% who were misidentified, when they should be investigating real criminals?

  7. Low success rates of around 7% are tolerated elsewhere; they just aren't always recognized. A couple of books I recently read discuss mammograms and the chance of cancer. When you factor in the frequency of breast cancer along with mammograms' rates of false positives and false negatives, the claim is that a positive mammogram only means a 10% or so chance of cancer. Yet we insist on mammograms, complaining loudly at suggestions they might not be needed as often.
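The commenter's mammogram figure is Bayes' theorem at work. Here's a hedged sketch using commonly cited textbook screening rates – assumptions for illustration, not numbers from the books mentioned.

```python
# Textbook-style screening rates, ASSUMED for illustration only.
prevalence = 0.01       # ~1% of women screened actually have cancer
sensitivity = 0.80      # 80% of cancers are correctly flagged
false_pos_rate = 0.09   # ~9% of healthy women are incorrectly flagged

# Bayes' theorem: P(cancer | positive test)
p_positive = (sensitivity * prevalence
              + false_pos_rate * (1 - prevalence))
p_cancer_given_positive = sensitivity * prevalence / p_positive

print(f"Chance a positive mammogram means cancer: "
      f"{p_cancer_given_positive:.0%}")
# ~8% – the same ballpark as the "10% or so" figure above
```

The rarity of the condition, not the quality of the test, is what drags the number down – the same base-rate effect that swamps a facial recognition system scanning a crowd of mostly innocent people.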

  8. They've been working on facial recognition programs for at least 20 years, with no improvement in the results.
