What happens when facial recognition gets it wrong – Week in security with Tony Anscombe


A woman in London has been misidentified as a shoplifter by a facial recognition system amid fresh concerns over the technology’s accuracy and reliability

A woman from London has been wrongly accused of shoplifting after being flagged by a facial-recognition system, the BBC reports. The technology, called Facewatch, is used by a number of retailers across the United Kingdom, including the Home Bargains store where the woman was misidentified.

Privacy, legal and other issues and risks have plagued facial recognition for years, to the point that San Francisco, Boston, Portland and other cities in the United States eventually banned the use of facial recognition software by police and municipal agencies. While some cities are now making a U-turn on the bans in response to rising crime rates, a number of questions surrounding the technology persist. Among them: how should false positives be handled, and can they be prevented in the first place?

Learn more about this controversial subject in Tony’s video.

Connect with us on Facebook, Twitter, LinkedIn and Instagram.
