Big Brother is watching—the impact of Facial Recognition Technology

Big Brother is watching you—no, seriously. Facial Recognition Technology is no longer something dreamed up for sci-fi films but part of our daily reality. Whether you are doing your weekly shop at the supermarket, catching a flight at the airport, or simply unlocking your iPhone, Facial Recognition Technology is at play—often without us realising it.

With Facial Recognition Technology becoming increasingly mainstream—deployed in offices to avoid sign-in queues, for example—and with the complications of introducing it in ‘large-scale quasi or hybrid’ public spaces (for more on this issue see: LexisPSL: Private eyes—facial recognition technology in quasi-public spaces), it is sure to be an interesting ride.

Have you ever stopped to think how many times your face has been captured by Facial Recognition Technology?

As reported by the Guardian, the UK is ‘the most spied upon country in the world’, with the Telegraph concluding that a Londoner’s image is captured at least 300 times a day by CCTV. Like it or not, this tech has already taken hold and has been integrated into sectors such as policing and even health care.

In September this year, plans to develop a public facial recognition database sparked intense controversy. Further reports surfaced detailing that Google was targeting people based on their skin colour. Clearly there are many fears, challenges and barriers to be overcome—many of which we are still unaware of due to the technology’s infancy.

Is Facial Recognition Technology lawful?

September also saw a landmark challenge over the police use of Facial Recognition Technology to search for people in crowds. In the world’s first legal case of its kind, the High Court found the technology was being used lawfully.

The case follows the Metropolitan Police Service and South Wales Police ‘Live Facial Recognition’ trial, which uses this technology to identify wanted criminals.

The trial and the related legal challenges have thrown up many concerns, such as:

  • potential inaccuracies with individuals’ profiles built by the technology
  • questions around accountability
  • inaccuracies in the technology’s performance for women and BAME subjects

About the author:

Hannah is one of the Future of Law blog’s digital and technical editors. She graduated from Northumbria University with a degree in History and Politics and previously freelanced for News UK, before working as a senior news editor for LexisNexis.