To begin the conversation, two experts in the field stare into the digital crystal ball and consider what the future holds for the ethical implications of facial recognition software. Matthew Hughes and Renate Samson bring a collective wealth of experience in technology, privacy, and security to the question.
Matthew is a reporter at TNW (The Next Web), with previous experience working in IT in the UK and Switzerland, while Renate is the chief executive of Big Brother Watch, a group founded to expose the extent of mass surveillance in the UK.
Matthew Hughes, reporter at TNW (The Next Web)
Hear that loud clopping noise? That’s the sound of a million horses bolting the stable. Yes, I’ve been asked to write a 300-word opinion piece about the ethical implications of facial recognition technology. But at this late stage, is there any point? To use another hackneyed cliché, that ship sailed long ago.
The more interesting question, I’d argue, is why. Regardless of whether or not they’re good for society, face-scanning robots are here, and like bedbugs in a Blackpool hotel room, they’re here to stay.
If you want to know the cause (or at least part of it), I’d suggest you look no further than Cupertino’s latest flagship handset – the iPhone X. Housed in the unsightly bump at the top of the screen is a sophisticated array of sensors. These can memorise the form and shape of your face, noting every crack and crevice, allowing users to log in without a password.
Faintly dystopian, ain’t it? I mean, tech companies know a lot of intimate details about us. Facebook and Google know more about us than our parents. But am I the only one who feels like this is a step too far? Apple’s technology doesn’t ask who we are, but rather what we are.
And it seems likely the iPhone X will be a rousing success. Analysts have attributed sluggish sales of the iPhone 8 and iPhone 8 Plus to people waiting for the release of the iPhone X, which goes on sale on 3 November 2017. People don’t care that the facial recognition technology is creepy as f**k.
To Apple’s credit, it seems to have thought hard about how best to protect the facial recognition data it keeps. Fair play. However, there’s a secondary issue, in that the iPhone X essentially normalises this technology.
It’s a given that we can expect to see similar features in phones from other manufacturers. Who knows if they’ll do a similarly sterling job when it comes to security? More importantly, who cares? If the price is right, punters will lap them up. Over time, facial recognition will become a standard feature in phones, much like cameras and 4G are today.
It’s also worth noting that there’s another side to the coin. Facial recognition is creeping into all parts of everyday life – visible and invisible. While the iPhone X normalises face-based authentication within the context of a consumer technology product, it’s also found a home in law and immigration-enforcement.
US Customs and Border Protection, for example, is experimenting with replacing boarding passes with face scans on select flights between the US and the tiny Caribbean island of Aruba. Instead of holding on to a flimsy piece of paper, you’ll stand in front of a camera, which will check your photo against government records.
Earlier this year, London’s Metropolitan Police controversially used facial recognition software at Notting Hill Carnival. This was largely seen as a failure, with 35 false matches, an erroneous arrest, and only one correct match.
One of the most ambitious deployments comes from Moscow, where facial recognition is gradually being introduced to the city’s network of 160,000 CCTV cameras.
The big difference here is that while one can choose whether to purchase an iPhone X, you can’t really opt out when the cameras are at your departure gate, or hanging above your street.
Renate Samson, chief executive of Big Brother Watch
Apple’s Face ID is seen as the next stage in device security. By letting your new iPhone scan your face, just as it scanned your fingerprint before, Apple promise to enhance your privacy and security.
But as with so much of the digital world, the thing that can keep us safe is also the thing that can leave us vulnerable.
You may have heard about facial recognition being used by the police at Notting Hill Carnival, at the Champions League Final, and at Download Festival. Facial recognition is a new bit of surveillance kit the police are testing to see if they can either pick people out of a crowd, or identify them from photo databases.
Both Apple and the UK police are using technologies that rely on facial biometrics: a mathematical code, unique to every person, generated from their facial characteristics.
Apple use the biometric to ensure only you can open your phone. Your face is your security and the key to your privacy.
Under the control of the police, however, your unique biometric makes you a target for surveillance, a target for analysis, and creates an opportunity to find you at any time and in any place.
Whilst you currently have a choice as to whether Apple make a biometric of your face, you may already have lost that choice when it comes to the police.
If you have ever been arrested and had a custody image taken, there is a large chance that your facial biometric has been created and is held by the police, either by the force or on a national database. This applies even if you were found to be innocent and released without charge. Unlike your fingerprints and DNA, there is no law requiring the automatic deletion of facial images or biometrics from police databases.
Facial biometrics may be the future of data security but they are also the future of surveillance. What keeps you private also leaves you exposed.
If you want to learn more about police use of facial biometric recognition technology, and to call for the automatic deletion of innocent people’s custody images and facial biometrics, please visit the Big Brother Watch website, sign the petition, and write to your MP.