Facing realities

Face-recognition technology has suddenly become much more powerful. That, as Tom Standage argues, has troubling implications for privacy

By Tom Standage

How would you feel if you got a message, addressing you by name, from a stranger who’d glimpsed you in a pub last night? Most people would find this creepy. But such a scenario is now a reality in Russia, thanks to FindFace, a smartphone app launched in February. It is, in effect, a real-world search engine for people: take a photo of someone with your phone, and the app can tell you who it is, with remarkable accuracy, by comparing the photo with profile pictures on VKontakte, a Russian social-networking site with about 200m users. In one test, FindFace correctly identified 70% of people on the St Petersburg metro. In another, carried out by Kaspersky Lab, a cyber-security firm, it identified nine out of ten people in the firm’s Moscow office.

FindFace styles itself as a high-tech dating app or, as such products invariably plead, “an innovative platform to find new friends”. If someone catches your eye, all you have to do is snap them to find out who they are. The app is powered by face-recognition software from NTechLab, another Russian startup, whose founders, Artem Kukharenko and Alexander Kabakov, say it is more versatile than other dating apps. You could, they suggest, feed the app an image of your favourite film star or an ex-partner, and it will give you a list of people who look similar. The app has already been downloaded nearly 650,000 times and has been used to perform more than 3m searches. NTechLab’s website says it is “focused on building software that makes the world a safer and more comfortable place”.

But FindFace is making many people feel just the opposite. Users of Dvach, a mischief-making Russian messaging board, have already used it to identify and harass women working as prostitutes and porn stars, informing their friends and family about their activities. And if FindFace doesn’t already sound like a gift to stalkers, consider its marketing, which shows the app being used to identify only young women. FindFace, for its part, promises that it will “thoroughly monitor its usage…and ban those organisations and people who will try to use it inappropriately”.

For NTechLab’s founders, dating is just the beginning: they say their technology has huge potential for law enforcement, identifying witnesses and suspects from photos or CCTV images. Crowdsourced sleuths have already used FindFace to alert the police to a pair of arsonists in St Petersburg. The company also sees scope for using FindFace in retail, targeting shoppers with ads based on their in-store behaviour. Face recognition offers (or threatens) to bring tracking and targeting, already routine online, into the real world. “Russian Face Recognition App Is a Cutting-Edge Privacy Nightmare”, wails one headline.

Don’t panic just yet, though. FindFace currently works only in Russia, where it takes advantage of the fact that VKontakte makes all profile pictures public. Facebook does not: it keeps as much of its users’ personal data as possible to itself. So only Facebook could build a similar app using Facebook profile images. It has chosen not to, though it does use face-recognition technology to help you tag people in photos and collate pictures of individuals, which has already led to legal trouble in a number of jurisdictions.

Facebook is not the only internet giant using such technology. Google, Tencent and Baidu have software that works almost as well. They all rely on an artificial-intelligence (AI) technique called “deep learning”, powered by neural networks modelled on the architecture of the brain. Nonetheless, NTechLab’s software, called FaceN, seems to have the edge. In the world championship of face recognition – yes, there really is such a thing, and it’s called MegaFace – FaceN beat everyone else, correctly identifying 73.3% of faces in a test database of a million images. Google, whose FaceNet came second with 70.5%, has deliberately not made its technology publicly available, seemingly because of privacy concerns. But there are millions of tagged images of people available on photo-sharing sites and social networks, so there is no technical reason why a Western version of FindFace could not be built.
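
To make that concrete, here is a minimal, purely illustrative sketch in Python of how such systems typically work (it is not NTechLab’s or Google’s actual code, and the names and numbers are placeholders): a deep network converts each face photo into a fixed-length list of numbers, an “embedding”, and a new face is identified by finding the closest embedding in a gallery of known people.

import numpy as np

# Hypothetical gallery: one 128-dimensional embedding per known person, as a
# deep face-recognition network might produce from each profile picture.
gallery = {
    "alice": np.random.rand(128),
    "bob": np.random.rand(128),
}

def identify(probe, gallery, threshold=0.6):
    # Compare the probe embedding with every known embedding and return the
    # closest name, or None if nothing is close enough to count as a match.
    def normalise(v):
        return v / np.linalg.norm(v)
    best_name, best_dist = None, float("inf")
    for name, emb in gallery.items():
        dist = np.linalg.norm(normalise(probe) - normalise(emb))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

# In a real system the probe photo would be passed through the same network
# to produce its embedding; here a random vector stands in for that output.
print(identify(np.random.rand(128), gallery))

The hard part, and the reason accuracy has jumped recently, is training the network that produces the embeddings; once they exist, matching a snapshot against millions of profile pictures is straightforward engineering.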

If the face-recognition genie is out of the bottle, should we perhaps welcome it? It could help find missing persons. You’d be able to work out who that familiar-looking chap is at a party. You could use your face instead of a password. And, anyway, people are already used to being tracked everywhere by their smartphones. Why is this any different? Well, you can opt out of surveillance by smartphone by turning your phone off or not using one at all, but you can’t stop using your face.

Imagine what authoritarian regimes might do with the technology: anyone who attends a protest march might be identified. Face recognition could completely destroy the expectation of anonymity in public – unless you wear a disguise, that is. A Japanese firm has started selling a visor that foxes facial-recognition systems; and Adam Harvey, an American artist, has devised hairstyles and make-up techniques that do the same thing. (Contouring is so passé.)

The debate over face recognition marks the culmination of a long series of arguments about personal privacy in the digital age: it is now possible to identify people even if they don’t use any technology at all. It’s also a reminder that concerns over AI are more immediate than you might have realised. Stephen Hawking, Elon Musk and others have raised fears that super-intelligent computers might take over the world, Terminator-style. But face recognition offers an example of the dangers of AI technology right now. The ethical and regulatory quandaries posed by AI are not theoretical: they are here already, inside your smartphone.
