“I See You” – the Advent of Facial Recognition

Life lags a little behind CSI and the other forensic fantasies on the tube — but not so far behind as you might imagine. We’ve all heard about government-employed facial recognition software that, in theory, can pick a putative bad guy out of the madding crowd. Now that sort of knowing eye is headed everywhere, thanks to the race to build facial recognition capability into mobile devices.

It’s all part of the current lust to know and be known — to “befriend,” “follow” and “share with” a wide range of people in (or near) one’s life. To see what will soon be commonplace, check out the video from Viewdle below:


Viewdle – Photo and Video Face Tagging from Viewdle

(This is a very cunning ad, it seems to me: in portraying strong and attractive young women, it appeals to both men and women; and, I’d suspect, women are the biggest target audience when it comes to sharing on social media. The music is both “kicky” and “creamy,” I’d say, suggesting things are carefree with the product: fun, no worries.)

All the big players are working on facial recognition. Google currently offers a version on its photo storage service Picasa, as does Apple in the latest version of iPhoto. As well, Apple recently acquired Polar Rose, which developed face detection software that could be useful on Apple’s iOS platform. And the rumour is that Google has an effective mobile facial recognition app in hand but is delaying its release because of real concerns over privacy.

At first it seems incongruous that social media sharing should be growing exponentially at the same time that privacy laws are being implemented and elaborated. Of course, it’s one thing to have every Tom, Amir, and Juan know at each moment where you are and what you’re doing, and quite another to have the government possess the same data. But that separation is less effective in practice than we might wish — just ask Julian Assange. Some privacy laws, however, direct themselves at you and me (and our friendly, neighbourhood corporations), and not just at governments. For an example that’s quite relevant here, you might consider Quebec’s concern about photographing citizens in public places: Aubry v. Éditions Vice-Versa inc., [1998] 1 S.C.R. 591.

I think it’s fair to say we are conflicted when it comes to opening our lives to fellow citizens, even to friends. To borrow an imperfect comparison, we might say that the urge to share is the “id” and the concern about privacy is the “superego,” always in conflict as the poor “ego” strives to find a balance.

As technological innovation races ahead, our legal response chugs along behind, losing ground. We know this is coming and will soon be as widespread as smart phones. What legal reaction is being considered?

Comments

  1. Simon,

    Interesting post and quite a catchy video, if also somewhat disturbing. Honestly, I don’t think regulators (of any sort) are equipped to handle the pace of technology in the context of social media and sharing. Perhaps it is time to stop talking about how the legal system is going to regulate or respond to privacy issues and begin talking about what life looks like without any clothes on?

  2. David Collier-Brown

    The elephant in the facial recognition room is its ability to mis-recognize people, given a bad or broad enough set of samples. A former employer developed facial recognition software for the German federal police, who were hoping to use it to pick particular individuals out of the boarding lines at an airport. To their dismay, my Smarter Colleagues found that facial recognition was excellent for picking grandma out of a collection of family pictures, but would also happily pick her out of the crowd when told to search for Osama bin Laden.

    This makes it substantially riskier to depend upon in contexts where the operators don’t manually verify the identification, but instead engage in computer worship and assume that a program running on a mobile phone will be more accurate than a much more careful program running on a massive array processor.

    This raises some real privacy concerns, as well as issues for the courts when they have to deal with evidence whose validity is highly dependent on the quality of a mathematical algorithm and the makeup of a sample set as random as the crowd at an airport.
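    The airport problem described above is, at bottom, a base-rate problem: even a very accurate matcher, scanned across a large crowd of innocents, produces mostly false alerts. A minimal sketch of the arithmetic, using purely hypothetical accuracy figures (no real system is being described):

    ```python
    # Illustrative only: hypothetical numbers showing the base-rate problem
    # behind crowd-scanning face recognition, not any real system's figures.

    def expected_alerts(crowd_size, wanted_in_crowd, tpr, fpr):
        """Expected true and false alerts when scanning a crowd.

        tpr: true-positive rate (chance a wanted person is flagged)
        fpr: false-positive rate (chance a bystander is flagged)
        """
        bystanders = crowd_size - wanted_in_crowd
        true_alerts = wanted_in_crowd * tpr   # wanted people flagged
        false_alerts = bystanders * fpr       # grandmas flagged as bin Laden
        return true_alerts, false_alerts

    # Assume a very good matcher (99% TPR, 0.1% FPR) scanning 100,000
    # boarding passengers that include exactly 1 wanted person.
    true_alerts, false_alerts = expected_alerts(100_000, 1, 0.99, 0.001)
    print(true_alerts, false_alerts)  # ~1 true alert vs ~100 false alerts

    # Fraction of alerts that actually point at the wanted person:
    precision = true_alerts / (true_alerts + false_alerts)
    print(round(precision, 3))  # under 1% of alerts are the right person
    ```

    In other words, under these assumed rates an operator who trusts every alert will be wrong roughly ninety-nine times in a hundred, which is why unverified matches are such shaky evidence.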

    –dave