Dec 2, 2013

Google’s Anti-Facial Recognition Policy for Glass is Deadly

Posted by in categories: ethics, information science, policy, privacy

I believe Google is making a huge mistake in completely banning facial recognition systems for its Glass product. In my opinion, such a system could be used to help save thousands of lives. But we’re so damn caught up in absolute privacy that we’re willing to sacrifice actual, physical lives to ensure our privacy remains untainted. Such individualist dogma is deadly.

According to the Amber Alert webpage, “A child goes missing in the United States every 40 seconds,” and “More than 700,000 children go missing annually.” That is an absolutely frightening statistic! Much more frightening than the prospect that some Glass user may know my name.

How far are we willing to go to ensure absolute privacy isn’t diminished whatsoever? When does the right of privacy begin interfering with the right of safety? Can the two come together in harmony, or are they destined to be in conflict until society finally reaches a decision over one or the other?

I understand the desire for privacy, but as I’ve argued in the past, as we as a society become more public and technologically open-source, the idea of privacy slowly fades away. That isn’t to say that some forms of privacy can’t be maintained. Surely we should have the right to say ‘yes’ or ‘no’ over whether or not our private data is to be shared publicly. That level of freedom and choice could easily maintain a sense of privacy for each individual.

But then, when it comes to missing children, or even missing adults, should we not be willing to sacrifice a portion of our privacy to ensure the safety of those who’ve gone missing? It doesn’t even have to be that large a peek into each person’s private life — simply a facial recognition map, a name, and whether or not they’ve been reported missing, or even possibly wanted.

Picture this with me: It’s 2014 and only a few months have passed since the commercial launch of Google Glass. Hundreds of thousands of people have already acquired their own devices, scattered across the United States. A mandatory app was included with Glass, connected to the Amber Alert system. The app has Glass quietly scanning each face you cross paths with, but it doesn’t reveal any names, nor does it alert you that it’s scanning. For all you know, it’s a normal day like any other.

Now, as you’re walking down a street, you walk past an adult male with a pre-teen female. You don’t even pay much attention to them. Just two more people walking by, as far as you’re concerned. But Glass knows something you don’t — the little girl has been reported missing. As a result, without alerting you, the app quietly takes a snapshot of the girl and her unknown male captor, contacts a 911 operator program, and delivers the GPS coordinates of where the photo was taken and the direction in which the girl was walking. The police show up, arrest the male captor, and contact the parents of the missing child to inform them that she has been found and is safe.

This was able to occur because a parent — or family member, guardian, etc. — had allowed the missing child’s name and facial recognition map to be archived in an Amber Alert system program, which connects to the app on Glass. Was said child’s “privacy” diminished? Yes. But she’s also alive because of it, and a kidnapper has been taken off the streets, unable to harm anyone else again.
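To make the mechanics of that hypothetical a little more concrete, here is a minimal illustrative sketch of the kind of matching the scenario assumes. Everything in it is hypothetical: the watchlist format, the embedding representation, the similarity threshold, and the alert step all stand in for whatever a real Glass app or Amber Alert integration would actually use.

```python
# Purely illustrative sketch; none of these names reflect a real Glass or
# Amber Alert API. It shows the design described above: compare each observed
# face against an opt-in registry and report only when a match is found.
from dataclasses import dataclass
from math import sqrt


@dataclass
class WatchlistEntry:
    name: str               # archived by a parent or guardian who opted in
    embedding: list[float]  # the child's facial recognition "map"


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def check_face(observed: list[float], watchlist: list[WatchlistEntry],
               gps: tuple[float, float], threshold: float = 0.92) -> None:
    """Silently compare one observed face against the registry."""
    for entry in watchlist:
        if cosine_similarity(observed, entry.embedding) >= threshold:
            # In the scenario, this step would send a snapshot and GPS
            # coordinates to a 911 operator program; here it just reports.
            print(f"MATCH: {entry.name} is reported missing; seen near {gps}")
            return
    # No match: nothing is stored and the wearer is never notified.
```

The design point the scenario relies on shows up in the last line: unless a face matches a registry entry that a guardian chose to submit, the system records nothing and the wearer sees nothing.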

Isn’t this very real prospect of technologically enhanced safety worth sacrificing a bit of our own privacy? While I’m not a parent, if anyone in my family were to go missing, their privacy would be the last thing I’d be concerned about. And if I’d gone missing, I’d want everyone to do all they could to find me, even if it meant sacrificing my own privacy.

Google Glass is coming just next year. And with Google’s determination to ban facial recognition on Glass, we must ask ourselves: At what price?

The article above was originally published as a blog post on The Proactionary Transhumanist.

8 Comments — comments are now closed.


  1. David Brin says:

    I agree completely, as expressed in my nonfiction book: “The Transparent Society: Will Technology Make Us Choose Between Privacy and Freedom?” A ban on face registries is expected in the short term. But people will realize that soon, elites of government, wealth or criminality will have access anyway. Our only hope for freedom will come when we all can see as well as those elites can.

    See:
    http://www.scoop.it/t/the-transparent-society

    With cordial regards,

    David Brin
    http://www.davidbrin.com
    blog: http://davidbrin.blogspot.com/
    twitter: https://twitter.com/DavidBrin

  2. As a person with Asperger Syndrome, I need a face-recognition app to remember people I don’t know well. If Google Glass can’t have one, I no longer have a compelling reason to want Glass.

  3. Paul Wakfer says:

    A good article, except that a penchant for privacy (except from government and its enforcers) is actually *not* individualistic, but rather simply short-range irrational. The long-range view of a true individualist is pride in one’s being and all one’s actions, and a desire for the positive or negative social preferencing by others which ensues. See my website for foundational details, and its advanced technology section for my proposal for constant cloud recording of one’s entire environment whenever one is outside one’s home.

  4. Ian Pearson says:

    Face recognition could certainly help somewhat with this particular problem, assuming that runaways are not disguising themselves and paedophiles are not aware of the system and therefore not hiding or disguising their victims. However, it is a false argument to suggest that we must choose either privacy or safety. Even if everyone were to be anonymous all the time, it would still be possible to protect people. It is also wrong to assume that all runaways want to be found and protected. Do they have no right to stay hidden if that is their true desire? Can emergency services not be issued with licensed cameras to help with such problems, if that’s what we think is appropriate? Using face recognition to identify paedophiles with minors would require extremely high reliability if many false positives are to be avoided.

    However, the main objection is the level of potential abuse of face recognition. It won’t only be used by benign systems. Sticking just with uses directly relevant to this issue, any paedophile or slaver (many thousands of vulnerable people are held and hidden away as slaves) would also see which children are missing and without protection, or which adults are vulnerable, so this would increase the danger for missing children and vulnerable adults, not decrease it. It would also force any children who genuinely want to vanish to resort to more extreme means.

    In short, it can’t solve the problem, and could even make it worse. The other problems raised by face recognition stack on top of this, but even on this one the idea makes no sense and on this occasion, Google is right.

  5. Mark Lockie says:

    I have seen facial recognition demonstrated on Google Glass — despite the ban. It is a “huge” way short of being able to do what you describe. Facial recognition in this sort of unconstrained environment would also be likely to produce a massive number of false positive hits, stretching law enforcement to the limit. In terms of privacy, I think we must be careful about being able to use such devices — especially if they become ubiquitous.

    Niche applications could be a good idea, for instance at checkpoints where a person’s ability to use two hands is essential, but for the wider public there would need to be some robust safeguards.

  6. Tihamer Toth-Fejel says:

    As an engineer, the first question I would ask is: *How* is Google going to ban face recognition software from Glass? I don’t think it will be possible to enforce.
    As a citizen, the first question I would ask is: Is privacy really a right? No, it is not. You have no privacy in a family, or in a small town, but you certainly have a right to life and liberty in both cases. The desire for privacy is understandable because a loss of privacy is, in a sense, a forced one-way intimacy; however, if it is two-way, then it is less likely to be forced. Besides, evil loves anonymity. The loss of anonymity (the same you would lose in a small town) is a small price to pay for the added security and freedom you gain from knowing the identity of everyone you see.

  7. Normal Liberal Human Being says:

    Fast-forward to 2018. Imagine quick scans, imagine instantaneous scans: Glass 2.0 or 3.0.

    Now imagine:
    - Sex Scan App: uses search software and advanced recognition software to immediately search out and flash up any nude or lewd pictures on the internet of everyone you pass on the street. An alternative app could synthesize a naked pic, to give you a general idea.

    - Embarrass Scan App: similar, but displays any embarrassing or drunk photos of everyone you pass on the street.

    - Religion Scan App: displays the religious affiliation of everyone you see, based on any Facebook declarations, hints or insinuations across a wide range of social media

    - Political Scan App: same as above, but displays the political stance of everyone you see (maybe summed up in a single fun graphic — like Dick Cheney’s fat head).

    Now fast-forward another 5 years, to the point where many people have gotten used to the idea that others are continuously running one or more of these apps. Or an app that displays all of this in a fun way. Imagine yourself buying a cup of coffee, or saying Hi on the street. Or interacting on a college campus. You’re not just being mentally undressed; you’re being seen at your worst at all times. How cool is that? People in a small town don’t literally see you drunk or naked WHILE you are buying a donut from them — even if they did happen to be at that Christmas party 12 years ago. But with Glass, everyone will. We might as well all get married.

    Since porn and bigotry are already primary uses of search engines for very many people, I think we just have to assume that this is where the road is going. It will happen. I just hope we have at least, say, 2 or 3 years to prep for it. But I actually doubt that we have that long. And I worry about my kids. I don’t know if they’re actually less likely to be kidnapped when a good quarter of everyone is walking around the street in a sea of porn.

  8. If your boss can watch you and you can watch your boss, it does not balance out: rather, it gives your boss more power over you. See http://ieet.org/index.php/IEET/more/stallman20121208.

    Beyond that, privacy is necessary for democracy. See http://www.gnu.org/philosophy/surveillance-vs-democracy.html. To make whistleblowers and democracy safe, we must redesign systems so that they do not accumulate dossiers about everyone.

    One way to do this is to make them remember only people who are specifically sought on legal grounds.

    For instance, if face recognition systems and license-plate recognition systems can only “see” people and cars that are being sought, under a court order, plus invalid plates, they would do some good while respecting most people’s privacy. They would be able to spot kidnapped children, to the extent such systems have any chance at it.

    I doubt a kidnapped child will walk past you on the street with a captor, acting as if everything were ok, such that you would need face recognition to tell you to call 911. But that’s a side issue.