Concerns Raised by Facial Biometrics

Coupling Biometrics, Social Media Creates Anxiety over Privacy
"Facial recognition technology can be used without the knowledge or the consent of the individual, to be totally oblivious," Beth Givens, founder and director of the Privacy Rights Clearinghouse, a privacy advocacy organization, says in an interview with GovInfoSecurity.com (transcript below). "Yet, once you identify that person based on the unique characteristics of their face, you could then match it with other databases."
Use of facial biometrics could affect a wide range of people. For example, Givens says, protesters could easily be identified at an assembly. Shoppers could be targeted based on their shopping habits. And customers at banks could be given preferential treatment.
To make her point, Givens cites a study conducted by Carnegie Mellon University in which researchers, using only a photo of a person's face and information made publicly available online, identified the person's birth date, personal interests and Social Security number.
"Once you know a person's name, birth date, and social security number, you have enough information to commit new account fraud or identity theft," Givens says.
For information security officers in banking, government and healthcare, using biometrics as a possible tool to protect their critical data is fine if such applications are backed up with solid privacy and security policies and practices, Givens says.
In the interview, Givens explains that use of facial recognition technology could:
- Violate privacy rights by not getting an individual's consent.
- Result in unequal treatment of consumers by businesses.
- Encourage stalking and violence.
Givens founded the Privacy Rights Clearinghouse in 1992. She developed the clearinghouse's Fact Sheet series that addresses a wide variety of privacy matters. Givens also authored the encyclopedia entries on identity theft for Encyclopedia of Privacy, World Book Encyclopedia and Encyclopedia of Crime and Punishment. She also authored The Privacy Rights Handbook: How to Take Control of Your Personal Information (Avon, 1997) and co-authored Privacy Piracy: A Guide to Protecting Yourself from Identity Theft (1999). She contributed a chapter on consumer and privacy rights to the 2006 book, RFID: Applications, Security and Privacy.
Privacy Rights Clearinghouse

ERIC CHABROW: Before we discuss your concerns with facial recognition technology, please take a few moments to tell us about the Privacy Rights Clearinghouse.
BETH GIVENS: We're not a new organization. We were founded nearly 20 years ago, before the Internet actually, and before a lot of these emerging technologies came on the scene. We started as a California-only non-profit consumer education and consumer advocacy group. Since our website went online in 1996, we have been a nationwide group with a two-part mission. We do consumer education. We are kind of a "Dear Abby" of privacy in that we invite people to contact us with their questions and their complaints. We learn a lot just by talking directly with people. Then secondly, we are involved in some privacy advocacy in the California state legislature.
Facial Recognition Threats

CHABROW: The Privacy Rights Clearinghouse cautions that facial recognition technology, especially as it becomes more sophisticated, may be one of the greatest privacy threats of our time. How so?
GIVENS: I have kind of a mantra in terms of privacy rights, and that is: individuals deserve transparency and they need control. Those two key words - transparency and control - say a lot. Facial recognition technology can be used without the knowledge or the consent of the individual; it is totally invisible, and the individual can be totally oblivious. Yet once you identify that person based on the unique characteristics of their face, you could then match it with other databases. You could connect online information to offline information. There are a lot of possibilities in terms of where that simple capture of one's face will lead.
CHABROW: Can you give an example of that?
GIVENS: I'll give you two. One is sort of in the public arena and one is in the commercial arena. Let's just say that you are demonstrating at a public event. You may not like something that the government has done. You may be against a certain law, or a certain proposal, and you are out in public at an event. Those individuals who are participating in that event could have their faces captured, say, by law enforcement or other government agencies and then be identified in that way. That's kind of the constitutional side of the issue. I think most people when they're out in public take for granted that they're anonymous. I think they know that when they go into a commercial space, say a store, there are video cameras all over the place taking their photo, but now I'm moving over to the commercial sector application. You could actually be identified when you walk into a store, and if that store has a database on you - let's just say you're a frequent shopper, or maybe even this is the first time and the store is cooperating with an alliance of merchants - you could be identified. They might know then: are you an impulse shopper? What sorts of items are you likely to buy? What is your income level? You might be treated a certain way based on those things that they know about you.
Or let's just say it's a bank. You walk in and they're able to immediately identify you as a top-notch, important, valued moneyed customer, and you might get shuttled to the front of the line or to a special area for preferred customers. What if you are somebody coming in who just has a small account and a few transactions? You might not get served as well.
Another commercial application that we are particularly concerned about is price discrimination. You might be offered one price, and this is particularly true in the online arena, if you have a certain profile and another price if you have another. There are some, I think, tremendous privacy implications both on the constitutional side of that privacy dividing line and the informational privacy side.
Facebook & Facial Recognition

CHABROW: In the press release you put out, you mentioned something about Facebook and combining that with other kinds of technologies including facial recognition. Can you discuss your concerns with that?
GIVENS: Yes. In fact, let me refer to the Carnegie Mellon University study that really piqued our interest in this issue. They actually took Facebook photos. Now they didn't use the Facebook facial recognition technology. All they did was they went on Facebook and retrieved photos to then match against photos that they obtained from a different site, some online dating sites where people were not named. They took these essentially anonymous photos from the dating site and were able to match them to publicly displayed photos on Facebook using an off-the-shelf facial recognition software program called PittPatt (Pittsburgh Pattern Recognition). By the way, Google has since purchased PittPatt, which I think is a significant matter. They were able to identify ten percent of all of those anonymous people from the dating site. In another experiment, I'm assuming on the Carnegie Mellon University campus, they took photos of students walking around on campus and were actually able to identify 31 percent of those. And then, I think even more fascinating, they took a photo of a person's face and all the information they could find publicly available online, and from that, and this is astounding, they figured out the person's birth date, their personal interests and their Social Security number. Now for me, since we've been involved in identity theft for a long time, once you know a person's name, birth date, and Social Security number, you have enough information to commit new account fraud or identity theft.
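The linkage step the study describes can be illustrated with a small sketch: represent each face as a numeric embedding vector and link an unnamed photo to the named gallery photo whose embedding is most similar. This is not the study's actual code; the toy vectors, the `gallery` dictionary and the `identify` helper are hypothetical stand-ins for what a commercial engine such as PittPatt would produce.

```python
# Simplified sketch of re-identification by face matching: an "anonymous"
# face is linked to a named profile by finding the most similar face
# embedding in a gallery of publicly posted, named photos. The embeddings
# here are toy 3-dimensional vectors; a real engine would produce much
# larger vectors from detected facial landmarks.

import math

def cosine_similarity(a, b):
    """Standard cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" of named, publicly displayed photos (the gallery).
gallery = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}

def identify(anonymous_embedding, gallery, threshold=0.9):
    """Return the best-matching name, or None if no match clears the threshold."""
    best_name, best_score = None, -1.0
    for name, emb in gallery.items():
        score = cosine_similarity(anonymous_embedding, emb)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# An unnamed dating-site photo whose embedding closely resembles Alice's.
print(identify([0.88, 0.12, 0.31], gallery))  # -> alice
```

The privacy point follows directly from the sketch: once the anonymous photo is linked to a name, every other database keyed to that name becomes reachable.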
CHABROW: Are you aware of any laws that prohibit or limit the use of facial recognition technology or are you aware of any bills before Congress or state legislatures that would restrict the use of facial recognition products?
GIVENS: Well I know in Europe this is a big deal. In Germany, I know they're very, very concerned about this, and I think they've demanded that Google not use this technology. The European scene is quite different in terms of privacy laws than the U.S. scene, however. On the commercial side of the fence, I am not aware of a law specifically stating that facial recognition is prohibited or limited in any way. I don't think we're there yet. This is just to the best of my knowledge, however. I think with the attention that this issue is getting, there could very well be some bills, especially at the state level, perhaps even in Congress, that would address this issue. I know that Congress has addressed location-based identification services related to the mobile phone, for example, and that's an example of an emerging technology that also has significant privacy implications. It wouldn't surprise me, with the additional interest in this issue, if there would be some attention paid, either at the state level or in Congress.
Biometrics & Protecting Assets

CHABROW: Our audience largely consists of those in government, healthcare, financial services and other industries responsible for safeguarding their digital and physical assets. Biometrics including perhaps facial recognition could be one of the tools in their arsenal to do just that, such as facial scanning to identify those authorized to enter secured buildings or access a database containing sensitive data. Do you see that as a problem?
GIVENS: I think if they back up those applications with good, solid privacy and security policies and practices, then they will be in good shape. I think they should also pay attention to the whole emerging technology of biometric encryption. There is the biometric template. That's essentially the long string of zeros and ones that relates to the shape of your face and the key identification points on your face. And of course, all of this is then stored in a database, and then other information about you is related to that long string of zeros and ones that identifies your face. It's the database, really, where all the action is. Using biometric encryption, I think, would be a very important thing for people who read and listen to your messages to consider, and they might want to look at what the Province of Ontario's Privacy Commissioner has been doing in that regard. Ann Cavoukian has certainly been leading the way on this issue.
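The template-and-database arrangement Givens describes can be sketched in a few lines. This is a simplified illustration, not any deployed scheme: real biometric encryption (fuzzy extractors, fuzzy vaults) is designed to tolerate the natural noise between two scans of the same face, while this toy version only quantizes the features and stores a keyed hash so the raw template never sits in the database. The key name and quantization step are assumptions for illustration.

```python
# Minimal sketch of the storage principle behind protected biometric
# templates. A template is a fixed-length feature vector derived from
# key points of the face; instead of storing it in the clear, the
# database keeps only a keyed hash of a quantized version, so a breach
# of the database does not directly expose the biometric itself.

import hmac
import hashlib

# Hypothetical key held by the enrolling organization, never stored
# alongside the protected templates.
SERVER_KEY = b"secret-key-held-by-the-enrolling-organization"

def quantize(template):
    """Round each feature so two close scans map to the same byte string."""
    return bytes(int(round(x * 10)) & 0xFF for x in template)

def protect(template):
    """Return a keyed hash to store in place of the raw template."""
    return hmac.new(SERVER_KEY, quantize(template), hashlib.sha256).hexdigest()

enrolled = protect([0.91, 0.10, 0.33])     # stored at enrollment time
probe = protect([0.912, 0.101, 0.329])     # a later, slightly noisy scan

print(hmac.compare_digest(enrolled, probe))  # -> True: same quantized face
```

The design choice being illustrated is the one Givens points at: the sensitive asset is the database, so the protection has to live there, not just at the camera.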
CHABROW: And do you know much about what is going on in Ontario?
GIVENS: They've been using it in the gaming and lottery industries, apparently for quite some time. As you probably know, when you walk into a casino you're giving up all of your privacy: you're on camera and identified from the moment you enter to the moment you leave. I think if you want to examine a case history of surveillance, take a look at the casino industry. Up in Ontario they've been working on encrypting all of the data compiled through digital surveillance and facial recognition biometrics in order to safeguard that data and prevent abusive uses of it.
CHABROW: Anything else you would like to add?
GIVENS: There's another key issue that I'm very concerned about. We're contacted from time to time by victims of stalking and domestic violence. I must say that one of the key concerns that I have is the potential to use facial recognition technology and identification applications to actually identify individuals and then stalk them. I know that Google has been reluctant to actually put out a facial recognition app for mobile phones, and I would hope that other companies would think long and hard about this particular matter before they actually do that. If you see somebody on the street who catches your eye and you are somebody who may be obsessive or with a stalking mentality, imagine the harm that could be done with this technology. It's another concern that I want to toss out, because I think we're going to see this as a growing problem.