FTC Beat
Jul 18, 2013

Google Glass Sounds Exciting — But What About Privacy?

Beta testing is underway for Google Glass, a new technology that provides the functionality of a smartphone in a headset worn like glasses. Much like a smartphone, the Glass headset is able to exchange messages with other mobile devices, take pictures, record videos, and access search engines to respond to user queries. But unlike a smartphone, the device’s optical head-mounted display, voice recognition, and front-facing camera give users hands-free access to its features, including the ability to capture photographs and video recordings of the world in front of them.

For now, Glass is only available to developers and a small group of test users known as Google Explorers. The device is expected to go on sale to the public in 2014. In the meantime, public speculation swells, and the blogosphere is full of conflicting reports about what the device can and cannot do. Some suggest that the device will utilize facial recognition and eye-tracking software to show icons and statistics above people whom the user recognizes. A more common concern is that the device will be able to photograph and record what the user sees and then share that data with third parties without permission from the user or those whose likenesses are being captured.

Because of this lack of clarity, lawmakers around the world are urging Google to affirmatively address the myriad privacy concerns raised by this new technology. Last month, an international group of privacy regulators – including representatives from Australia, Canada, Israel, Mexico, New Zealand, Switzerland, and a European Commission panel – signed a letter to Google CEO Larry Page asking for more information about the company's plans to ensure compliance with their data protection laws.

Here in the United States, the House Bipartisan Privacy Caucus issued a similar letter of its own. In addition to a variety of questions regarding the device’s capabilities, the letters reference some of Google’s recent data privacy mishaps and ask whether Google intends to take proactive steps to ensure the protection of user and nonuser privacy.

Google’s Vice President of Public Policy and Governmental Relations (and former New York Congresswoman) Susan Molinari issued a formal response to the House Bipartisan Privacy Caucus. According to Molinari, Google “recognize[s] that new technology is going to bring up new types of questions, so [they] have been thinking carefully about how [they] design Glass from its inception.”

To address concerns about the picture and video capabilities, Molinari highlighted several features designed to “give users control” and “help other people understand what Glass users are doing.” For example, specific user commands are required to search the Internet or find directions, and the user must either press a button on the arm of the Glass or say “Take a photo” or “Record a video” in order to access those features.

Molinari’s letter plainly states that Google’s privacy policies will not change for Glass. Instead, Glass will be governed by the terms of Google’s existing privacy policy. However, the company has created policies for developers making Glass apps, also known as “Glassware.” For example, developers will not be allowed to incorporate facial recognition into their Glassware, and they are prohibited from disabling or turning off the display while the camera is in use. Glassware developers must also agree to the terms of service for Glass’s application programming interfaces (APIs) and must maintain and follow their own privacy policies. We are carefully observing Google’s actions to see whether they are in keeping with the company’s promises, and we encourage all Glassware developers, as well, to comply with Google’s policies.

posted in:
Privacy
Nov 12, 2012

Policing the Wide, Wild New World of Biometrics

Progress in the world of biometrics should cause us all to shudder. Cameras in public locations can now employ facial recognition to direct advertising to us based upon an assessment of our age, sex, and other characteristics. Cameras can determine our reaction to and engagement in video games and movies. It sounds a bit like a world composed of two-way mirrors. But instead of shuddering, we sometimes knowingly, sometimes carelessly, support the technology – and other data collection practices – through our online and commercial activities.

How many of us constantly update and tag our Facebook pages with pictures of ourselves and our loved ones and the places we’ve been? How many take advantage of product and service discounts by scanning our smartphones and “liking” products on Facebook? How many of us are now buying into dating apps and social apps that are built on facial recognition technology? The fact is that much of our data can be, and is being, collected, and we consumers (especially in the United States) seem to have no problem with it – we even volunteer for it.

Perhaps fortunately, some regulators are stepping in, keeping a watchful eye on these developments and looking for ways to curb the potentially nefarious use of consumer data. The FTC and its Division of Privacy and Identity Protection recently published a list of best practices for companies that use facial recognition technologies. The publication, “Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies,” underlines important concerns about the ability to identify anonymous individuals in public and about attendant security breaches such as hacking. The FTC’s proposed best practices include the following:

• Companies should maintain reasonable data security protections to prevent unauthorized information “scraping” of consumer images and biometric data.
• Companies should maintain appropriate retention and disposal practices.
• Companies should consider the sensitivity of information when developing facial recognition products and services, e.g., they should avoid placing camera-equipped digital signs in sensitive areas, such as bathrooms, locker rooms, health care facilities, or places where children congregate.
• Companies using digital signs capable of demographic detection should provide clear notice to consumers that the technologies are in use, before consumers come into contact with the signs.
• Social networks should provide users with clear notice – outside of a privacy policy – about how the feature works, what data it collects, and how it will use the data.
• Social networks should provide consumers with (1) an easy-to-find, meaningful choice not to have their biometric data collected and used for facial recognition; and (2) the ability to turn off the feature at any time and delete any biometric data previously collected.
• Companies should obtain a consumer’s affirmative express consent before using a consumer’s image or any biometric data in a materially different manner than they represented when they collected the data.
• Companies should not use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, without obtaining the consumer’s affirmative express consent.

The guidelines come only a few months after the FTC’s March 2012 Privacy Report (“Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers”) and are a logical follow-on to that report. They incorporate the Privacy Report’s core principles: privacy by design, simplified consumer choice, and transparency. These principles and guidelines are a step in the direction of responsible data collection and responsible technological advancement.

We should point out that neither the Privacy Report nor the facial recognition best practices is binding or enforceable, as they do not fall within the FTC’s legal authority. The FTC prominently makes this disclaimer, noting that the guidelines are merely recommendations without the force of law. It is clear, however, that the FTC is appropriately preparing to assume enforcement authority should Congress pursue privacy legislation (something the FTC recommends in the Privacy Report). That is evident from the mere fact that the agency has established a Division of Privacy and Identity Protection.

Companies that are developing or seeking to employ biometrics – or that engage in other data collection practices – would be well advised to pay attention to the FTC’s recommendations. The guidelines provide insight into how an enforcement authority is likely to approach biometrics and other data collection practices, and they provide a framework for responsible use of consumer data. And even though consumers currently seem passive or dismissive about biometrics and data collection, it would take just one scandal or highly publicized incident for public opinion to change. Companies will benefit in the long run by building goodwill among consumers.

posted in:
Privacy

About Ifrah Law

Crime in the Suites is authored by the Ifrah Law Firm, a Washington DC-based law firm specializing in the defense of government investigations and litigation. Our client base spans many regulated industries, particularly e-business, e-commerce, government contracts, gaming and healthcare.

Ifrah Law focuses on federal criminal defense, government contract defense and procurement, healthcare, and financial services litigation and fraud defense. Further, the firm's e-commerce and internet marketing attorneys are leaders in internet advertising, data privacy, online fraud and abuse law, and iGaming law.

The commentary and cases included in this blog are contributed by founding partner Jeff Ifrah, partners Michelle Cohen, David Deitch, and associates Rachel Hirsch, Jeff Hamlin, Steven Eichorn, Sarah Coffey, Nicole Kardell, Casselle Smith, and Griffin Finan. These posts are edited by Jeff Ifrah. We look forward to hearing your thoughts and comments!

