The Federal Trade Commission recently filed another complaint against a company for alleged data security lapses. As readers of this blog know, the FTC has initiated numerous lawsuits against companies in various industries for data security and privacy violations, although it is facing a backlash from Wyndham and large industry organizations for allegedly lacking the appropriate authority to set data security standards in this way.
The FTC’s latest target is LabMD, an Atlanta-based cancer detection laboratory that performs tests on samples obtained from physicians around the country. According to an FTC press release, the FTC’s complaint (which is being withheld while the FTC and LabMD resolve confidentiality issues) alleges that LabMD failed to reasonably protect the security of the personal data (including medical information) of approximately 10,000 consumers, in two separate incidents.
Specifically, according to the FTC, LabMD billing information for over 9,000 consumers was found on a peer-to-peer (P2P) file-sharing network. The information included a spreadsheet containing insurance billing information with Social Security numbers, dates of birth, health insurance provider information, and standardized medical treatment codes.
In the second incident, the Sacramento, California Police Department found LabMD documents in the possession of identity thieves. The documents included names, Social Security numbers, and some bank account information. The FTC states that some of these Social Security numbers were being used by multiple individuals, indicating likely identity theft.
The FTC’s complaint alleges that LabMD did not implement or maintain a comprehensive data security program to protect individuals’ information, that it did not adequately train employees on basic security practices, and that it did not use readily available measures to prevent and detect unauthorized access to personal information, among other alleged failures.
The complaint includes a proposed order against LabMD that would require the company to implement a comprehensive information security program. The program would also require an evaluation every two years for 20 years by an independent certified security professional. LabMD would further be required to provide notice to any consumers whose information it has reason to believe was or could have been accessible to unauthorized persons and to consumers’ health insurance companies.
LabMD has issued a statement challenging the FTC’s authority to regulate data security, and stated that it was the victim of Internet “trolls” who presumably stole the information. This latest complaint is yet another sign that the FTC continues to monitor companies’ data security practices, particularly respecting health, financial, and children’s information. Interestingly, the LabMD data breaches were not huge – only 10,000 consumers were affected. But the breach of, and potential unauthorized access to, sensitive health information and Social Security numbers tends to attract the FTC’s attention.
While industry awaits the district court’s decision on Wyndham’s motion to dismiss based on the FTC’s alleged lack of authority to set data security standards, companies should review and document their data security practices, particularly when it comes to sensitive personal information. Of course, in addition to the FTC, some states, such as Massachusetts, have their own data security standards, and most states require reporting of data breaches affecting personal information.
Since 2003, online marketers and merchants have been gathering twice a year to take part in the Affiliate Summit Conferences. In recent years, Ifrah Law has become a fixture at these shows, and our associate Rachel Hirsch is not only widely recognized as the face of the Ifrah Law Power Booth station, but also as a well-respected and preferred attorney counseling online advertisers on compliance-related matters and representing them in nationwide litigation.
After Rachel recently returned from this year’s Affiliate Summit East conference in Philadelphia, we interviewed her about new and emerging trends at this conference and in the industry.
Q. What struck you about the crowd at the conference this year?
A. In addition to the new venue, there were plenty of new faces at the conference this year. Surprisingly, however, despite the conference’s name, there weren’t as many affiliates there as there have been in the past. Traditionally, affiliates, sometimes known as “publishers,” are independent third parties who generate or “publish” leads either directly for an advertiser or through an affiliate network. This year, with a reported crowd of about 4,000 people, the conference included more individuals representing networks, brokers, and online merchants than affiliates. (Official conference statistics bear this out: only 29 percent of attendees were affiliates.)
Q. What about vendors?
A. According to the organizers, one out of every 10 people there was a vendor. The term “vendor,” however, is something of a misnomer. A vendor can be another term for an online merchant – someone who is actually selling a product on the market – or it can be a generic category for marketers who do not fit into the traditional categories of affiliates, merchants, or networks.
Q. What new industry trends did you notice?
A. At every conference, one or two markets always seem to have a dominant presence. At the Las Vegas conference in January, there was a large turnout of marketers in the online dating space. This year, two different markets emerged: diet/health and downloads.
Some of the exhibitors this year were manufacturers of nutraceuticals, which can include weight-loss products or testosterone-boosting products. The trend seems to be for online marketers to “white label” or “private label” nutraceuticals from bigger manufacturers. What this means is that online marketers or advertisers actually attach their brand names to a product and product label that they purchase from a manufacturer, either based on their own formulations or based on the manufacturer’s product specifications. Well-known products that would fall into this category include Raspberry Ketone, Green Coffee Bean, and Garcinia Cambogia.
There were also a lot of individuals and companies there in the so-called “download” space. This often means the use of browser plug-ins that the consumer can download himself or herself. These can install targeted advertising (often pop-ups or pop-under ads) on an existing web page.
Q. Are there any risks involved in private labeling?
A. Definitely. If your name is on the label, it doesn’t matter that you didn’t manufacture the product. Your company and your label are subject to FTC scrutiny to the extent that you make claims about the product that you cannot substantiate. And beyond that, the Food and Drug Administration will also flex its enforcement power to the extent you or your manufacturer fail to institute good manufacturing practices, or “GMPs.” While many companies claim that they are GMP-certified, many do not have practices and processes in place to account for defective product batches, serious adverse events resulting from product use, or product recalls.
Q. What are some other hot areas of enforcement by the federal government?
A. Well, how you market your product may be as closely scrutinized as the underlying message. Online marketers who make outbound calls to consumers, or who engage third-party vendors (such as call centers) to make these calls can run afoul of the Telephone Consumer Protection Act. Under the TCPA, anyone who calls customers without their express advance consent, or who hires anyone else to do so, can be hit with a $500 fine for each violation. That adds up, and the TCPA can be enforced by the Federal Communications Commission or by private plaintiffs. Upcoming changes in the TCPA, which will be effective in October 2013, make it even harder to stay on the right side of the law.
Q. How would you put it all together as far as the legal issues?
A. It’s not just the FTC any more. These days, online marketers need to be aware of other agencies with broad enforcement powers, such as the CFPB, the FDA, and the FCC. And don’t forget about the threat of private consumer litigation.
On August 22, 2013, the U.S. Court of Appeals for the Third Circuit ruled unanimously that under the Telephone Consumer Protection Act (TCPA), consumers may withdraw their consent to have robo-callers call them. The full text of the opinion is available here.
The appeals court ruled in favor of Ashley Gager, who was contacted by Dell Financial Services after she revoked her prior express consent to be contacted. In 2007, Gager applied for a line of credit from Dell, which she received and upon which she later defaulted. Gager’s application for a credit line required that she provide her home phone number; in that field, she listed her cell phone number. After she defaulted on her credit line, Dell began calling Gager from an automated telephone dialing system. In 2010, Gager sent Dell a letter listing her phone number, which she did not indicate was a cell number, asking Dell not to call her anymore. Gager alleged that after receiving her letter, Dell called her cell phone using an automated dialing system approximately 40 times over a three-week period. The TCPA, among other things, bars companies from using an automatic telephone dialing system or a prerecorded voice to call mobile phones, absent prior express consent or an emergency.
The district court granted Dell’s motion to dismiss the complaint for failure to state a claim, holding that Gager could not revoke her prior express consent to receive calls. The district court held that because Dell did not qualify as a “debt collector,” the revocation rules under the Fair Debt Collection Practices Act (FDCPA) did not apply. Thus, the court reasoned that since the revocation rules were inapplicable and the TCPA is silent on revocation of consent, such a right did not exist. The court also noted that the Federal Communications Commission, which has the power to implement rules and regulations under the TCPA, had not issued any advisory opinions at the time that specifically addressed the right to revoke consent.
The Third Circuit reversed the district court’s ruling and found that consumers do have a right to revoke consent. The court rejected Dell’s argument that because the TCPA is silent as to whether a consumer may revoke consent to be contacted via an autodialing system, such a right to revoke did not exist. The Third Circuit’s opinion emphasized that the TCPA is a remedial statute that was passed to protect consumers from unwanted calls and should be construed to benefit consumers. Preventing consumers from revoking their consent to receive calls would not be consistent with the purpose of the statute.
The Third Circuit also noted that after the district court dismissed Gager’s claim, the FCC issued a declaratory ruling, In the Matter of Rules and Regulations Implementing the Telephone Consumer Protection Act of 1991, SoundBite Communications Inc., which primarily addresses other issues under the TCPA but also touches on consumers’ right to revoke express consent. The SoundBite decision notes that neither the text of the TCPA nor its legislative history directly addresses how prior express consent can be revoked, but also observes that “consumer consent to receive . . . messages is not unlimited.” The Third Circuit relied on the SoundBite decision in finding that a consumer may revoke consent after it has been given and that there is no temporal limitation on the revocation period.
Dell will still be able to call Gager regarding her delinquent account, but the TCPA prohibits it from using an automated dialing system to do so, since the statute bars autodialed or prerecorded calls to mobile phones without prior express consent (or in an emergency). Presumably, Dell can still contact Gager via live calls or through technology that does not amount to an automatic telephone dialing system.
In light of this decision in the Third Circuit, businesses should review their TCPA policies to ensure that they are complying with all rules and regulations. Additionally, on October 16, two additional changes to the TCPA rules will go into effect that impose stricter requirements for claiming exceptions to TCPA liability, and all TCPA policies should be reviewed to account for these changes. Businesses should also specifically review their TCPA policies to ensure that there is a procedure in place for consumers to opt out of receiving calls and text messages, even if they have previously provided consent. Taking and respecting opt-out requests is an important compliance practice that, if not followed, can lead to significant litigation, with potential damages and penalties.
The credit reporting industry – dominated by Experian, Equifax and Transunion – maintains a precarious balance of obligations: On the one hand, these companies bear a responsibility to banks and other businesses at large to retain reliable information to ensure that the credit scores they report are a fair representation of the individual’s credit-worthiness. On the other hand, federal law, including the Fair Credit Reporting Act, imposes an obligation upon the credit reporting agencies and other related companies to conduct reasonable investigations to address disputes about errors in individuals’ credit files. In both instances, the companies bear a weighty responsibility.
For this reason, companies in the credit reporting industry are subject to intensive regulatory scrutiny – historically by the Federal Trade Commission and, more recently, by the Consumer Financial Protection Bureau. Both agencies have issued reports on their studies of the way in which credit reporting companies handle the information entrusted to them, and how they respond to consumer disputes.
This past Sunday, CBS’s 60 Minutes – a show that most people associate with responsible news reporting – ran a segment that unfairly distorted these reports about credit reporting agencies’ compliance with their obligations. The show, which was largely based on an advance copy of an FTC study, relied upon selective interpretation of the data in that study, throwing out snippets of information without being specific on what the data meant.
The vast majority of the story can hardly be viewed as unbiased: interviews with a politically motivated state attorney general, two plaintiffs’ attorneys who spend their careers suing the credit reporting agencies, a handful of dissatisfied consumers, and several disgruntled former call center employees whose role in addressing consumer complaints was never really explained in a meaningful way. The result was a show clearly intended to convey a message that the credit data retained by these companies is riddled with errors, and that the credit reporting agencies fail to comply with their legal obligations to take steps when there is a claim of an inaccuracy.
In fact, as the Consumer Data Industry Association has pointed out, the FTC study shows that 98 percent of credit reports are materially accurate. In this regard, 60 Minutes missed the most critical point in the research – that the measure of accuracy is tied to the question of whether an error has consequences for consumers and not just whether there is an error that has little or no impact on credit scores. The FTC study actually concluded that only 2.2 percent of credit reports have an error that would lead to higher-priced credit for the consumer.
60 Minutes compounded its error by repeatedly asserting that it was “nearly impossible to expunge” an error in a credit report, and providing a forum for a state attorney general and two plaintiffs’ attorneys to assert that the credit reporting companies do not comply with their obligations under federal law. This one-sided treatment does not square with a 2011 study from the Political and Economic Research Council that showed that consumers were satisfied with the resolution of their disputes in 95 percent of the cases. It also does not square with the results of a year-long study of the dispute process by the FTC in which the agency found no violations of law.
It is not hard to understand what motivated 60 Minutes to run this story: Because everyone has a credit score, an inflammatory story about credit scores is likely to get everyone’s attention. But the one-sided and distorted way in which 60 Minutes presented this information was a disservice to the public. And even if credit reporting agencies are not perfect, they deserve better treatment at the hands of those who have the public’s ear.
On October 16, 2013, two changes will go into effect in the rules implementing the federal Telephone Consumer Protection Act (TCPA). Importantly, these rules impose stricter requirements on mobile messaging and prerecorded telemarketing calls. The rule changes, announced back in February 2012, may spur further litigation concerning the scope of the TCPA. All businesses should review the new requirements to ensure compliance or risk significant potential litigation expenses and negative publicity.
TCPA litigation has been increasing significantly in recent years. The number of TCPA-related cases filed in 2012 increased by 34 percent compared to 2011 and was more than three times the number of cases brought in 2010. Part of what is fueling the uptick in TCPA litigation is the increasing use of mobile messaging, combined with the enormous potential damages available under the statute. Every individual text, call, or fax that is found to violate the TCPA can result in damages of $500 to $1,500, and there is no limit on the number of violations that can be included in an individual suit. The Federal Communications Commission (FCC) and state attorneys general, as well as private litigants, may enforce the TCPA.
Some major companies have been hit with significant penalties under the TCPA. In May, Papa John’s International agreed to pay $16.5 million as part of a settlement of a TCPA class action stemming from claims that the company sent unsolicited text messages to more than 200,000 people through a third-party marketer. Steve Madden and Domino’s Pizza have also both reached settlements this year, each agreeing to pay nearly $10 million to resolve TCPA claims.
The two changes going into effect in October are as follows. First, one exception from TCPA liability for phone calls or text messages made using an autodialer or a prerecorded message is “prior express consent.” Under the FCC’s new interpretation of this exception, with limited exceptions, a business can invoke prior express consent for autodialed or prerecorded calls to a mobile phone, or for prerecorded telemarketing calls to a residential line, only if the called party has physically or electronically signed an agreement that clearly authorizes that particular sender to make calls or send texts to the party’s phone number. Additionally, signing the agreement must be optional and cannot be tied to the purchase of any goods or services.
The other significant change to the TCPA rules is the elimination of the “established business relationship” exception for prerecorded telemarketing calls to residences. Previously, businesses could avoid TCPA liability for prerecorded telemarketing calls that otherwise were prohibited by claiming that they had an established business relationship with the consumer by virtue of a previous purchase or other business interactions. The new regulations have eliminated this exemption, meaning businesses are now required to obtain written consent for all prerecorded telemarketing to residential phone numbers, even those that are for previous customers. With this change, the FCC followed the Federal Trade Commission (FTC), which made a similar express consent requirement under the Telemarketing Sales Rule for prerecorded telemarketing calls a few years ago.
As some of the recent cases have shown, businesses can face enormous potential liability under the TCPA, including liability for the actions of third-party marketers acting on their behalf. The statistics demonstrate that plaintiffs’ lawyers are aggressively pursuing TCPA actions, and the changes in the rules may lead to yet more TCPA cases. Given the changes that will go into effect in October, businesses should review their TCPA policies to ensure that they are in compliance, so that they can avoid the possibility of paying onerous penalties.
This week the Federal Trade Commission entered into a consent decree with Certegy Check Services, one of the nation’s check authorization service companies, pursuant to which Certegy has agreed to pay $3.5 million to settle charges that it violated the Fair Credit Reporting Act (FCRA). This massive penalty – the second largest ever – reinforces the perception that the FTC will continue vigorous enforcement against what it perceives as violations of that venerable statute, first passed in 1970.
The FCRA establishes obligations not only for the three big consumer reporting agencies (CRAs) – Experian, Transunion, and Equifax – but also for “nationwide specialty consumer reporting agencies.” These are CRAs that compile and maintain files on consumers on a nationwide basis relating to medical records or payments, residential or tenant history, check writing history, employment history, or insurance claims. Certegy, which falls within this latter category of entities subject to the FCRA, was obligated to “follow reasonable procedures to assure maximum possible accuracy” in the reports it provided concerning consumers’ financial information, and was also obligated to investigate any consumer dispute regarding such information within a reasonable period of time, to report back to the consumer, and to delete any information that is inaccurate, incomplete, or unverifiable.
While Certegy is not as well known to consumers as the big three credit reporting agencies, it plays an important role in consumer transactions. When people want to pay by check, many businesses rely on Certegy for a check authorization recommendation that is based in part on information in its files about consumers’ check writing history. Certegy also furnishes information to other credit reporting agencies, which may multiply the effect of any inaccuracies.
The FTC alleged in its complaint that Certegy failed to comply with many of its obligations as a nationwide specialty consumer reporting agency. Among other things, the FTC asserted that Certegy would not undertake the required reinvestigation of allegedly inaccurate information, and would place unfair burdens on consumers in connection with requests for such reinvestigations. The FTC states that this is its first case alleging a violation of the so-called “Furnisher Rule” relating to regulations governing such entities that furnish credit report information on consumers.
The stipulated order will certainly change the way that Certegy does business but, perhaps even more important, the $3.5 million penalty should attract the attention of other entities whose businesses are subject to the FCRA. Such businesses would be wise to revisit their policies and procedures to confirm that they comply with the obligations under the statute and related regulations, so that they are not the next target of FTC enforcement in this area.
Manufacturers and marketers know that the more consumer data they have, the more they can tailor and direct their advertising, their products, and their product placement. This helps them to maximize sales and minimize costs. Thanks to the combination of cheap data storage and ubiquitous data capturers (e.g., smart phones, credit cards, the Web), the amount of consumer data out there to mine is astounding. Hence the recently-popularized term, “Big Data.”
But the misuse of data could result in government enforcement actions and, more importantly, serious privacy violations that can affect everyone.
Some of the practical challenges and concerns flowing from the use of big data were addressed recently by FTC Commissioner Julie Brill at the 23rd Computers, Freedom and Privacy conference on June 26. Issues raised include noncompliance with the Fair Credit Reporting Act and consumer privacy matters such as transparency, notice and choice, and deidentification (scrubbing consumer data of personal identifiers).
The FCRA: Those whose business includes data collection or dissemination should determine whether their practices fall within the boundaries of the FCRA. As Brill pointed out, “entities collecting information across multiple sources and providing it to those making employment, credit, insurance and housing decisions must do so in a manner that ensures the information is as accurate as possible and used for appropriate purposes.” If Brill’s comments are any indication of enforcement actions to come, businesses should be aware that the FTC is on the lookout for big data enterprises that don’t adhere to FCRA requirements.
Consumer Privacy: Brill gave some credit to big data giant Acxiom for its recent announcement that it plans to allow consumers to see what information the company holds about them, but she noted that this access is of limited use when consumers have no way of knowing who the data brokers are or how their information is being used. Brill also highlighted a questionable use of consumer data by national retailer Target: the somewhat funny yet disturbing news story about the retailer identifying a teen’s pregnancy. It is a classic example of why consumers ought to have notice of what data is being collected about them and how that information is being used.
Consumers also need to have, as Brill suggested, the opportunity to correct information about themselves. This makes sense. Data collection is imperfect: different individuals’ information may be inaccurately combined, someone’s information may have been hacked, someone could be the victim of cyber-bullying, and other mishaps and errors can occur. Consumers should be able to review their information and correct errors. Finally, Brill highlighted concerns that current efforts to scrub consumer data may be ineffective, as companies are getting better at combining disparate data points to accurately identify an individual. “Scrubbed” data in the wrong hands could be as harmful as a direct security breach.
Brill encouraged companies to follow the “privacy by design” recommendations issued by the FTC in order to build more protections into their products and services. She further emphasized her initiative, “Reclaim Your Name,” which aims to promote consumer knowledge of, and access to, the data collected about them. Companies that are in the business of data collection, mining, and analytics should take note of the FTC’s efforts to empower the consumer against the overuse or misuse of consumer data. If you want to stay on the good side of the FTC – and on the good side of the informed consumer – work with the consumer, and provide meaningful notice, choice, and consent.
Following a public comment period, the Federal Trade Commission recently approved a final order settling charges against mobile device manufacturer HTC America, Inc. HTC develops and manufactures mobile devices based on the Android, Windows Mobile, and Windows Phone operating systems. This case, which focuses on device security, is the FTC’s first case against a device manufacturer.
The FTC alleged that HTC failed to take reasonable steps to secure the software it developed for its smartphones and tablet computers. According to the FTC, HTC’s failures introduced various security flaws that placed consumers’ sensitive information at risk. The FTC’s action against HTC signals the agency’s continued focus on data security and data privacy issues and its use of its broad “Section 5” authority, which the FTC has repeatedly asserted against various organizations, including in its ongoing litigation with Wyndham Hotels. The HTC case also reiterates the agency’s strong interest in securing mobile devices, now that mobile phones, which are full of sensitive contact, financial, and other personal information, have become so prevalent.
Companies may be asking what HTC actually did to warrant this FTC action. The FTC claims that HTC, when customizing the software on mobile devices, failed to provide its staff with sufficient security training, failed to review or test the software on its mobile devices for potential security vulnerabilities, failed to follow commonly accepted secure coding practices, and did not have a process for receiving and addressing vulnerability reports from third parties.
In particular, the FTC asserted that HTC devices potentially permitted malicious applications to send text messages, record audio, and install additional malware onto a consumer’s device, without the user’s consent or even knowledge. These malicious applications allegedly could access financial and medical information and other sensitive information such as a user’s geolocation and text message content.
In particular, in the case of Android devices, the FTC claimed that HTC pre-installed a custom application that could download and install applications outside the normal Android installation process. However, HTC did not include appropriate permission-check code to protect this pre-installed application from unauthorized use. Consequently, a third-party application could command the pre-installed application to download and install additional applications onto the device without the user’s knowledge or consent.
The FTC further charged that HTC’s actions actually undermined Android consent mechanisms that, but for HTC’s actions, would have prevented unauthorized access and transmission of sensitive information. The FTC’s complaint alleged that the vulnerabilities have been present on approximately 18.3 million HTC devices running Android. The complaint further alleged that HTC could have prevented these vulnerabilities through readily available, low-cost measures, such as adding a few lines of permission check code when programming its pre-installed applications.
In a precedent-setting remedy, the FTC’s final order requires HTC to develop and release software patches within 30 days of service of the FTC’s final order on HTC. The patches must fix vulnerabilities in millions of HTC’s devices, including every covered device having an operating system version released on or after December 2010. HTC must also establish a comprehensive security program designed to address security risks during the development of HTC devices. The FTC requires the program to include consideration of employee training and management; product design, development and research; secure software design and testing; and review, assessment, and response to third party security vulnerability reports.
Further, HTC must undergo independent security assessments every other year for the next 20 years. Among other requirements, the independent professional assessment must certify that HTC’s security program operates with sufficient effectiveness to provide reasonable assurance that the security of covered device functionality and the security, confidentiality, and integrity of covered information are protected, and that the program has so operated during the reporting period. HTC is also barred from making false or misleading statements about the security and privacy of consumers’ data on HTC devices.
The FTC’s action against HTC has broad application beyond the mobile device and software marketplace. The agency’s action further solidifies the FTC’s role as the leading enforcer of data security standards. Once again the FTC has demonstrated that it is setting data security standards and will continue to monitor and police the marketplace when it concludes that companies have not incorporated commonly accepted security features or have failed to take steps to prevent vulnerabilities.
Beta testing is underway for Google Glass, a new technology that provides the functionality of a smartphone in a headset worn like glasses. Much like a smartphone, the Glass headset is able to exchange messages with other mobile devices, take pictures, record videos, and access search engines to respond to user queries. But unlike a smartphone, the device’s optical head-mounted display, voice recognition, and front-facing camera give users hands-free access to its features, including the ability to capture photographs and video recordings of the world in front of them.
For now, Glass is only available to developers and a small group of test users known as Google Explorers. The device is expected to go on sale to the public in 2014. In the meantime, public speculation swells, and the blogosphere is full of conflicting reports about what the device can and cannot do. Some suggest that the device will utilize facial recognition and eye-tracking software to show icons and statistics above people whom the user recognizes. A more common concern is that the device will be able to photograph and record what the user sees and then share that data with third parties without permission from the user or those whose likenesses are being captured.
Because of this lack of clarity, lawmakers around the world are urging Google to affirmatively address the myriad privacy concerns raised by this new technology. Last month, an international group of privacy regulators – including representatives from Australia, Canada, Israel, Mexico, New Zealand, Switzerland, and a European Commission panel – signed off on a letter to Google’s CEO Larry Page asking for more information regarding the company’s plans to ensure compliance with their data protection laws.
Here in the United States, the House Bipartisan Privacy Caucus issued a similar letter of its own. In addition to a variety of questions regarding the device’s capabilities, the letters reference some of Google’s recent data privacy mishaps and ask whether Google intends to take proactive steps to ensure the protection of user and nonuser privacy.
Google’s Vice President of Public Policy and Governmental Relations (and former New York Congresswoman) Susan Molinari issued a formal response to the House Bipartisan Privacy Caucus. According to Molinari, Google “recognize[s] that new technology is going to bring up new types of questions, so [they] have been thinking carefully about how [they] design Glass from its inception.”
To address concerns about the picture and video capabilities, Molinari highlighted several features designed to “give users control” and “help other people understand what Glass users are doing.” For example, specific user commands are required to search the Internet or find directions, and the user must either press a button on the arm of the Glass or say “Take a photo” or “Record a video” in order to access those features.
While Google is already subject to commitments it made to the FTC regarding the requirement to afford advertisers non-discriminatory access to its search engine, the FTC’s latest guidance makes clear that Google and other search engines must also maintain clear disclosures to the public about sponsored content in search results.
On June 24, 2013, in a series of letters to general search engines such as Google, Yahoo, and Ask.com, as well as to specialized search engines, the FTC issued updated guidance concerning disclosures regarding paid advertisements in search results.
This latest FTC action follows on the heels of the Commission’s recent updates to the Dot Com Disclosures and the updated Endorsements and Testimonials Guides. The FTC’s letters came in response to industry and consumer organizations’ requests to the Commission to update its policies on search engine results, last released in 2002. The FTC also noted that it has observed a decline in search engines’ compliance since 2002.
The FTC’s central concern, first articulated in 2002, remains the problem that consumers may be deceived in violation of Section 5 of the FTC Act unless search engines clearly and prominently distinguish advertising from natural search results.
Consumers assume that search engines return the most relevant results first. When a result appears prominently because the advertiser has paid the search engine for that placement, the placement can deceive consumers who are unaware of the commercial relationship between the advertiser and the search engine.
The growth of mobile commerce in particular spurred the FTC to issue new guidelines. Search results on a mobile phone screen are, by their nature, small, and consumers can easily confuse paid results with natural ones if the “paid” nature of those results is not clear.
In the new guidance, the FTC states that if search engines continue to distinguish advertising results by giving a different background color or shading combined with a text label (such as “sponsored” or “ad”), the search engines should consider multiple factors to ensure that any labels and visual cues are sufficiently “noticeable and understandable” to consumers. The agency clarified that there is no “one size fits all” and that search engines may use various methods, provided the disclosures are noticeable and understandable.
Proper disclosures, according to the FTC, include the following:
• Visual Cues – Search engines must select hues of sufficient luminosity to account for varying monitor types, technology settings, and lighting conditions. The FTC notes that search engines should consider using web pages of different luminosities for mobile devices and desktop computers. Further, the FTC recommends that search engines should use:
o more prominent shading that has a clear outline;
o a prominent border that distinctly sets off advertising from the natural search results; or
o both prominent shading and a border.
• Text Labels – The FTC asserts that text labels must be used in addition to any visual cues a search engine uses to distinguish advertising. Each text label must:
o use language that explicitly and unambiguously conveys that a search result is advertising;
o be large and visible enough for consumers to notice it;
o be located near the search results (or group of search results) that it qualifies and where consumers will see it; and
o be placed immediately in front of an advertising result, or in the upper-left-hand corner of an ad block (including any grouping of paid specialized results), in adequately sized and colored font.
The new guidance also recognizes that technology will continue to evolve, citing voice assistants on mobile devices (e.g., the iPhone’s “Siri”) as an example. While technology may change, the guidance makes clear that the FTC Act’s Section 5 prohibition on deceptive practices remains, so businesses must ensure that they differentiate advertising from other information. For instance, if a voice interface is used to deliver search results (for example, in response to “find me a Mexican restaurant”), the search engine should audibly disclose any paid advertisements, at a volume and cadence adequate for ordinary listeners to hear and comprehend.
The FTC continues to monitor the online marketplace closely. Search engines and advertisers should review their practices, keeping in mind that disclosures readily apparent on a desktop may be hidden on a mobile screen. As with the “Dot Com Disclosures,” the agency is providing guidance to businesses, but its enforcement authority remains, and companies that do not clearly disclose paid advertising in search results could face an FTC investigation.