Some lawyers who deal regularly with the Federal Trade Commission in investigations of allegedly false and deceptive online advertising have noticed that the agency is beginning to take steps in these investigations that are unprecedented and draconian – and that judges seem to be going along. Below is a set of questions and answers with Jeff Ifrah, founding partner of Ifrah Law, on these new enforcement methods.
1. What is the first thing that a lawyer representing a company being probed by the FTC on false-advertising charges can expect to see?
IFRAH: Agency lawyers will go to a federal district judge with a proposed temporary restraining order (TRO) for the judge to sign on an ex parte basis (that is, without the defendant or its lawyers being present). Judges are allowed to do this as long as a hearing on a preliminary injunction, at which the defendant is represented, is set within a few days. Meanwhile, the company is essentially barred from doing business by the terms of the TRO.
2. What is the FTC’s usual next step?
IFRAH: The agency will then go before the same judge with a draft of a preliminary injunction that is pretty much identical to the temporary restraining order. These injunctions basically require the business to remain at a standstill until the case goes to trial or a settlement is reached. In addition, they require the company to disclose on all its websites that it is being investigated for false and deceptive practices and to disclose online all of its sensitive financial information and that of its owners. Very often, the defendant will not contest this injunction request by the FTC. It is remarkable how many lawyers simply capitulate, agree to these draconian orders, and set their clients up to fail.
3. What’s wrong with that? Isn’t the injunction lifted when the defendant agrees to settle the case?
IFRAH: Yes, but by that time, it may be too late, and the company may have gone out of business as a result of the restrictions that were imposed on it by the injunction and as a result of the disclosures that it had to make.
4. Are there other problems with these preliminary injunctions?
IFRAH: Yes. The FTC usually asks for a preliminary injunction with many standard features, and the judge usually grants it. But no two cases or defendants are the same. The courts are not taking into account the fact that different situations require different results. Instead, the injunctions are overbroad and reach behavior that is beyond what is alleged in the complaint.
Some of these restraining orders and injunctions restrict how much money a defendant can spend in a month or what type of online advertising it can use while the case is pending. Other injunctions require affirmative behavior, such as a requirement that the defendant report to the FTC every time it creates or operates any type of business. In either case, the defendant is forced to open its entire existence to the FTC, and everything it does is subject to scrutiny.
Another problem with standard, overbroad injunctions is that a defendant may become uncertain about what it must do to avoid being held in contempt of court for non-compliance. The language in the injunction is often so vague and undefined that the FTC has wide discretion in deciding whether to seek a contempt finding against a defendant.
5. And is that the end of the story?
IFRAH: No, unfortunately, plaintiffs’ lawyers often look to copycat an FTC action, and as a result companies may then have yet another headache to deal with, if they haven’t already been irreparably damaged by the FTC’s actions.
The increasing difficulties faced by internet providers and data gatherers in the international realm have yet again come to the fore. Privacy regulators in France, Germany, Spain, the Netherlands, the United Kingdom and Italy have banded together to investigate whether to fine Google for what they perceive to be violations of European Union privacy laws.
The background is that in March 2012, Google replaced its disparate privacy policies applicable to its various products (such as Gmail and YouTube) with a single policy that applied to all of its services.
However, as part of a report issued in October 2012, the EU’s Article 29 Data Protection Working Party then declared that Google’s unified policy did not comply with EU data protection laws. The EU’s primary, but not only, quibble with Google’s new policy involved the sharing of personal data across multiple Google services and platforms. At that time, the president of the French regulatory body, the CNIL, indicated that litigation would be initiated if Google did not implement the Working Party’s recommendations within three to four months.
As a result, Google now faces the time and costs of substantial regulatory oversight and investigation, as well as potential fines, from multiple national privacy protection watchdogs. In fairness, the EU privacy regulators have tended to be rather expansive in their interpretation of what is and is not required by law. This is unfair to Google and to other companies that comply with what they believe to be the letter and spirit of the law, only to have regulators reinterpret the law and move the goal posts. But this is typical of the EU regulatory realm.
Google’s predicament sends a stern warning to all internet providers that gather personal data. Any provider’s natural inclination is to focus on complying with the privacy rules applicable in the country where it is located. But the internet is borderless, subjecting providers to multiple laws in multiple jurisdictions. This creates the need for each provider to carefully analyze its privacy policies to ensure, as best as possible, that it complies with the rules applicable across the globe. EU regulators and others are no longer content to allow the United States to set the guidelines for privacy and other rights, creating new challenges for privacy compliance in the United States and abroad.
Earlier this month, the Federal Trade Commission released a staff report, entitled “Paper, Plastic . . . or Mobile? An FTC Workshop on Mobile Payments,” outlining key issues facing consumers and companies as they adopt mobile payment services. The report is based on a workshop held by the FTC in 2012 to examine the mobile payment industry.
Consumer use of mobile payment services continues to grow quickly. Mobile payment systems have the potential to be beneficial for both companies and consumers. However, many issues regarding fraud, privacy and security arise, and the FTC is looking to the industry to take the lead on establishing sound policies.
The FTC encourages companies that use mobile payment systems to develop clear policies for resolving disputes over unauthorized or fraudulent charges. Consumers fund their mobile purchases from a variety of sources (e.g., credit cards, bank accounts, mobile phone bills), and under current regulations each funding method has a different process for consumers to dispute an unauthorized or fraudulent charge. The FTC wants to create a clearer, more streamlined process for consumers when a charge is disputed. The FTC is planning to hold a separate roundtable on this issue in May.
The report highlights the problems associated with “cramming,” which involves placing unauthorized charges on a consumer’s phone bill. The FTC suggests that mobile carriers should perform some due diligence on companies from which they accept charges.
The report also discusses the idea of “privacy by design,” which involves strong privacy policies and transparency for consumers from inception of a company’s offerings. Consumers understand that they will need to provide some information to access a company’s services, but consumers may want to control how that information is stored and shared. The FTC and the industry realize that mobile payment systems can be an efficient, favored payment method. However, companies offering mobile payments need to be clear to consumers about how their data is being collected, maintained and used. Privacy issues are of paramount concern when using mobile payment systems because of the enormous amount of data available on smartphones.
The report also notes the potential privacy issues that can occur in the mobile payment process. Since mobile payment providers have access to both the financial information and the contact information of the payer, a breach at a provider could be serious. The report suggests that companies consider privacy throughout the development process, be transparent about their data practices, and give consumers options for how their information is collected.
The report also encourages the industry to adopt measures to ensure that the entire mobile payment process is secure since financial information could potentially be disclosed. The FTC notes that there is technology available to make the protection of payment information more secure and suggests that financial information should be encrypted at all points in the transaction.
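On the transport side, the FTC's suggestion that financial information be encrypted at all points in the transaction amounts, in practice, to insisting on properly configured TLS for every network hop. As a minimal illustrative sketch (not drawn from the report, and using only Python's standard-library `ssl` module), a mobile-payment client might refuse any connection that is not certificate-verified, hostname-checked TLS 1.2 or later:

```python
import ssl

def make_payment_transport_context() -> ssl.SSLContext:
    """Build a client-side TLS context suitable for transmitting
    payment data: certificate verification on, hostname checking on,
    and legacy protocol versions (TLS 1.0/1.1) refused."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

ctx = make_payment_transport_context()
# create_default_context enables both checks by default; asserting them
# explicitly catches any future configuration drift.
assert ctx.check_hostname and ctx.verify_mode == ssl.CERT_REQUIRED
```

Encryption of stored card and account data (the at-rest leg) would need a separate mechanism, such as an authenticated-encryption library; this sketch covers only data in transit.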
Companies should take note of the FTC’s report and adjust their practices accordingly. The FTC has put companies on notice about its expectations in mobile payments, and it would not surprise us to see enforcement actions in this area in the future. Companies should, in particular, make clear their policies for explaining charges and how charges are authorized. The more support a company has in showing that a charge was justified, the easier it will be to defend. This kind of specificity may also help dissuade authorities from bringing charges in the first place. When offering mobile payment services, opt-in screens requiring a click or a password before a charge is made, along with a secure network, are best practices that may save an organization from being on the receiving end of an enforcement action.
Google recently agreed to a settlement after a three-year investigation conducted by 38 state attorneys general stemming from allegations that it had violated individuals’ privacy rights when it collected information from unsecured wireless networks in the course of its Street View mapping project. Full text of the settlement is available here.
Google used special vehicles to create the pictures that are seen on Google Street View. Google tried to improve its location services by identifying wireless Internet signals that could provide reference points on the map. In the process, the vehicles collected network identification information, as well as data, from unsecured wireless networks.
Google has stated that the collection of any personal information from the wireless networks was unintentional and that the information was never used or looked at. The company has agreed to destroy the personal data that it collected. Google will also be required to pay a $7 million fine as part of the settlement.
As part of the settlement, Google also agreed to launch a new internal privacy education program. The settlement requires Google to hold an annual privacy week event for its employees and to make privacy education available to select employees. Additionally, it must provide refresher training for its lawyers who oversee new products.
The settlement also requires Google to educate the public on privacy. Google will be required to create a video for YouTube explaining how people can easily encrypt data on their wireless networks and run an ad online every day for two years promoting it. It must also run educational ads in the biggest newspapers in the 38 participating states. Google will have to submit data to the state attorneys general to show that it is in compliance with the requirements of the settlement.
The Connecticut Attorney General’s office led the eight-state executive committee whose investigation of the data collection resulted in this settlement. Connecticut Attorney General George Jepsen said in a statement, “Consumers have a reasonable expectation of privacy. This agreement recognizes those rights and ensures that Google will not use similar tactics in the future to collect personal information without permission from unsuspecting consumers.”
This is another example of states taking a more aggressive approach to protecting consumer privacy rights when the federal government does not. The Federal Trade Commission investigated this activity by Google but closed its case without a fine. The Federal Communications Commission also investigated, and issued a $25,000 fine, but that fine was largely for Google allegedly hindering the investigation. Companies that do business on the Internet should be aware that states will continue to enforce privacy laws. Companies must make sure that they do not unintentionally collect unnecessary sensitive information in the course of their business activities.
This week, the FTC released updated guidance to its 2000 “Dot Com Disclosures,” a guide covering disclosures in online advertising. The online world has certainly changed in 13 years, and the new guidelines, available here, cover advances in online advertising, including mobile advertising.
One central theme still prevails: existing consumer protection laws and rules apply no matter where you offer products and services: newspapers, magazines, TV and radio commercials, websites, direct marketing, and mobile marketing. Thus, the basic principle applies that companies must ensure that their advertisements are truthful and accurate, including providing disclosures necessary to ensure that an advertisement is not misleading. Further, the disclosures should be clear and conspicuous – irrespective of the medium of the message.
In determining whether a disclosure is “clear and conspicuous” as the FTC requires, advertisers should consider the disclosure’s placement in the ad. Importantly, the 2000 guidelines defined proximity of disclosures to ads as “near, and when possible, on the same screen.” The new guidelines state that disclosures should be “as close as possible” to the relevant claim. The closer the disclosure is to the claim, the better it is for FTC compliance purposes.
Advertisers should also consider: the prominence of the disclosure; whether it is unavoidable (e.g., consumers must scroll past the disclosure before they can make a purchase); whether other parts of the ad distract attention from the disclosure; whether the disclosure should be repeated at different places on the website; whether audio message disclosures are of sufficient volume and cadence (e.g., too fast); whether visual disclosures appear long enough; and, whether the language of the disclosure is appropriate for the intended audience. The FTC suggests avoiding “legalese” or technical jargon.
Mobile marketers should take note that the FTC provided some additional guidance regarding disclosure issues particular to mobile marketing. In particular, the FTC stated that the various devices and platforms upon which an advertisement appears or a claim is made should be considered. For example, if the advertiser cannot make necessary disclosures because of the limit of the space (e.g., in a mobile app), then the claim should not be made on the platform.
The FTC does permit hyperlinks for disclosures in certain circumstances. However, hyperlinks must:
– be obvious
– be labeled appropriately to convey the importance, nature and relevance of the information they lead to (such as “Service plan required. Get service plan prices here”)
– be used consistently
– be placed as close as possible to the relevant information the hyperlink qualifies and made noticeable
– take consumers directly to the disclosure after clicking
Companies should assess the effectiveness of the hyperlink by monitoring click-through rates and make changes accordingly. The agency also suggests that advertisers design ads so that scrolling is not necessary to find a disclosure. The FTC discourages hyperlinks for disclosures involving product costs or certain health and safety issues (similar to its 2000 guidelines).
Probably the most helpful part of the new guidelines is the set of 22 examples of proper and improper disclosures that the FTC provides at the end. As companies move forward in promoting products and services online, particularly on mobile platforms, reviewing these examples along with the general principles of truthful and complete statements in advertising may save a company from an FTC enforcement action.
Organizations are increasingly marketing their products and services on mobile platforms. Advertisers should take note that special considerations apply in the mobile marketplace, especially the space and text size limitations. If a disclosure is necessary to prevent an advertisement from being deceptive, unfair, or otherwise violative of an FTC rule, it must be clear and placed next to the offer. If that can’t be done, the safest course would be to move the offer to another platform, such as a traditional website. The FTC and the states have demonstrated that they take a keen interest in mobile marketing and they will be watching claims and disclosures in the smartphone/tablet universe.
The Federal Trade Commission recently announced that it has approved a final order settling charges against Compete, Inc., a Boston-based web analytics company. Compete, Inc. sells reports on consumer browsing behavior to clients looking to drive more traffic to their websites and increase sales. Compete, Inc. obtained the information by getting consumers to install the company’s web-tracking software on their computers. The FTC alleged that the company’s business practices were unfair and deceptive because the company did not sufficiently describe the types of information it was collecting from its users.
With all the heightened concerns among consumers about internet privacy, one might wonder why consumers would be willing to install web-tracking software on their computers in the first place. Well, Compete, Inc. sweetened the pot by offering gift cards, cash rewards, and other incentives to entice consumers.
The fact that Compete, Inc. was using web-tracking software to track consumers’ visits to websites was not the problem for the FTC. The major issue was that the software was recording far more than just which websites a consumer was visiting. It was recording everything the user entered on the websites – usernames, passwords, detailed credit card information, Social Security numbers, etc. – all without the consumer’s knowledge or consent.
Reports indicate that the company may not have known that its software was collecting all of this user information. Compete, Inc. representatives stated that in January 2010, when they first learned that there was a potential security issue, they immediately disabled data collection from affected versions of the software and deleted inadvertently-collected information from their servers. The company also responded by implementing new data filters and security measures. The company took these steps even before the order was handed down and said that it would continue to develop and uphold new standards of transparency and security.
Perhaps the company’s commitment to correcting its behavior is part of the reason that the FTC settlement order didn’t include a monetary sanction. Instead, the order focuses on ensuring that such intrusive data is not collected in the future. Pursuant to the order, Compete, Inc. must implement a comprehensive information security program with biannual audits from an independent third party for the next 20 years (a fairly typical obligation in recent FTC settlements of this type); disclose the types of information that will be collected and obtain consumers’ express consent through its website before collecting any data with its web-tracking software; delete or anonymize the consumer data it has already collected; and provide consumers with directions on how to uninstall the web-tracking software. The settlement also bars the company from misrepresenting its privacy and data security practices.
In the age of affiliate marketing, web analytics are extremely valuable for merchants seeking to increase web traffic to drive revenue. However, FTC investigations and resulting sanctions are costly, time-consuming, and quite simply bad for business. Companies interested in using this technology should make sure they know exactly what information they are collecting and should ensure that they are following FTC guidelines regarding data privacy. Clear disclosures to the public as to what software is being installed, what information is viewed or collected, and how that information is used, are all critical. Taking steps to get it right in the beginning will help them avoid costly investigations and bad press in the end.
According to a recent NBC News report, Equifax, one of the three largest American credit reporting agencies, has assembled an enormous database containing employment and salary information for more than 190 million U.S. adults. Very few people knew of the existence of the database, but the information in it allegedly is being sold to third parties without consumers’ consent.
According to the report, an Equifax-owned company, The Work Number, obtains substantial information through the assistance of human resources departments and other sources around the country, including government agencies and Fortune 500 companies. The Work Number then sells this information. According to The Work Number’s website, payroll information comes from over 2,000 employers. Reports have stated that the database is so detailed that for many individuals it includes weekly pay information, as well as other sensitive information such as the identity of the individual’s health care provider and whether the individual has ever filed a claim for unemployment benefits.
Seven members of Congress recently wrote a letter to Equifax asking for more information on the legality of The Work Number. “What is most concerning to us is that this massive database appears to generate revenue using consumers’ sensitive personal information for profit,” the letter states.
Companies say they sign up for The Work Number because it gives them a simple way to outsource employment verification for former employees. Companies provide their human resources information to The Work Number, which automates the process. Companies no longer need to spend time verifying a former employee’s work history.
In 2009, according to the NBCNews.com report, Equifax said that the data The Work Number had amassed covered 30 percent of the working U.S. population; the database is now adding 12 million records annually.
It is not entirely clear what Equifax is doing with the data, where it is selling it, and what can be sold without consent. In a statement after NBCNews.com broke the story, Equifax said, “The Work Number does not provide debt collectors with salary/pay rate/income information. They can request only employment verification data which The Work Number will provide if there is permissible purpose as detailed by the Fair Credit Reporting Act.” Equifax also denied reports that the salary information is sold to debt collectors.
Equifax did confirm that “pay rate” information is shared with third parties including mortgage, automobile, and other financial services companies — as authorized under the Fair Credit Reporting Act.
Since the data is considered a credit report, consumers are entitled to one free report every year, which shows the data contained in the reports and what entities have requested the data.
Companies that collect and share data will continue to face scrutiny from state and federal government agencies that have shown a consistent effort focused on protecting consumers’ privacy rights. Consumer protection laws continue to evolve, providing individuals with specific rights and restricting what information companies may share. All companies that deal with consumer information need to take a proactive approach to make sure that they are in compliance with all governing laws. The FTC, in particular, has shown a willingness to use laws such as the Fair Credit Reporting Act to take enforcement action against companies offering employment and credit data.
Once again, the FTC has completed a major enforcement action against the illegal use of robocalls, a form of prerecorded, computerized telemarketing calls. This time, the action resulted in a $1.1 million civil penalty against Roy M. Cox, an individual whom the FTC considered to be the architect of an illegal robocall operation. The FTC alleged that Cox and several companies he controlled were using robocalls to market credit card interest-rate reduction programs, extended automobile warranties, and home security systems. Due to Cox’s inability to pay, the dollar penalty has been waived and Cox has been permanently banned from participating in any telemarketing activities.
According to the December 2011 complaint, Cox and his co-defendants were not only making prerecorded sales calls to consumers without their consent, in violation of the Telemarketing Sales Rule, but they were also illegally disguising their identity on customers’ caller ID displays. Instead of displaying the companies’ actual name and contact information, generic names such as “CARD SERVICES,” “CREDIT SERVICES,” or “PRIVATE OFFICE” would appear on a recipient’s caller ID. This tactic, known as “caller ID spoofing,” is also prohibited by law.
As we reported in October, the FTC has been struggling to keep pace with these technological advancements, so it called on the public to come up with a solution. The commission offered a $50,000 prize to whoever could design a program to screen out illegal robocalls. The challenge was open to the public for three months and garnered nearly 800 submissions. The agency expects to announce a winner in early April.
The case against Cox and many of the FTC’s previous enforcement actions indicate that the FTC may be most concerned with robocalls that use patently deceptive advertising to lure in vulnerable, unsuspecting customers. Companies offering fraudulent credit card services, auto-warranty protection, and medical plans have made themselves an easy mark for the FTC, because of the likelihood that they will be reported by recipients or advocacy groups. However, companies interested in using computerized telemarketing must remember that even innocuous content can violate the Telemarketing Sales Rule (and the Telephone Consumer Protection Act) if recipients have not given prior written consent to receive such calls. Also, any company engaging in telemarketing should be subscribing to the federal “do not call” list and scrubbing its calling lists against the federal list. Some states still maintain their own lists as well. In addition to FTC or FCC enforcement, illegal robocalling can result in costly civil litigation, including class actions.
Any company that collects personal information about individuals, such as credit card numbers and social security numbers, must be very careful about the way in which it stores and secures that information. Even a blood bank that stores umbilical cord blood needs to keep these privacy rules in clear view. That is one of the messages of a recent Federal Trade Commission action.
California-based Cbr Systems is one of the leaders in the growing field of umbilical cord storage. Umbilical cords are rich in stem cells, and new parents are paying to have the cord or cord blood stored away for the child’s possible medical use later in life. Cbr acquires and stores the cords for an annual fee.
Cbr also stores a vast amount of information related to these tissues, including names, dates and times of birth, Social Security numbers, credit card numbers, checking account numbers, addresses, and driver’s license numbers. In December 2010, a Cbr employee removed four backup tapes containing this sensitive information in order to transport them to a different office. Soon after, a thief stole the tapes and other company devices from the employee’s car. In all, personal information of nearly 300,000 Cbr customers was compromised. The tapes and other devices were not encrypted.
Under the terms of the settlement, Cbr must establish an information security system, submit to security audits every other year for the next 20 years, and ensure that it does not misrepresent its privacy and security practices. A violation of the final order could result in Cbr paying up to $16,000 per violation.
In addition to the FTC action, Cbr clients filed a class action against the company alleging that the company failed to adequately protect the information and belatedly notified customers of the privacy breach. On February 5, 2013, a federal judge in Johansson-Dohrmann v. CBR Systems Inc., in the U.S. District Court for the Southern District of California, No. 12-1115, granted preliminary approval of a proposed settlement in which CBR must provide credit monitoring and identity theft insurance to each affected class member, as well as make cash reimbursements for any losses resulting from identity theft. The settlement also provides up to $600,000 in payments to the plaintiffs’ lawyers.
Data privacy breaches are a serious concern for any company. They can result in serious reputational harm, as well as financial loss through costly legal actions initiated by the FTC, states, or class actions. The cost of developing and implementing an effective data privacy protocol is a worthwhile investment to guard against these losses. Companies should refer to the FTC’s guides and manuals for protecting consumers’ personal information. Implementing these procedures will serve to protect both consumers and the company itself.
Maryland Attorney General Douglas Gansler (D) has announced that his office is launching a new Internet Privacy Unit designed to address issues related to online privacy and to ensure that companies are in compliance with state and federal consumer protection laws. The unit will also handle issues related to cyberbullying and cybersecurity.
Gansler, who also serves as the president of the National Association of Attorneys General (NAAG), has previously stated that online privacy was a priority. Gansler said in a statement that Internet privacy is “one of the most essential consumer protection issues of the 21st century.”
The Internet Privacy Unit will also work with major industry stakeholders and privacy advocates to provide outreach and education to businesses and consumers. The unit may also pursue enforcement actions “where appropriate” to ensure that consumers’ privacy is protected.
One area of online privacy that the unit will examine is whether companies are complying with the Children’s Online Privacy Protection Act (COPPA), a federal law that restricts site operators from knowingly collecting personal data from children younger than 13. The Federal Trade Commission (FTC) announced in December that it adopted new rules governing COPPA that will go into effect in July 2013, the first significant revisions since the original rules went into effect in 2000. The new rules significantly expand the types of companies that are required to obtain parental permission before knowingly collecting personal details from children, as well as the types of information that require parental consent to collect.
The unit will also “examine weaknesses” in online privacy policies. Not only will companies be required to have privacy policies in place, but these policies need to be thorough and comprehensive to ensure compliance with all relevant privacy laws. And, of course, companies need to be following in practice what they “preach” in their privacy policies.
The FTC and state attorney general offices will doubtless continue to be aggressive in their enforcement of privacy laws. Companies with an online presence should review their privacy policies and practices, particularly as affected by recent rule changes such as the COPPA revisions. Also, Maryland is signaling that it will be an active player in monitoring and enforcement of personal privacy and cybersecurity. While federal legislation continues to stall, the states are most definitely moving ahead.