On April 3, 2013, the Federal Trade Commission issued a press release that marks yet another step in its continuing trend of actions involving data brokers and data providers. As we have noted in earlier blog posts, the agency is making a concerted effort on a number of fronts to enforce the laws that protect consumer data and privacy.
The FTC’s current action involves a letter that it sent to a number of data brokerage companies that provide tenants’ rental histories to landlords. The letter is simply a notification to the companies that they may be considered credit reporting agencies under the Fair Credit Reporting Act (FCRA) and that they thus may be required to ensure that their websites and practices comply with that law.
The FTC letter also listed some of the obligations of credit reporting agencies to take reasonable steps to ensure the fairness, accuracy, and confidentiality of their reports, such as (1) ensuring that landlords are actually using the report for tenant screening purposes and not as a pretext, (2) ensuring the maximum possible accuracy of the information in the tenant reports, (3) if the company is a nationwide provider, providing consumers with a free copy of their report annually, and (4) ensuring that landlords meet their notification obligations (e.g., letting consumers know about a denial based on a tenant report, the right to dispute information in the report, and the right to obtain a free copy of the report).
The FTC letter specifically noted that the agency has not evaluated whether the company receiving the letter is in compliance with the FCRA but that “we encourage you to review your websites and your policies and procedures for compliance.”
We have discussed FTC actions against data brokers before. In March, we discussed the FTC’s announcement of a settlement with Compete, Inc., a web analytics company. Compete sells reports on consumer browsing behavior to clients looking to drive more traffic to their websites and increase sales. Compete obtained the information by getting consumers to install the company’s web-tracking software on their computers. The FTC alleged that the company’s business practices were unfair and deceptive because the company did not sufficiently describe the types of information it was collecting from its users.
We are confident that the companies that received the letter regarding tenant information are reviewing their websites and policies, as encouraged by the FTC. However, what really intrigues us is the motivation behind the FTC sending the letters to the companies.
Of course, part of that motivation is to help ensure that the companies follow rules for privacy protection. It is also worth noting, however, that the FCRA carries a significant consequence: individuals may seek punitive damages for willful violations of the statute. Thus, the letter arguably puts the companies on notice to become compliant immediately, since future violations may be deemed willful breaches that warrant punitive damages.
The increasing difficulties faced by internet providers and data gatherers in the international realm have yet again come to the fore. Privacy regulators in France, Germany, Spain, the Netherlands, the United Kingdom and Italy have banded together to investigate whether to fine Google for what they perceive to be violations of European Union privacy laws.
The background is that in March 2012, Google replaced its disparate privacy policies applicable to its various products (such as Gmail and YouTube) with a single policy that applied to all of its services.
However, as part of a report issued in October 2012, the EU’s Article 29 Data Protection Working Party then declared that Google’s unified policy did not comply with EU data protection laws. The EU’s primary, but not only, quibble with Google’s new policy involved the sharing of personal data across multiple Google services and platforms. At that time, the president of the French regulatory body, the CNIL, indicated that litigation would be initiated if Google did not implement the Working Party’s recommendations within three to four months.
As a result, Google now faces the time and costs of substantial regulatory oversight and investigation, as well as potential fines, from multiple national privacy protection watchdogs. In fairness, the EU privacy regulators have tended to be rather expansive in their interpretation of what is and is not required by law. This is unfair to Google and to other companies that comply with what they believe to be the letter and spirit of the law, only to have regulators reinterpret the law and move the goal posts. But this is typical in the EU regulatory realm.
Google’s predicament sends a stern warning to all internet providers that gather personal data. Any provider’s natural inclination is to focus on complying with the privacy rules applicable in the country where the provider is located. But the internet is borderless, subjecting providers to multiple laws in multiple jurisdictions. This creates the need for each provider to carefully analyze its privacy policies to ensure, as best as possible, that it complies with the rules applicable across the globe. EU regulators and others are no longer content to allow the United States to set the guidelines for privacy and other rights, creating new challenges for privacy compliance in the United States and abroad.
Earlier this month, the Federal Trade Commission released a staff report outlining key issues facing consumers and companies as they adopt mobile payment services, entitled “Paper, Plastic . . . or Mobile? An FTC Workshop on Mobile Payments.” The report is based on a workshop held by the FTC in 2012 to examine the mobile payment industry.
Consumer use of mobile payment services continues to grow quickly. Mobile payment systems have the potential to be beneficial for both companies and consumers. However, many issues regarding fraud, privacy and security arise, and the FTC is looking to the industry to take the lead on establishing sound policies.
The FTC encourages companies that use mobile payment systems to develop clear policies on the resolution of disputes regarding unauthorized or fraudulent charges. Consumers fund their mobile purchases from a variety of sources (e.g., credit cards, bank accounts, mobile phone bills), and under current regulations each funding method has a different process for consumers to dispute an unauthorized or fraudulent charge. The FTC wants a clearer, more streamlined process for consumers when a disputed charge arises. The FTC is planning to hold a separate roundtable on this issue in May.
The report highlights the problems associated with “cramming,” which involves placing unauthorized charges on a consumer’s phone bill. The FTC suggests that mobile carriers should perform some due diligence on companies from which they accept charges.
The report also discusses the idea of “privacy by design,” which involves strong privacy policies and transparency for consumers from inception of a company’s offerings. Consumers understand that they will need to provide some information to access a company’s services, but consumers may want to control how that information is stored and shared. The FTC and the industry realize that mobile payment systems can be an efficient, favored payment method. However, companies offering mobile payments need to be clear to consumers about how their data is being collected, maintained and used. Privacy issues are of paramount concern when using mobile payment systems because of the enormous amount of data available on smartphones.
The report also notes the potential privacy issues that can arise in the mobile payment process. Because mobile payment providers have access to both the financial information and the contact information of the payer, a breach could be especially serious. The report suggests that companies consider privacy throughout the development process, be transparent about their data practices, and give consumers options for how their information is collected.
The report also encourages the industry to adopt measures to ensure that the entire mobile payment process is secure since financial information could potentially be disclosed. The FTC notes that there is technology available to make the protection of payment information more secure and suggests that financial information should be encrypted at all points in the transaction.
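As a rough illustration of the encrypt-at-all-points principle the FTC describes, the sketch below encrypts a payment field before it is stored or transmitted. It is a minimal example, not a payment-industry implementation: it assumes the third-party `cryptography` package, and the card number and key handling shown are purely hypothetical (real systems would use a managed key service and standards such as PCI DSS).

```python
# Minimal sketch: symmetric encryption of a sensitive payment field
# using Fernet (AES-based authenticated encryption) from the
# "cryptography" package. Assumptions: the package is installed, and
# key management is out of scope here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, held in a key-management service
cipher = Fernet(key)

card_number = b"4111 1111 1111 1111"   # hypothetical test value
token = cipher.encrypt(card_number)    # ciphertext safe to store or transmit

# Only a holder of the key can recover the plaintext.
assert cipher.decrypt(token) == card_number
```

The point of the sketch is simply that the plaintext never needs to exist outside the endpoints that hold the key, which is the property the report asks for at every step of the transaction.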
Companies should take note of the FTC’s report and adjust their practices. The FTC has put companies on notice about its expectations in mobile payments, and it would not surprise us to see enforcement actions in this area in the future. Companies should, in particular, make clear their policies for explaining charges and how charges are authorized. The more support a company has in showing that a charge is justified, the easier it will be to defend. This kind of specificity may also help dissuade authorities from bringing charges in the first place. When offering mobile payment services, opt-in screens requiring a click or a password to authorize a charge, along with a secure network, are best practices that may save an organization from being on the receiving end of an enforcement action.
Google recently agreed to a settlement after a three-year investigation conducted by 38 state attorneys general stemming from allegations that it had violated individuals’ privacy rights by collecting information from unsecured wireless networks while engaged in its Street View mapping project. Full text of the settlement is available here.
Google used special vehicles to create the pictures that are seen on Google Street View. Google tried to improve its location services by identifying wireless Internet signals that could provide reference points on the map. In the process, the vehicles collected network identification information, as well as data, from unsecured wireless networks.
Google has stated that the collection of any personal information from the wireless networks was unintentional and that the information was never used or looked at. The company has agreed to destroy the personal data that it collected. Google will also be required to pay a $7 million fine as part of the settlement.
As part of the settlement, Google also agreed to launch a new internal privacy education program. The settlement requires Google to hold an annual privacy week event for its employees and to make privacy education available for select employees. Additionally, it must provide refresher training for the lawyers who oversee new products.
The settlement also requires Google to educate the public on privacy. Google will be required to create a video for YouTube explaining how people can easily encrypt data on their wireless networks and run an ad online every day for two years promoting it. It must also run educational ads in the biggest newspapers in the 38 participating states. Google will have to submit data to the state attorneys general to show that it is in compliance with the requirements of the settlement.
The Connecticut Attorney General’s office led an eight-state committee whose investigation of the data collection led to this settlement. Connecticut Attorney General George Jepsen said in a statement, “Consumers have a reasonable expectation of privacy. This agreement recognizes those rights and ensures that Google will not use similar tactics in the future to collect personal information without permission from unsuspecting consumers.”
This is another example of states taking a more aggressive approach to protecting consumer privacy rights when the federal government does not. The Federal Trade Commission investigated this activity by Google but closed its case without a fine. The Federal Communications Commission also investigated, and issued a $25,000 fine, but that fine was largely for Google allegedly hindering the investigation. Companies that do business on the Internet should be aware that states will continue to enforce privacy laws. Companies must make sure that they do not unintentionally collect unnecessary sensitive information in the course of their business activities.
The Federal Trade Commission recently announced that it has approved a final order settling charges against Compete, Inc., a Boston-based web analytics company. Compete, Inc. sells reports on consumer browsing behavior to clients looking to drive more traffic to their websites and increase sales. Compete, Inc. obtained the information by getting consumers to install the company’s web-tracking software on their computers. The FTC alleged that the company’s business practices were unfair and deceptive because the company did not sufficiently describe the types of information it was collecting from its users.
With all the heightened concerns among consumers about internet privacy, one might wonder why consumers would be willing to install web-tracking software on their computers in the first place. Well, Compete, Inc. sweetened the pot by offering gift cards, cash rewards, and other incentives to entice consumers.
The fact that Compete, Inc. was using web-tracking software to track consumers’ visits to websites was not the problem for the FTC. The major issue was that the software was recording far more than just which websites a consumer was visiting. It was recording everything the user entered on the websites – usernames, passwords, detailed credit card information, Social Security numbers, etc. – all without the consumer’s knowledge or consent.
Reports indicate that the company may not have known that its software was collecting all of this user information. Compete, Inc. representatives stated that in January 2010, when they first learned that there was a potential security issue, they immediately disabled data collection from affected versions of the software and deleted inadvertently-collected information from their servers. The company also responded by implementing new data filters and security measures. The company took these steps even before the order was handed down and said that it would continue to develop and uphold new standards of transparency and security.
Perhaps the company’s commitment to correcting its behavior is part of the reason that the FTC settlement order didn’t include a monetary sanction. Instead, the order focuses on ensuring that such intrusive data is not collected in the future. Pursuant to the order, Compete, Inc. must implement a comprehensive information security program with biannual audits from an independent third party for the next 20 years (a fairly typical obligation in recent FTC settlements of this type); disclose the types of information that will be collected and obtain consumers’ express consent through their website before collecting any data from its web-tracking software; delete or anonymize the use of the consumer data it has already collected; and provide consumers with directions on how to uninstall the web-tracking software. The settlement also bars the company from misrepresenting its privacy and data security practices.
In the age of affiliate marketing, web analytics are extremely valuable for merchants seeking to increase web traffic to drive revenue. However, FTC investigations and resulting sanctions are costly, time-consuming, and quite simply bad for business. Companies interested in using this technology should make sure they know exactly what information they are collecting and should ensure that they are following FTC guidelines regarding data privacy. Clear disclosures to the public as to what software is being installed, what information is viewed or collected, and how that information is used, are all critical. Taking steps to get it right in the beginning will help them avoid costly investigations and bad press in the end.
Any company that collects personal information about individuals, such as credit card numbers and Social Security numbers, must be very careful about the way in which it stores and secures that information. Even a blood bank that stores umbilical cord blood needs to keep these privacy rules in clear view. That is one of the messages of a recent Federal Trade Commission action.
California-based Cbr Systems is one of the leaders in the growing field of umbilical cord storage. Umbilical cords are rich in stem cells, and new parents are paying to have the cord or cord blood stored away for the child’s possible medical use later in life. Cbr acquires and stores the cords for an annual fee.
Cbr also stores a vast amount of information related to these tissues, including names, dates and times of birth, Social Security numbers, credit card numbers, checking account numbers, addresses, and driver’s license numbers. In December 2010, a Cbr employee removed four backup tapes containing this sensitive information in order to transport them to a different office. Soon after, a thief stole the tapes and other company devices from the employee’s car. In all, personal information of nearly 300,000 Cbr customers was compromised. The tapes and other devices were not encrypted.
Under the terms of its settlement with the FTC, Cbr must establish a comprehensive information security program, submit to security audits every other year for the next 20 years, and ensure that it does not misrepresent its privacy and security practices. A violation of the final order could result in Cbr paying up to $16,000 per violation.
In addition to the FTC action, Cbr clients filed a class action against the company alleging that the company failed to adequately protect the information, and belatedly notified customers of the privacy breach. On February 5, 2013, a federal judge in Johansson-Dohrmann v. CBR Systems Inc., in the U.S. District Court for the Southern District of California, No. 12-1115, granted preliminary approval of a proposed settlement in which CBR must provide credit monitoring and identity theft insurance to each affected class member, as well as make cash reimbursements for any losses resulting from identity theft. The settlement also provides up to $600,000 in payments to the plaintiffs’ lawyers.
Data privacy breaches are a serious concern for any company. They can result in serious reputational harm, as well as financial loss through costly legal actions initiated by the FTC, states, or class actions. The cost of developing and implementing an effective data privacy protocol is a worthwhile investment to guard against these losses. Companies should refer to the FTC’s guides and manuals for protecting consumers’ personal information. Implementing these procedures will serve to protect both consumers and the company itself.
Maryland Attorney General Douglas Gansler (D) has announced that his office is launching a new Internet Privacy Unit designed to address issues related to online privacy and to ensure that companies are in compliance with state and federal consumer protection laws. The unit will also handle issues related to cyberbullying and cybersecurity.
Gansler, who also serves as the president of the National Association of Attorneys General (NAAG), has previously stated that online privacy was a priority. Gansler said in a statement that Internet privacy is “one of the most essential consumer protection issues of the 21st century.”
The Internet Privacy Unit will also work with major industry stakeholders and privacy advocates to provide outreach and education to businesses and consumers. The unit may also pursue enforcement actions “where appropriate” to ensure that consumers’ privacy is protected.
One area of online privacy that the unit will examine is whether companies are complying with the Children’s Online Privacy Protection Act (COPPA), a federal law that restricts site operators from knowingly collecting personal data from children younger than 13. The Federal Trade Commission (FTC) announced in December that it adopted new rules governing COPPA that will go into effect in July 2013, the first significant revisions since the original rules went into effect in 2000. The new rules significantly expand the types of companies that are required to obtain parental permission before knowingly collecting personal details from children, as well as the types of information that require parental consent to collect.
The unit will also “examine weaknesses” in online privacy policies. Not only will companies be required to have privacy policies in place, but these policies need to be thorough and comprehensive to ensure compliance with all relevant privacy laws. And, of course, companies need to be following in practice what they “preach” in their privacy policies.
The FTC and state attorney general offices will doubtless continue to be aggressive in their enforcement of privacy laws. Companies with an online presence should review their privacy policies and practices, particularly as affected by recent rule changes such as the COPPA revisions. Also, Maryland is signaling that it will be an active player in monitoring and enforcement of personal privacy and cybersecurity. While federal legislation continues to stall, the states are most definitely moving ahead.
Angered by the recent tragic suicide of Internet activist Aaron Swartz, a group of hackers claiming to be from the group Anonymous made threats over the weekend to release sensitive information about the United States Department of Justice. The group claimed to have a file on multiple servers ready to be released immediately.
Swartz’s suicide has served to mobilize the group Anonymous, a loosely defined collective of Internet “hacktivists” that oppose attempts to limit Internet freedoms. Anonymous is a staunch advocate of open access to information, as was Swartz. Anonymous said that Swartz “was killed” because he “faced an impossible choice.”
Swartz was facing federal computer fraud charges that carried a maximum sentence of 35 years in prison, although in reality he probably would not have received a sentence anywhere near the statutory maximum. Prosecutors told Swartz’s legal team they would recommend to the judge a sentence of six months in a low-security setting.
The charges arose from allegations that he made freely available an enormous archive of research articles and similar documents offered by JSTOR, an online academic database, through the computers at the Massachusetts Institute of Technology.
Swartz was a leading activist involved in the movement to make information more freely available on the Internet and is credited with helping to lead the protests that ultimately defeated the Stop Online Piracy Act (SOPA), a statute that would have significantly broadened law enforcement powers in policing Internet content that may violate U.S. copyright laws.
Earlier this month, Rep. Zoe Lofgren (D-Calif.) indicated that she is drafting a bill that she terms “Aaron’s Law,” which would limit the scope of the Computer Fraud and Abuse Act, a 1986 law that prosecutors used to help bring these charges against Swartz.
The hackers reportedly hijacked the website of the United States Sentencing Commission, the federal agency responsible for the federal sentencing guidelines for criminal offenses. They said that the Sentencing Commission’s website was chosen because of its influence in creating sentences that they deemed unfair. The hackers posted a message demanding reform of the criminal justice system and threatening that sensitive information would otherwise be leaked. Anonymous also posted an editable version of the website, inviting users to modify it as they pleased.
Today is Data Privacy Day. These recent incidents serve to show that no organization – not even the U.S. Department of Justice – is immune from security breaches. Data breaches and data losses will occur and it is crucial for an organization to be prepared and have policies in place to allow a quick response when something does happen.
The legal ramifications and bad publicity that follow such an incident can be very damaging to an organization. However, by making sure that you are prepared, you can minimize your damages. Preparedness involves consultation across a range of specialties, including information technology, legal advice, and public relations. The impact that a data breach or loss can have on the bottom line of any organization is enormous and preparation is the best method to combat it.
A data breach or data loss can also have far-reaching legal consequences under international, federal and various state laws. For example, companies may not realize that if they have even a few employees or customers in a state, it may trigger a number of different requirements under state privacy laws. In order to avoid problems with federal agencies or state attorney general offices, it is best for companies to have a plan in place in advance and make sure they are already compliant with all relevant laws.
As we cautioned in a September post, the FTC is stepping up enforcement actions against mobile app developers for failure to comply with consumer protection principles. This month, the FTC took another major step in that direction with a groundbreaking settlement applying the Fair Credit Reporting Act (FCRA) to app developers Filquarian Publishing, LLC, Choice Level, LLC, and Joshua Linsk.
The FCRA is a consumer protection statute designed to regulate the collection, dissemination, and use by companies of consumer information. Filquarian markets mobile apps that run background checks using criminal records obtained from Choice Level, and Linsk is the owner and sole officer of both companies.
Although this was the first time the FTC has applied the FCRA to a mobile app developer, the prospect has been on the horizon for quite some time. Last February, the Commission issued a press release announcing that it had sent official warning letters to marketers of six mobile background screening apps. The warnings were explicit: “If you have reason to believe that your background reports are being used for employment or other FCRA purposes,” both you and those customers must comply with the FCRA. Additionally, the FTC posted a “Word of Warning” on its Business Center Blog, informing the public about the warning letters and cautioning app marketers that “disclaimers or not, the FCRA would still apply.”
According to the FTC, Linsk and his companies failed to heed these conspicuous warnings. As detailed in the FTC complaint, since at least 2010, Filquarian had been specifically targeting employers with ads like this one: “Are you hiring somebody and wanting to quickly find out if they have a record? Then Texas Criminal Record Search is the perfect application for you.” Instead of attempting to comply with the FCRA, the FTC’s complaint said, Filquarian and Choice Level posted a disclaimer stating that the companies were not compliant with the FCRA, that their reports were not to be considered screening products for the various FCRA-proscribed purposes, and that the users of their reports assume sole responsibility for FCRA compliance.
The complaint against them cited numerous FCRA violations: (i) regularly furnishing reports to individuals who did not have a permissible purpose to use them, (ii) failing to maintain any procedures for assuring maximum possible accuracy of information provided in the reports, and (iii) failing to provide required notices to users of the consumer reports. The agency concluded that the disclaimers were not enough to absolve the company of FCRA liability, especially when the disclaimer directly contradicts express representations in the company’s advertisements.
Again, we urge all mobile app developers to be aware of the following principles to reduce the likelihood of an FTC enforcement action: (i) an app is no different from an Internet website, which is no different from a print ad; (ii) you’d be smart to pay attention to the FTC’s warnings to other companies and its enforcement actions; and (iii) disclaimers are important, but often they simply aren’t enough to avoid liability. The FTC has also definitively shown that it will use its broad statutory authority to apply existing laws and regulations – including the 1970s-era FCRA – to mobile apps and other online offerings.
The Federal Trade Commission announced on December 19, 2012, that it has adopted final amendments to the Children’s Online Privacy Protection Act (COPPA) Rule that strengthen children’s privacy protections online and give parents greater control over their children’s personal information. FTC officials said that they updated the rules to keep pace with the increasing use of mobile phones and tablets by children.
The original rules have not seen significant changes since they went into effect in 2000. The FTC has been examining possible changes to the COPPA rules since March 2010 and has received hundreds of comments from interested parties through multiple comment periods.
“Congress enacted COPPA in the desktop era and we live in an era of smartphones and mobile marketing,” FTC Chairman Jon Leibowitz said. “This is a landmark update of a seminal piece of legislation.”
The new rules go into effect on July 1, 2013. They were approved by a 3-1 vote, with one commissioner abstaining. Commissioner Maureen Ohlhausen voted no, on the ground that a core provision of the new rules exceeds the scope of the authority granted by Congress in COPPA: extending the statutory definition of “operator” to impose obligations on certain websites or online services that do not themselves collect personal information from children or have access to or control of such information collected by a third party.
The new rules significantly expand the types of companies that are required to obtain parental permission before knowingly collecting personal details from children, as well as the types of information that require parental consent to collect.
Under the new amendments, the FTC said companies must seek permission from parents to collect a child’s photographs, videos, audio files, and geo-location information.
The new rules also expand the definition of personal information to include persistent IDs, such as a unique serial number on a mobile phone or the IP address of a browser, if they are used to show a child behavior-based ads. It requires third parties such as advertising networks and social media networks that know they are operating on children’s sites to notify and obtain consent from parents before collecting personal information. Additionally, the rule makes children’s sites responsible for notifying parents about data collection by third parties that are integrated into their services.
The FTC said the new amendments will now require child-directed apps and websites that integrate third-party plug-ins, such as those from Twitter and Facebook, to obtain parental consent before collecting personal information. Those third parties must themselves obtain parental consent when they have “actual knowledge” that they are collecting information through a website or service targeted at children.
In a departure from the rule changes that were proposed by the government in August, the FTC explicitly exempted app stores, such as those run by Google and Apple, from responsibility for privacy violations by games and software sold in their stores. The government also reversed a prior proposal by agreeing to continue to allow parental consent to be obtained by email as long as apps and websites only collect the data for internal usage.
Now that these new guidelines have been issued, all operators need to review their policies to ensure compliance. The revisions have significantly expanded the type of information that is considered personal and the number of companies that will need to comply. The FTC has previously brought enforcement actions against companies that violated COPPA, and the new rules will allow for more actions to be brought in the future.