FTC Beat
Archive for the ‘Privacy’ Category
Mar 03
2015

Another Class Action Pops Up For Complaints About Pop-Ups

A class action lawsuit recently instituted in federal court in the Northern District of California, Hunter v. Lenovo et al., alleges that Lenovo Inc., a computer manufacturer, violated its customers’ rights by selling computers which came preinstalled with alleged spyware manufactured by Superfish Inc., another named defendant.  The purported class alleges that the Superfish software monitors user activity and displays pop-up ads, among other things, as part of an “image-based search” function which identifies images on the user’s screen and seeks out similar images on the web. The complaint states causes of action for violations of the Electronic Communications Privacy Act and the Stored Communications Act, as well as unjust enrichment.

The Stored Communications Act (“SCA”), 18 U.S.C. §§ 2701-2712, provides criminal penalties for anyone who “intentionally accesses without authorization a facility through which an electronic communication service is provided” or “intentionally exceeds an authorization to access that facility.”  The SCA has been cited by plaintiffs in other class actions in which users allege that a technology company has overstepped its bounds.  For instance, in Perkins v. LinkedIn Corp., No. 13-CV-04303-LHK, 2014 WL 2751053 (N.D. Cal. June 12, 2014), a putative class of LinkedIn users alleged that the social networking company violated the SCA by collecting contacts from users’ external email accounts.  The court granted LinkedIn’s motion to dismiss the SCA claims, noting that the users consented to the collection of email addresses in a prominent disclosure, and therefore LinkedIn was “authorized” to collect the information, an exception to the SCA pursuant to 18 U.S.C. § 2701(c).

The complaint in Hunter v. Lenovo attempts to preempt a consent defense, alleging that “Plaintiff never agreed to any terms or conditions regarding the Superfish Surveillance Software.  Accordingly, Plaintiff never consented to Defendants’ monitoring of, access to, and/or interception of his internet communications.”  However, according to a January 23, 2015 forum post by a Lenovo administrator (since edited to link to a Lenovo advisory), users had the opportunity to decline the Superfish software Terms of Use, thus disabling the software.  If this proves to be true, it would be consistent with the court’s determination in Perkins v. LinkedIn that a user’s consent may serve as a defense against an SCA claim.  Unlike in LinkedIn, however, the Hunter SCA claim may not be appropriate for resolution at the motion to dismiss stage because it raises an issue of disputed fact which may require discovery.

Although the suit is still pending, Lenovo has reversed course on the Superfish software.  Lenovo has disabled Superfish on computers which came preinstalled with the software, its websites offer instructions for users to uninstall the software altogether, and Lenovo computers no longer come preinstalled with the program.  While these remedial actions may be an appropriate response to user concerns, they do not constitute an admission of legal liability in the class action suit.  The defendants may still argue that users consented to the software, even as they remove it from the computers.

Mar 02
2015

The Federal Wiretap Act and the Law of Unintended Consequences

The law of unintended consequences – a distant cousin of Murphy’s Law – states that the actions of human beings will always have effects that are unanticipated and unintended. The law could prove a perfect fit for recent efforts by class action counsel to rely upon the Federal Wiretap Act in lawsuits arising from adware installed on personal home computers.

Take, for example, the recently filed case of Bennett v. Lenovo (United States), Inc. In that case, the plaintiff seeks to represent a class of purchasers of Lenovo laptop computers complaining that “Superfish” software that was preloaded on the laptops directed them to preferred advertisements based on their internet browsing behavior. The most interesting claim included in the complaint is the assertion that Lenovo and Superfish violated the Federal Wiretap Act.

Wiretap? What wiretap?

The Federal Wiretap Act was originally passed as Title III of the Omnibus Crime Control and Safe Streets Act of 1968. These provisions were included, at least in part, as a result of concerns about investigative techniques used by the FBI and other law enforcement agencies that threatened the privacy rights of individuals. In passing the Wiretap Act, Congress was clearly focused on the need to protect communications between individuals by telephone, telegraph and the like. The Electronic Communications Privacy Act of 1986 (ECPA) broadened the application of the statute by expanding the kinds of communications to which the statute applied. But the focus was still on communications between individuals.

As is often the case, technology is testing the boundaries of this nearly 50-year-old law. The Bennett case is not the first case in which a plaintiff has argued that software on his or her computer that reads the user’s behavior violates the Wiretap Act.  In some cases, the software in question has been so-called “keylogging” software that captures every one of a user’s keystrokes. Cases considering such claims (or similar claims under state statutes modeled after the federal Act) have been split – some based on the specifics of when and how the software actually captured the information, and others possibly based on differences in the law in different parts of the country.

One of the more interesting cases, Klumb v. Gloan, No. 2:09-cv-115 (E.D. Tenn. 2012), involved a husband who sued his estranged wife when he discovered that she had placed spyware on his computer.  At trial, the husband demonstrated that during his marriage, his wife installed eBlaster, a program capable of not only recording keystrokes, but also intercepting emails and monitoring websites visited.  The husband alleged that once the emails and other legal documents were intercepted, the wife altered them to make it appear as if the husband was having an affair.  The motive?  Money, of course.  Adultery was a basis to void the prenuptial agreement that the parties had executed prior to their ill-fated marriage.  The wife – who was a law school graduate – argued that the installation was consensual.  Although consent is a recognized defense to a claim of violating the Federal Wiretap Act, for a variety of reasons, the court discredited the wife’s testimony regarding the purported consent and awarded damages and attorney’s fees to the husband plaintiff.

The Bennett plaintiffs may or may not succeed in showing the facts and arguing the law sufficient to prevail on their claim, and we know too little about the facts of the case to predict its result. But we can state with confidence that the continued expansion of how the Wiretap Act is applied will, at some point, require that Congress step in and update the statute to make clear how it applies in the new internet-based world in which we now live.

Feb 20
2015

Employers Running Background Checks: Top 10 Tips to Avoid Joining the Fair Credit Reporting Act Litigation “Club”

What do Whole Foods, Chuck E. Cheese, Michael’s Stores, Dollar General, Panera, Publix, and K-Mart have in common?  Each of these companies has faced lawsuits (including class actions) under the Fair Credit Reporting Act (“FCRA”).  Although Congress passed the FCRA way back in 1970 and litigation has historically focused on credit reporting agencies’ duties under the law, class action plaintiffs’ firms have recently turned their attention to the FCRA’s employer-related provisions.  Several large settlements (such as Publix’s $6.8 million class action settlement, Dollar General’s $4 million, and K-Mart’s $3 million) have spurred further litigation.  While some of the alleged FCRA violations may appear minor or technical in nature, these “technical violations” still result in costly lawsuits.  Employers should re-familiarize themselves with the FCRA to avoid becoming class action defendants.

The FCRA’s Employer-Related Provisions

Many employers understandably want to conduct background checks on prospective employees, or current employees who may be obtaining new responsibilities or accessing sensitive information.  In particular, companies in the retail and restaurant sectors, whose employees have access to cash receipts and credit card account numbers, want to guard against employees whose background checks may reveal issues of concern.  Further, organizations whose employees enter homes and businesses (such as service providers – e.g., carpet cleaners, plumbers, contractors) have additional concerns about potential liability.

The FCRA is usually thought of as a federal law that regulates consumer reporting agencies, like credit bureaus.  However, the FCRA also prescribes certain requirements for employers who use consumer reports.  The FCRA broadly defines the term “consumer reports” as information prepared by a consumer reporting agency “bearing on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer’s eligibility for—credit or insurance to be used primarily for personal, family, or household purposes; employment purposes” or other permitted purposes. This definition draws in more than a traditional credit report. It can include driving records, civil lawsuits, and reference checks, among other information.

Disclosure and Consent

Employers may not obtain a consumer report from a consumer reporting agency unless they first make a “clear and conspicuous” written disclosure to the prospective employee/employee.  The disclosure document must consist “solely” of the disclosure that a consumer report may be obtained.  The job applicant/employee must provide written permission for the employer to obtain a consumer report.  The FTC has indicated the disclosure form may include a signature line for the individual’s consent.  (In 2001, the FTC also issued an opinion letter stating it believes such consent can be obtained electronically, consistent with the federal E-Sign law).  The employer further certifies to the consumer reporting agency that it has a permissible purpose for the report and that it has complied with the FCRA and applicable equal opportunity laws.

These steps sound simple enough; however, litigation has ensued based upon employers’ alleged failures to comply.  For instance, in the Whole Foods case in federal court in California, the plaintiffs claim the online application process included a liability waiver in the disclosure form for the background check, allegedly violating the FCRA requirement that a disclosure form not include other information.  In a separate case in federal court in Florida involving retailer Nine West, the plaintiff alleges he did not receive a separate form, and that the background check authorization was on a web page with various other types of information.

Adverse Action Based on Report

If the employer intends to take “adverse action” against the prospective employee/employee (based even in part on the information in the report), the FCRA requires the employer to follow certain additional steps. The term “adverse action” includes “a denial of employment or any other decision for employment purposes that adversely affects any current or prospective employee.”

Before the employer takes the adverse action, it must provide a “pre-adverse action” notice to the affected person. This notice must include a copy of the consumer report and a statutory “Summary of Rights.” (This is an updated form, required since January 2013 by the new Consumer Financial Protection Bureau, which now has responsibility for FCRA rulemaking).  The purpose of this notice requirement is to permit the individual to discuss the report with the employer before the employer implements the adverse action.

Next, if the employer intends to take the adverse action, the FCRA requires the employer to provide an adverse action notice to the individual.  This notice must contain certain information, including:

• the name, address, and telephone number of the consumer reporting agency that provided the report;

• a statement that the consumer reporting agency did not make the adverse decision and is not able to explain why the decision was made;

• a statement setting forth the applicant’s or employee’s right to obtain a free disclosure of his or her report from the consumer reporting agency if the individual requests the disclosure within 60 days; and

• a statement regarding the individual’s right to dispute directly with the consumer reporting agency the accuracy or completeness of any information contained in the report.

In a case involving Domino’s Pizza employees, the company settled a class action that included allegations that it took adverse employment actions against certain individuals based on information contained in consumer reports without providing those individuals the required notice and a copy of such reports in advance.  K-Mart settled a class action suit based upon allegations that the statement of consumer rights provided to individuals after a background check contained outdated disclosures, among other alleged FCRA failures.

Liability and Enforcement

Plaintiffs can pursue a private right of action against employers for negligently or willfully violating the FCRA.  Negligent violations allow recovery of actual damages and reasonable attorneys’ fees and costs.  Willful violations can result in actual damages or statutory damages ranging between $100 and $1,000, plus punitive damages and attorneys’ fees and costs.  The Federal Trade Commission (“FTC”) has also brought actions against employers for FCRA violations.

10 Steps to Avoid Becoming a FCRA Defendant When Using Employment Background Checks

1. Review your current background check practices for prospective and current employees, including any online application materials.

2. Review disclosure/consent forms for compliance. Ensure you are presenting applicants or current employees with a simple, one-page disclosure form. The form should inform individuals that you intend to obtain a consumer report for employment purposes.

3. You must obtain consent from the prospective employee/employee. You may include a line on the disclosure form for the individual to acknowledge and grant consent.  Do not include other material, such as liability waivers or confirmations of at-will employment, and do not seek other consents.

4. If your application process is online, ensure the disclosure/consent is displayed separately, on one screen, without other content.

5. If you intend to conduct background checks periodically during an individual’s employment, state that in the disclosure and consent form.

6. Do not seek consent verbally. The FCRA requires “written” consent (though the FTC has stated it may be obtained electronically).

7. Maintain copies of the disclosure and consent forms for at least five years from the date they were provided. (Lawsuits must be brought by the earlier of two years after the date of the plaintiff’s discovery of the violation, or five years after the date on which the violation occurred).

8. If you intend to take adverse action based on information in the consumer report, provide the individual with a pre-adverse action notice, a copy of the consumer report, and the “Summary of Rights.” Ensure you are using the most updated “Summary of Rights.”

9. Wait a reasonable amount of time (at least five days) before issuing an adverse action notice. Your company’s adverse action notice must contain the information required under the FCRA (see the bulleted information above).

10. Check state law regarding background checks for the states in which you operate or solicit employees. Some states have requirements similar to the FCRA’s; others may further restrict the types of information you can request.

 

*                                  *                                  *

The FTC and EEOC have issued a joint statement on background checks.  While many employers need to conduct background checks to avoid liability and risks to their businesses, employers also need to follow the FCRA’s mandates to avoid the deep end of the litigation “pool.”

Jan 28
2015

International Data Privacy Day: Our Top 10 Data Privacy Tips

It’s International Data Privacy Day!  Every year on January 28, the United States, Canada and 27 countries of the European Union celebrate Data Privacy Day.  This day is designed to raise awareness of and generate discussion about data privacy rights and practices.  Indeed, each day new reports surface about serious data breaches, data practice concerns, and calls for legislation.  How can businesses manage data privacy expectations and risk amid this swirl of activity?

Here, we share some tips from our firm’s practice and some recent FTC guidance.  We don’t have a cake to celebrate International Data Privacy Day but we do have our “Top 10 Data Privacy Tips”:

1. Review Your Organization’s Privacy Policy. Remember that privacy policy you had counsel prepare a few years ago?  It’s a good time to review it and assess whether it still reflects company practices.  What kind of personal information does your company collect? How does it move through your business?  How is it shared?  Has your organization’s policy on sharing personal information changed?  Does the privacy policy reflect legal changes in the states where you operate?  Privacy policies are not meant to be static documents.  You should review them at least twice a year to ensure they are accurate. Even something as simple as the privacy officer’s contact information may need an update.

2. Do What You Say.  When you post a privacy policy, you are committing to the practices in the policy.  If your policy says “we will never share your information with third party marketers” – then you shouldn’t be sharing with third party marketers.  Common sense?  Yes, but companies have faced enforcement actions and litigation for pledging to “never share” when they did share.  Other companies, like Snapchat, settled with the FTC over statements in their privacy policies about how their apps operate and secure information, statements the FTC claimed were not true. Privacy policies should carve out disclosures for sharing information where sharing is likely to take place, such as in response to legal process, like a court order.  We also recommend a carve out in the event of a sale or reorganization of the business or of its assets. Other carve-outs may be warranted.

3. Ensure Your U.S.-E.U. Safe Harbor Is Up-to-Date. Last year, the FTC took action against several companies, including the Atlanta Falcons and Level 3 Communications, for stating in their privacy policies that they were U.S.-E.U. Safe Harbor Certified by the U.S. Department of Commerce when, in fact, the companies had failed to keep their certification current by reaffirming their compliance annually. While your organization is not required to participate in Safe Harbor, don’t say you are Safe Harbor Certified if you haven’t filed with the U.S. Department of Commerce. And, remember that your company needs to reaffirm compliance annually, including payment of a fee.  You can check your company’s status here.

4. Understand Your Internal Risks. We’ve said this before – while malicious outside breaches are certainly out there, a significant percentage of breaches (around 30 percent, according to one recent study) occurs due to accidents or malicious acts by employees.  Common failures include missing firewalls, unencrypted devices (such as laptops and flash drives), and authentication credentials that are not changed when employees leave or are terminated.  While you are at it, review who has access to confidential information and whether proper restrictions are in place.

5. Educate Your Workforce. While today is International Data Privacy Day, your organization should educate your workforce on privacy issues throughout the year. Depending on the size of the company and the type of information handled (for instance, highly sensitive health information versus standard personal contact details), education efforts may vary. Review practices such as keeping passwords confidential, creating secure passwords and changing them frequently, and avoiding downloads of personal or sensitive company information in unsecured formats.  Just last week, a security firm reported that the most popular passwords for 2014 were “123456” and “password.”  At a minimum, these easily guessed passwords should not be allowed in your systems.

6. Understand Specific Requirements of Your Industry/Customers/Jurisdiction. Do you have information on Massachusetts residents?  Massachusetts requires that your company have a Written Information Security Program.  Does your company collect personal information from kids under 13?  The organization must comply with the federal Children’s Online Privacy Protection Act and the FTC’s rules.  The FTC has taken many actions against companies deemed to be collecting children’s information without properly obtaining verifiable parental consent.

7. Maintain a Data Breach Response Plan. If there were a potential data breach, who would get called?  Legal?  IT?  Human Resources?  Public relations?  Yes, likely all of these. The best defense is a good offense – plan ahead.  Representatives from in-house and outside counsel, IT/IS, human resources, and your communications department should be part of this plan. State data breach notification laws require prompt reporting, and some companies have faced lawsuits for allegedly “slow” response times.  If there is a potential breach, your company needs to gather resources, investigate, and, if required, disclose the breach to governmental authorities, affected individuals, credit reporting agencies, and others.

8. Consider Contractual Obligations. Before your company commits to data security obligations in contracts, ensure that a knowledgeable party, such as in-house or outside counsel, reviews these commitments.  If there is a breach of a contracting party’s information, assess the contractual requirements in addition to those under data breach notification laws. The laws generally require notice to be given promptly when a company’s data is compromised while under the “care” of another company. On the flip side, consider the service providers your company uses and what type of access the providers have to sensitive data. You should require service providers to adhere to reasonable security standards, with more stringent requirements if they handle sensitive data.

9. Review Insurance Coverage. While smaller businesses may think “we’re not Target” and assume they don’t need cyber insurance, that’s a false assumption. In fact, smaller businesses usually have less sophisticated protections and can be more vulnerable to hackers and employee negligence.  Data breaches – requiring investigations, hiring of outside experts such as forensics firms, paying for credit monitoring, and potential loss of goodwill – can be expensive. Carriers are offering policies that do not break the bank, and cyber insurance is definitely worth exploring.  If you believe you have coverage for a data incident, your company should promptly notify the carrier. Notice should be part of the data breach response plan.

10. Remember the Basics! Many organizations have faced the wrath of the FTC, state attorneys general, or private litigants because the companies or their employees failed to follow basic data security procedures. The FTC has settled 53 data security law enforcement actions. Many involve the failure to take common sense steps with data, such as transmitting sensitive data without encryption or leaving documents with personal information in a dumpster. Every company must have plans to secure physical and electronic information. The FTC looks at whether a company’s practices are “reasonable and appropriate in light of the sensitivity and amount of consumer information you have, the size and complexity of your business, and the availability and cost of tools to improve security and reduce vulnerabilities.” If the FTC calls, you want to have a solid explanation of what you did right, not be searching for answers or offering excuses.  Additional information on the FTC’s guidance can be found here.

*                            *                            *

Remember, while it may be International Data Privacy Day, data privacy isn’t a one-day event. Privacy practices must be reviewed and updated regularly to protect data as well as to enable your company to act swiftly and responsively in the event of a data breach incident.

Nov 07
2014

Report from an Energized Brand Activation Association Marketing Law Conference

Ifrah Law is a proud member of the Brand Activation Association (“BAA”). This week, we attended the BAA’s 36th annual Marketing Law Conference in Chicago.  Just as “Mad Men” reflects the 1960s-era advertising business, this year’s BAA conference demonstrated this generation’s marketing dynamic – where mobile is key, privacy concerns abound, and the Federal Trade Commission (“FTC”) and other agencies are watching and enforcing. Other key takeaways from the conference are that sweepstakes, contests, and other promotions remain hugely popular via mobile devices and social networks.

Digital Rules

Advertisers representing top brand names made clear that companies must reach consumers through various digital devices.  Smartphones, tablets, and wearable technologies each represent ways to advertise a product or service.  Today’s consumers, especially younger consumers, rely extensively on mobile devices, and many actually welcome behavioral and other advertising.  Consumers in the U.S. and abroad have shown receptiveness to “flash sales,” instant coupons, and other deals, including those geared to their geolocation.

Emerging Privacy and Consumer Protection Trends

While advertisers interact with consumers and many consumers welcome offers and information, regulators’ and individuals’ concerns with the privacy of personal information dominate the landscape.  Almost a year after the notorious Target data breach, and with the holiday shopping season approaching, all stakeholders are understandably cautious about how to utilize various methods of marketing while securing consumer information.  Even assuming a network is secure, the FTC, state attorneys general, foreign regulators, consumer advocacy groups, and consumers want to know how personal data is being collected, utilized, and shared.  In the consumer protection context, the FTC actively enforces the Federal Trade Commission Act’s prohibition on “deceptive acts and practices,” requiring that advertisers have substantiation for product claims.

Two Significant Forces – the FTC and California’s Attorney General

Top representatives from the FTC and the California Attorney General’s office presented at the conference.  Both representatives asserted that their agencies remain active in enforcing consumer protection and privacy laws, particularly in certain priority areas.  Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, discussed the agency’s focus on advertising substantiation, particularly as to claims involving disease prevention and cure, weight loss, and learning enrichment (such as the “Your Baby Can Read” case).

On the privacy side, Ms. Rich also noted the FTC’s specialized role in enforcing the Children’s Online Privacy Protection Act (“COPPA”).  The FTC’s recent action against Yelp demonstrates that the FTC will not hesitate to enforce COPPA even where a website is not a child-focused website, per se. If a website or online service (such as a mobile app) collects personal information from children under 13, it must comply with COPPA’s notice and consent requirements. The agency is also exploring the privacy and consumer protection concerns associated with interconnected devices, known as “the Internet of Things.”

The representative from the California Attorney General’s office noted that California has a keen interest in mobile apps, as demonstrated by its action against Delta for allegedly failing to make a privacy policy available through its mobile app.  California is also gearing up for its “Eraser Law,” set to go into effect on January 1, 2015. This law provides an opportunity for young people under 18 to “erase” embarrassing or damaging content they posted online, including on social media.

Promotions – Sweepstakes, Contests, Games

While some may think sweepstakes and contests are outdated, the opposite is true. Companies are utilizing mobile and social networks to engage with consumers through promotions.  Facebook and Pinterest-based sweepstakes and contests continue to grow in popularity. Advertisers also increasingly look to “text-based” offerings.

These promotions can generate great marketing visibility and grow consumer relationships. However, advertisers need to be aware of many legal minefields.  First and foremost is the federal Telephone Consumer Protection Act (“TCPA”), which requires prior express “written” consent for advertisements sent to mobile phones via text or calls utilizing an autodialer or prerecorded message.  Plaintiffs’ lawyers continue to file hundreds of TCPA class actions based on texts without consent.  Second, the social networks have their own policies. For instance, Facebook now bars advertisers from requiring consumers to “like” a company Facebook page in order to participate in a promotion.

Take Aways

BAA conference sessions were packed – many standing room only.  The popularity of programs about comparative advertising, native advertising, sweepstakes and contests, and enforcement trends demonstrates that advertisers are finding innovative ways to reach consumers across devices. These marketing initiatives face a host of federal, state, and international laws and regulations, as well as restrictions imposed by social networks and providers.  It’s an exciting and complex juncture in global marketing.

Sep 04
2014

Federal Trade Commission Checks Out Mobile Shopping Apps

In August, the Federal Trade Commission (“FTC”) released a staff report concerning mobile shopping applications (“apps”).  FTC staff reviewed some of the most popular apps consumers utilize to comparison shop, collect and redeem deals and discounts, and pay in-store with their mobile devices.  This new report focused on shopping apps offering price comparison, special deals, and mobile payments. The August report is available here.

Popularity of Mobile Shopping Apps/FTC Interest

Shoppers can empower themselves in the retail environment by comparison shopping via their smartphones in real time.  According to a 2014 report by the Board of Governors of the Federal Reserve System, 44% of smartphone owners report using their mobile phones to comparison shop while in a retail store, and 68% of those consumers changed where they made a purchase as a result.  Consumers can also get instant coupons and deals to present at checkout.  With a wave of a phone at the checkout counter, consumers can then make purchases.

While the shopping apps have surged in popularity, the FTC staff is concerned about consumer protection, data security and privacy issues associated with the apps. The FTC studied what types of disclosures and practices control in the event of unauthorized transactions, billing errors, or other payment-related disputes.  The agency also examined the disclosures that apps provide to consumers concerning data privacy and security.

Apps Lack Important Information

FTC staff concluded that many of the apps they reviewed failed to provide consumers with important pre-download information.  In particular, only a few of the in-store purchase apps gave consumers information describing how the app handled payment-related disputes and consumers’ liability for charges (including unauthorized charges).

FTC staff determined that fourteen out of thirty in-store purchase apps did not disclose whether they had any dispute resolution or liability limits policies prior to download.  And, out of sixteen apps that provided pre-download information about dispute resolution procedures or liability limits, only nine of those apps provided written protections for users.  Some apps disclaimed all liability for losses.

Data Security Information Vague

FTC staff focused particular attention on data privacy and security, because more than other technologies, mobile devices are personal to a user, always on, and frequently with the user. These features enable an app to collect a huge amount of information, such as location, interests, and affiliations, which could be shared broadly with third parties.  Staff noted that, “while almost all of the apps stated that they share personal data, 29 percent of price comparison apps, 17 percent of deal apps, and 33 percent of in-store purchase apps reserved the right to share users’ personal data without restriction.”

Staff concluded that while privacy disclosures are improving, they tend to be overly broad and confusing. In addition, app developers may not be considering whether they even have a business need for all the information they are collecting.  As to data security, staff noted it did not test the services to verify the security promises made.  However, FTC staff reminded companies that it has taken enforcement actions against mobile apps it believed to have failed to secure personal data (such as Snapchat and Credit Karma).  The report states, “Staff encourages vendors of shopping apps, and indeed vendors of all apps that collect consumer data, to secure the data they collect.  Further those apps must honor any representations about security that they make to consumers.”

FTC Staff Recommends Better Disclosures and Data Security Practices

The report urges companies to disclose to consumers their rights and liability limits for unauthorized, fraudulent, or erroneous transactions.  Organizations offering these shopping apps should also explain to consumers what protections they have based on their methods of payment and what options are available for resolving payment and billing disputes.  Companies should provide clear, detailed explanations for how they collect, use and share consumer data.  And, apps must put promises into practice by abiding by data security representations.

Consumer Responsibility Plays Role, Too

Importantly, the FTC staff report does not place the entire burden on companies offering the mobile apps. Rather, FTC staff urge consumers to be proactive when using these apps.  The staff report recommends that consumers look for and consider the dispute resolution and liability limits of the apps they download.  Consumers should also analyze what payment method to use when purchasing via these apps. If consumers cannot find sufficient information, they should consider an alternative app, or make only small purchases.

While a great “deal” could be available with a click on a smartphone, the FTC staff urges consumers to review available information on how their personal and financial data may be collected, used, and shared while they get that deal.  If consumers are not satisfied with the information provided regarding data privacy and security, then staff recommends that they choose a different app, or limit the personal and financial data they provide.  (Though that last piece of advice may not be practical considering most shopping apps require a certain level of personal and financial information simply to complete a transaction).

Deal or No Deal?  FTC Will be Watching New Shopping Apps

FTC staff has concerns about mobile payments and will continue to focus on consumer protections.  The agency has taken several enforcement actions against companies for failing to secure personal and payment information and it does not appear to be slowing down.  While the FTC recognizes the benefits of these new shopping and payment technologies, it is also keenly aware of the enormous amount of data obtained by companies when consumers use these services. Thus, companies should anticipate that the FTC will continue to monitor shopping and deal apps with particular attention on disclosures and data practices.

Aug 04
2014

Google/Viacom Win Video Privacy Protection Act Case – Common Sense Finally Emerges

In an important decision in a federal court case in New Jersey, In Re Nickelodeon Privacy Litigation, Google and Viacom obtained a dismissal of a claim against them under the Video Privacy Protection Act (“VPPA”).  The decision narrows the scope of who can be liable under the VPPA and what information is within the scope of the statute.

Congress passed the VPPA in 1988 after Robert Bork, a nominee for the U.S. Supreme Court, had his video rental history published during the nomination process.  While Judge Bork’s viewing habits were unremarkable, members of Congress became understandably concerned that any individual’s private viewing information could easily be made public.  The VPPA makes any “video tape service provider” that discloses rental information outside the ordinary course of business liable for $2,500 in damages per person, in addition to attorneys’ fees and punitive damages.  There is no cap on the damages that plaintiffs can be awarded under the statute and cases are typically brought as class actions with large groups of plaintiffs.

In January 2013, President Obama signed into law the first major change to the VPPA since it was enacted, the Video Privacy Protection Act Amendments Act of 2012.  These amendments made it easier for companies to obtain consent from consumers to share their video viewing history.  The amendments removed the requirement that video service providers obtain written consent from users every time a user’s viewing choice is disclosed.  Additionally, the amendments allowed a provider to obtain a user’s consent online and provided that the consent can apply on an ongoing basis for up to two years, as long as the user is given the opportunity to withdraw that consent.  The amendments were enacted in response to consumers’ interest in sharing videos on social media platforms.

Viacom owns and operates three websites through which users can stream videos and play video games.  The plaintiffs in the lawsuit were registered users of those websites.  When a user registered with the site, that individual would be assigned a code name based on that user’s gender and age.  The plaintiffs alleged that the user code name would be combined with a code that identified which videos the user watched and that code was disclosed by Viacom to Google.  The plaintiffs sued Viacom and Google alleging among other things that this disclosure was a violation of the VPPA.

The VPPA claim against Google was dismissed because the court found that Google was not a “video tape service provider” (“VTSP”) as required for liability under the statute.  The court reasoned that Google is not “engaged in the business of renting, selling, or delivering either video tapes or similar audio visual materials.”  Some courts have shown a willingness to extend the definition of a VTSP to companies such as Hulu and Netflix that offer video-streaming services, but the court in this case stopped short of extending it to Google, a company that does not offer video services as its main business.

The VPPA claim against Viacom failed because the court found that, even if Viacom were a VTSP, an issue the court did not reach, Viacom did not release personally identifiable information to Google, which is required to have occurred under the VPPA.  The court concluded that “anonymous user IDs, a child’s gender and age, and information about the computer used to access Viacom’s websites” – even if disclosed by Viacom – were not personally identifiable information.

With its potential for large damages, the VPPA has seen a recent uptick in filed cases.  Recently, plaintiffs have filed cases against well-known media companies including Hulu, Netflix, ESPN, the Cartoon Network, and The Wall Street Journal. These cases show a trend of shifting away from the statute’s intended defendants – companies whose main line of business is renting and selling videos – and toward companies that provide streaming video as part of their business.

The line drawn by the court in this case as to who can be considered a VTSP could be a significant win for companies that offer mobile apps with streaming video capabilities, by limiting the definition of a VTSP to companies that are in the business of renting or selling videos.  Such a limitation would be welcomed by many operators of new technologies.  Given the vast number of devices and platforms that deliver video content of some kind, an expansion of the definition of a VTSP could lead to a flood of litigation involving companies that are not in the business of renting or selling videos and were not the intended defendants under the statute.

While this decision will not stop the recent uptick in VPPA litigation, it will provide courts with guidance as to how to determine who should be liable under the VPPA.  The text of the VPPA was written in a way that did not anticipate the current environment, where streaming video is available on a multitude of devices.  As more cases are filed, the limits of the statute’s scope will be tested.  However, this court’s decision provides precedent for a common sense approach to determining who should be held liable under the VPPA.

Jun 24
2014

Disappearing Act Fails – Maryland Attorney General and FTC “snap” back at Snapchat

Recently, the Maryland Attorney General’s Office announced that it reached a settlement with Snapchat, Inc. over alleged deceptive trade practices in violation of Maryland law and violations of federal laws that are intended to protect children’s online privacy.  This is another reminder that state attorneys general’s offices will continue to be vigilant in addressing consumer privacy issues under both state and federal laws, when the federal laws permit state attorney general action.

Snapchat is a photo and video messaging app that allows users to take photos and videos, add text and drawings, and send them to selected contacts.  The sent images are commonly referred to as “snaps” and users can set a time limit of up to ten seconds for how long the image will be visible to the contact.  According to Snapchat, its app’s users were sending 700 million photos and videos per day in May 2014.

Maryland’s Attorney General asserted that Snapchat misled consumers when it represented that snaps are temporary and disappear after they are opened by a recipient.  The Attorney General claimed that, in fact, the snaps could be copied or captured by recipients.  Additionally, the Maryland Attorney General alleged that Snapchat collected and maintained the names and phone numbers from contact lists on consumers’ electronic devices, which was a practice that Snapchat had not always disclosed to consumers and to which consumers did not always consent.  Lastly, the Attorney General alleged that Snapchat was aware that some users were under the age of 13, but it failed to comply with the federal Children’s Online Privacy Protection Act (“COPPA”), when it collected personal information from children without verifiable parental consent.  COPPA has a provision that empowers state attorneys general to bring enforcement actions under the statute on behalf of residents of their states.

Snapchat agreed to pay the state of Maryland $100,000 to settle this case.  Additionally, as part of its settlement, Snapchat agreed to not make false representations or material omissions in connection with its app.  Furthermore, Snapchat is specifically enjoined from misrepresenting the temporary nature of the snaps and must disclose to users that recipients of snaps have the ability to copy the image they receive.  Snapchat must also obtain affirmative consent from consumers before it collects and saves any contact information.  In response to the COPPA allegations, Snapchat agreed to comply with COPPA for a period of ten years and to take specific steps to ensure that children under the age of 13 are not creating Snapchat accounts.

Snapchat has faced other actions as well.  Last month, Snapchat reached a settlement with the Federal Trade Commission (“FTC”) on charges that it deceived consumers with promises about the disappearing nature of messages sent through the service.  According to the FTC, Snapchat promised users that messages and images sent through the app would self-destruct and disappear in ten seconds or less despite there being ways for recipients to save the snaps.  The FTC case also alleged that Snapchat told users that it did not collect information about their location when one version of the app did collect location information.

The FTC case did not include any accusation of violating COPPA, nor did it include any financial penalty. As part of the settlement, Snapchat agreed to implement privacy programs that will be subject to monitoring for 20 years and agreed not to misrepresent the confidentiality, privacy, and security of user information.  Snapchat is also prohibited from misrepresenting how it maintains the privacy and confidentiality of user information.

On its official blog, Snapchat emphasized that its app does not retain users’ snaps and that both investigations largely revolved around how well users understood that recipients of their snaps could save them.  In response to the COPPA claims, Snapchat pointed out that its terms of service have always provided that the app is intended for users who are 13 years of age or older and that it has instituted controls to enforce that restriction.

Mobile app companies need to be aware of the fact that they are being closely monitored by both the FTC and state attorneys general offices.  In particular, any claim made by an app about consumer privacy may be scrutinized by regulators.  Companies need to be prepared to justify their claims and must be forthcoming about any data that is collected from consumers.  In other words:  if you say you do something then you need to do it; if you say that you do not do something, do not do it.  Your company does not want the FTC or a state attorney general “snapping” at your privacy practices.

May 20
2014

Sprint Gets a Wallop of a Reminder – Company-Specific Do Not Call Lists Still Matter – $7.5 Million Record Do Not Call Consent Decree

Yesterday, the Federal Communications Commission (“FCC”) announced a consent decree with Sprint Corporation for federal do not call violations. Specifically, under the terms of the agreement, Sprint will make a $7.5 million “voluntary contribution” to the United States Treasury.  This payment represents the largest do not call settlement reached by the FCC.  Sprint also agreed to various ongoing compliance initiatives, including enhanced training and reporting requirements.  The action also serves as an important reminder of an often overlooked section of the do not call rules – the requirement that companies maintain and abide by “company-specific” or internal do not call lists.

Under the federal do not call rules, organizations making telemarketing calls to residential customers (including mobile phones) are required to scrub their call lists against the federal do not call database before initiating those calls, unless the calls meet certain exceptions – the called party has an existing business relationship (“EBR”) with the caller or has provided prior express consent for the calls, or the call is from a tax-exempt non-profit.  Of course, as we have written before, there are additional requirements for autodialed or prerecorded calls to mobile phones and prerecorded telemarketing calls to residential lines.

Another, sometimes overlooked requirement is that companies making permissible calls (for instance, after scrubbing the do not call database or with an existing business relationship or prior express consent) must maintain an internal, company-specific do not call list where companies log individuals’ subsequent requests not to be called.  In other words, even if a consumer has an existing business relationship or has given prior express consent to be called, once the consumer tells the company not to call again, that request trumps the existing business relationship/prior consent or the do not call scrub.  This company-specific do not call request must be implemented within 30 days and honored for five years from the date the consumer made the request.  (The federal do not call registration, in contrast, lasts indefinitely).  A company must also have a do not call policy, available upon request.
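
By way of illustration only, here is a minimal sketch in Python of how a campaign system might layer the company-specific list on top of the federal registry scrub. The phone numbers, dates, and logic are hypothetical and simplified; this is not Sprint’s system, the FCC’s rules in full, or legal advice.

```python
from datetime import datetime, timedelta

# Hypothetical data for illustration; a real campaign would pull from the FTC's
# registry download and the company's own suppression database.
FEDERAL_DNC = {"2025550123", "2025550188"}

# Internal (company-specific) list: number -> date the consumer asked not to be called.
INTERNAL_DNC = {"2025550199": datetime(2013, 6, 1)}

def may_call(number: str, has_ebr_or_consent: bool, today: datetime) -> bool:
    """Simplified eligibility check under the federal do not call rules (sketch only)."""
    # A company-specific do not call request trumps an EBR or prior express consent
    # and must be honored for five years (approximated here) from the date of the request.
    requested = INTERNAL_DNC.get(number)
    if requested is not None and today - requested < timedelta(days=5 * 365):
        return False
    # Absent an exception (EBR, prior express consent, tax-exempt non-profit caller),
    # numbers on the federal registry may not be called.
    if number in FEDERAL_DNC and not has_ebr_or_consent:
        return False
    return True

campaign = ["2025550123", "2025550142", "2025550199"]
eligible = [n for n in campaign
            if may_call(n, has_ebr_or_consent=True, today=datetime(2015, 5, 20))]
print(eligible)  # the internal do not call request still blocks "2025550199"
```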

In 2009, the FCC investigated Sprint for do not call violations relating to the company-specific do not call list.  Sprint subsequently settled that enforcement action in 2011 through a consent decree (which included a $400,000 payment).  The decree required Sprint to report to the FCC’s Enforcement Bureau, for two years, any noncompliance with the consent decree or the FCC’s company-specific do not call rules.

In March 2012, Sprint disclosed to the FCC that it had discovered additional issues involving human error and technical malfunctions relating to Sprint’s or its vendor’s do not call processes that caused potential noncompliance with consumers’ do not call or do not text preferences, or prevented the timely capture of the preferences.  Sprint represented that it had subsequently implemented improvements in its do not call data management systems.  It had also ceased telemarketing and text campaigns to investigate the issues.  The FCC investigated Sprint’s do not call compliance and ultimately entered into this record-setting $7.5 million settlement.

Under the terms of the consent decree, in addition to the settlement payment, Sprint will designate a Compliance Officer to administer a new compliance plan and to comply with the consent decree.  Sprint also must implement a compliance manual which will instruct “covered personnel” (including Sprint personnel and independent contractors who provide telemarketing services for Sprint) on Sprint’s do not call policies.  The consent decree further requires Sprint to establish and maintain an annual compliance training program, and to file several compliance reports with the FCC at designated time frames.  Significantly, Sprint acknowledges that actions or inactions of any independent contractors, subcontractors, or agents that result in a violation of the company-specific do not call rules or the consent decree constitute an act or inaction by Sprint – in other words, Sprint is specifically on the hook for third parties’ actions.

The consent decree and $7.5 million payment serve as a useful reminder of the company-specific do not call rules.  Once a consumer indicates they do not wish to receive further telemarketing calls or texts, the FCC’s rules require that the telemarketer place that consumer on its internal, company-specific do not call list.  This consumer request trumps even an established business relationship or prior express consent.  It can only be revoked by subsequent express consent – which we would recommend be in writing.  Even if a consumer does business with your company every day, if he or she has asked not to receive telemarketing calls – don’t call!  Compliance with the company-specific do not call rule means your organization does not call someone who has indicated they do not want to be called.  And, it can also save your company significant time, resources, and money spent defending private litigation or an FCC enforcement action.  Further, if your organization utilizes third parties for telemarketing campaigns, your company should make sure the third party is taking do not call requests, logging them, and passing them to your company for future campaigns.

Apr 07
2014

FTC Sends Message: Make Your Mobile App Secure

Mobile payments have become so commonplace that consumers rarely stop to think about whether their online payment is secure. Mobile app developers can fall into a similar trap of assuming that the necessary security measures are enabled without performing the necessary audits to assure security on a regular basis. A recent settlement between the FTC and two companies offering unsecured mobile application products gives cause to think again.

The FTC alleged that the movie ticketing service Fandango and credit monitoring company Credit Karma failed to adequately protect consumers’ sensitive personal information in their mobile apps because they failed to use Secure Sockets Layer (“SSL”) protocol to establish authentic, encrypted connections with consumers. Generally, an online service will present an SSL certificate to the app on the consumer’s device to vouch for its identity. The app then verifies the certificate to ensure that it is connecting to the genuine online service. When companies fail to use this protocol—especially if consumers use the app over a public wi-fi network—third party attackers can substitute an invalid certificate to the app, thus establishing a connection between the app and the attacker rather than the online service. As a result, any information that the consumer enters into the app will be sent directly to the attacker, including credit card numbers and other sensitive and personally identifying information.
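
The difference is easy to see in code. The sketch below is illustrative only – written in Python with the requests library and a hypothetical endpoint, not the native iOS or Android code at issue in these cases – and shows that certificate verification is on by default, while turning it off recreates the kind of vulnerability the FTC described.

```python
import requests

API_URL = "https://api.example.com/purchase"  # hypothetical endpoint for illustration

def submit_payment(card_number: str) -> int:
    # verify=True (the default) makes the library confirm that the server's SSL/TLS
    # certificate chains to a trusted authority and matches the hostname, so the app
    # knows it is talking to the genuine online service.
    resp = requests.post(API_URL, json={"card": card_number}, verify=True, timeout=10)
    return resp.status_code

def submit_payment_insecure(card_number: str) -> int:
    # verify=False skips certificate validation entirely. On a public wi-fi network,
    # an attacker can present any certificate and silently read the card number --
    # essentially the defect alleged against Fandango and Credit Karma.
    resp = requests.post(API_URL, json={"card": card_number}, verify=False, timeout=10)
    return resp.status_code
```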

The FTC alleged that Fandango and Credit Karma left their applications vulnerable to interception by third parties by failing to use SSL protocol. The FTC alleged that Fandango misrepresented the security of its application by stating that consumers’ credit card information would be stored and transmitted securely, despite the fact that SSL certificate validation was disabled in the app from March 2009 to March 2013. The FTC alleged that Credit Karma’s app failed to validate SSL certificates from July 2012 to January 2013, leaving the app susceptible to attackers who could gather personal identifying information such as passwords, security questions and answers, birthdates, and “out of wallet” verification answers regarding things like mortgages and loan amounts.

In both cases, the online services received warnings of the vulnerabilities from both users and the FTC. In December 2012, a security researcher used Fandango’s online customer service form to submit a warning regarding the vulnerability. However, Fandango mistakenly flagged the message as a password reset request, sent the researcher a stock response on password resetting, and marked the complaint as resolved. A user sent a similar notice to Credit Karma about the SSL certificates in January 2013. Credit Karma responded by issuing a fix in an update to its iOS app that same month; however, one month later Credit Karma released an Android app which contained the same vulnerability.

In both cases, the online services performed a more thorough internal audit of the apps only when issued a warning by the FTC. The FTC issued complaints against the companies for their deceptive representations regarding the security of their systems. While the complaints noted that the apps were vulnerable to third party attacks, they did not allege that any such attacks were made or that any consumer information was in fact compromised. Perhaps due to the lack of consumer harm, the FTC entered into consent agreements with Fandango and Credit Karma in which the services did not have to pay a monetary judgment, but did agree to establish comprehensive security programs and undergo security assessments every other year for the next 20 years. Fandango and Credit Karma are additionally prohibited from misrepresenting the level of privacy and security in their products or services.

SSL certificate validation is the default behavior of the application programming interfaces that the iOS and Android operating systems provide to developers. Therefore, mobile app developers can protect themselves and their users from this vulnerability simply by leaving the default SSL certificate validation enabled. What’s more, app developers can test for and identify SSL certificate validation vulnerabilities using free or very low cost tools. All app developers should take these precautions to ensure the security of their systems and prevent harm to consumers (and potential lawsuits) down the road.


About Ifrah Law

Crime in the Suites is authored by the Ifrah Law Firm, a Washington DC-based law firm specializing in the defense of government investigations and litigation. Our client base spans many regulated industries, particularly e-business, e-commerce, government contracts, gaming and healthcare.

Ifrah Law focuses on federal criminal defense, government contract defense and procurement, healthcare, and financial services litigation and fraud defense. Further, the firm's E-Commerce attorneys and internet marketing attorneys are leaders in internet advertising, data privacy, online fraud and abuse law, and iGaming law.

The commentary and cases included in this blog are contributed by founding partner Jeff Ifrah, partners Michelle Cohen, David Deitch, and associates Rachel Hirsch, Jeff Hamlin, Steven Eichorn, Sarah Coffey, Nicole Kardell, Casselle Smith, and Griffin Finan. These posts are edited by Jeff Ifrah. We look forward to hearing your thoughts and comments!

Visit the Ifrah Law Firm website
