In August, the Federal Trade Commission (“FTC”) released a staff report concerning mobile shopping applications (“apps”). FTC staff reviewed some of the most popular apps consumers use to comparison shop, collect and redeem deals and discounts, and pay in-store with their mobile devices. The August report is available here.
Popularity of Mobile Shopping Apps/FTC Interest
Shoppers can empower themselves in the retail environment by comparison shopping via their smartphones in real time. According to a 2014 report by the Board of Governors of the Federal Reserve System, 44% of smartphone owners report using their mobile phones to comparison shop while in a retail store, and 68% of those consumers changed where they made a purchase as a result. Consumers can also get instant coupons and deals to present at checkout. With a wave of a phone at the checkout counter, consumers can then make purchases.
While shopping apps have surged in popularity, the FTC staff is concerned about the consumer protection, data security, and privacy issues associated with them. The FTC studied which disclosures and practices govern in the event of unauthorized transactions, billing errors, or other payment-related disputes. The agency also examined the disclosures that apps provide to consumers concerning data privacy and security.
Apps Lack Important Information
FTC staff concluded that many of the apps they reviewed failed to provide consumers with important pre-download information. In particular, only a few of the in-store purchase apps gave consumers information describing how the app handled payment-related disputes and consumers’ liability for charges (including unauthorized charges).
FTC staff determined that fourteen of the thirty in-store purchase apps did not disclose, prior to download, whether they had any dispute resolution or liability-limit policies. And of the sixteen apps that did provide pre-download information about dispute resolution procedures or liability limits, only nine provided written protections for users. Some apps disclaimed all liability for losses.
Data Security Information Vague
FTC staff focused particular attention on data privacy and security, because more than other technologies, mobile devices are personal to a user, always on, and frequently with the user. These features enable an app to collect a huge amount of information, such as location, interests, and affiliations, which could be shared broadly with third parties. Staff noted that, “while almost all of the apps stated that they share personal data, 29 percent of price comparison apps, 17 percent of deal apps, and 33 percent of in-store purchase apps reserved the right to share users’ personal data without restriction.”
Staff concluded that while privacy disclosures are improving, they tend to be overly broad and confusing. In addition, app developers may not be considering whether they even have a business need for all the information they are collecting. As to data security, staff noted it did not test the services to verify the security promises made. However, FTC staff reminded companies that it has taken enforcement actions against mobile apps it believed to have failed to secure personal data (such as Snapchat and Credit Karma). The report states, “Staff encourages vendors of shopping apps, and indeed vendors of all apps that collect consumer data, to secure the data they collect. Further those apps must honor any representations about security that they make to consumers.”
FTC Staff Recommends Better Disclosures and Data Security Practices
The report urges companies to disclose to consumers their rights and liability limits for unauthorized, fraudulent, or erroneous transactions. Organizations offering these shopping apps should also explain to consumers what protections they have based on their methods of payment and what options are available for resolving payment and billing disputes. Companies should provide clear, detailed explanations for how they collect, use and share consumer data. And, apps must put promises into practice by abiding by data security representations.
Consumer Responsibility Plays Role, Too
Importantly, the FTC staff report does not place the entire burden on companies offering the mobile apps. Rather, FTC staff urge consumers to be proactive when using these apps. The staff report recommends that consumers look for and consider the dispute resolution and liability limits of the apps they download. Consumers should also analyze what payment method to use when purchasing via these apps. If consumers cannot find sufficient information, they should consider an alternative app, or make only small purchases.
While a great “deal” could be available with a click on a smartphone, the FTC staff urges consumers to review available information on how their personal and financial data may be collected, used and shared while they get that deal. If consumers are not satisfied with the information provided regarding data privacy and security, then staff recommends that they choose a different app, or limit the personal and financial data they provide. (Though that last piece of advice may not be practical considering most shopping apps require a certain level of personal and financial information simply to complete a transaction).
Deal or No Deal? FTC Will be Watching New Shopping Apps
FTC Staff has concerns about mobile payments and will continue to focus on consumer protections. The agency has taken several enforcement actions against companies for failing to secure personal and payment information and it does not appear to be slowing down. While the FTC recognizes the benefits of these new shopping and payment technologies, it is also keenly aware of the enormous amount of data obtained by companies when consumers use these services. Thus, companies should anticipate that the FTC will continue to monitor shopping and deal apps with particular attention on disclosures and data practices.
A company that markets video cameras designed to allow consumers to monitor their homes remotely has agreed to settle FTC charges that it failed to properly protect consumers’ privacy. This marks the FTC’s first enforcement action against a marketer of a product with connectivity to the Internet and other mobile devices, commonly referred to as the “Internet of Things.”
The FTC’s complaint alleges that TRENDnet marketed its cameras for uses ranging from baby monitoring to home security and that TRENDnet told customers that its products were “secure.” In fact, however, the devices were compromised by a hacker who posted links on the Internet to live feeds of over 700 cameras. Additionally, TRENDnet stored and transmitted user credentials in clear, unencrypted text.
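For readers less familiar with the practice at issue: storing credentials “in the clear” means anyone who reads or intercepts the stored value has the password itself. The long-standard alternative is a salted one-way hash, so that even a stolen database does not reveal passwords. The following Python sketch is purely illustrative of that general practice and is not drawn from the TRENDnet case:

```python
import hashlib
import hmac
import os

def hash_credential(password: str) -> tuple[bytes, bytes]:
    """Derive a salted, one-way hash; the stored value never reveals the password."""
    salt = os.urandom(16)  # unique random salt per credential
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_credential(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_credential("camera-admin-passphrase")
assert verify_credential("camera-admin-passphrase", salt, digest)
assert not verify_credential("wrong-guess", salt, digest)
```

Because the hash is one-way and salted, a company holding only `salt` and `digest` can still authenticate users without ever storing the password itself.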
Under the terms of its settlement with the FTC, TRENDnet is prohibited from misrepresenting the security of its cameras or the security, privacy, confidentiality, or integrity of the information that its cameras or devices transmit. The company must also establish a comprehensive security program and notify customers about security issues with the cameras and must provide a software update to customers to address security issues.
“The Internet of Things holds great promise for innovative consumer products and services,” FTC Chairwoman Edith Ramirez said. “But consumer privacy and security must remain a priority as companies develop more devices that connect to the Internet.”
The FTC’s authority to regulate and penalize companies that the agency claims do not protect consumers with sufficient data security is being challenged in federal court in New Jersey by the Wyndham Hotel Group. Wyndham has argued, among other things, that the FTC has not published any formal rules on data security and therefore cannot penalize companies that it deems have not protected consumer information. That case is pending.
This is the first time the FTC has brought an enforcement action involving the “Internet of Things,” but the FTC has already signaled it will be carefully watching how the Internet of Things develops. In particular, the FTC will be hosting a workshop in November to explore these new technologies. The agency previously sought comment from interested stakeholders on the Internet of Things – including the privacy and data security implications of interconnected devices. We expect that the FTC will continue to explore these issues, with a particular emphasis on how these devices collect and share information, particularly sensitive and personal information, such as health information.
The Federal Trade Commission recently filed another complaint against a company for alleged data security lapses. As readers of this blog know, the FTC has initiated numerous lawsuits against companies in various industries for data security and privacy violations, although it is facing a backlash from Wyndham and large industry organizations for allegedly lacking the appropriate authority to set data security standards in this way.
The FTC’s latest target is LabMD, an Atlanta-based cancer detection laboratory that performs tests on samples obtained from physicians around the country. According to an FTC press release, the FTC’s complaint (which is being withheld while the FTC and LabMD resolve confidentiality issues) alleges that LabMD failed to reasonably protect the security of the personal data (including medical information) of approximately 10,000 consumers, in two separate incidents.
Specifically, according to the FTC, LabMD billing information for over 9,000 consumers was found on a peer-to-peer (P2P) file-sharing network. The information included a spreadsheet containing insurance billing information with Social Security numbers, dates of birth, health insurance provider information, and standardized medical treatment codes.
In the second incident, the Sacramento, California Police Department found LabMD documents in the possession of identity thieves. The documents included names, Social Security numbers, and some bank account information. The FTC states that some of these Social Security numbers were being used by multiple individuals, indicating likely identity theft.
The FTC’s complaint alleges that LabMD did not implement or maintain a comprehensive data security program to protect individuals’ information, that it did not adequately train employees on basic security practices, and that it did not use readily available measures to prevent and detect unauthorized access to personal information, among other alleged failures.
The complaint includes a proposed order that would require LabMD to implement a comprehensive information security program, with biennial evaluations of that program by an independent certified security professional for 20 years. LabMD would further be required to notify any consumers whose information it has reason to believe was, or could have been, accessible to unauthorized persons, as well as those consumers’ health insurance companies.
LabMD has issued a statement challenging the FTC’s authority to regulate data security, and stated that it was the victim of Internet “trolls” who presumably stole the information. This latest complaint is yet another sign that the FTC continues to monitor companies’ data security practices, particularly respecting health, financial, and children’s information. Interestingly, the LabMD data breaches were not huge – only 10,000 consumers were affected. But the breach of, and potential unauthorized access to, sensitive health information and Social Security numbers tends to attract the FTC’s attention.
While industry awaits the district court’s decision on Wyndham’s motion to dismiss based on the FTC’s alleged lack of authority to set data security standards, companies should review and document their data security practices, particularly when it comes to sensitive personal information. Of course, in addition to the FTC, some states, such as Massachusetts, have their own data security standards, and most states require reporting of data breaches affecting personal information.
Manufacturers and marketers know that the more consumer data they have, the more they can tailor and direct their advertising, their products, and their product placement. This helps them to maximize sales and minimize costs. Thanks to the combination of cheap data storage and ubiquitous data capturers (e.g., smart phones, credit cards, the Web), the amount of consumer data out there to mine is astounding. Hence the recently-popularized term, “Big Data.”
But the misuse of data could result in government enforcement actions and, more importantly, serious privacy violations that can affect everyone.
Some of the practical challenges and concerns flowing from the use of big data were addressed recently by FTC Commissioner Julie Brill at the 23rd Computers, Freedom and Privacy conference on June 26. Issues raised include noncompliance with the Fair Credit Reporting Act and consumer privacy matters such as transparency, notice and choice, and deidentification (scrubbing consumer data of personal identifiers).
The FCRA: Those whose business includes data collection or dissemination should determine whether their practices fall within the boundaries of the FCRA. As Brill pointed out, “entities collecting information across multiple sources and providing it to those making employment, credit, insurance and housing decisions must do so in a manner that ensures the information is as accurate as possible and used for appropriate purposes.” If Brill’s comments are any indication of enforcement actions to come, businesses should be aware that the FTC is on the lookout for big data enterprises that don’t adhere to FCRA requirements.
Consumer Privacy: Brill gave some credit to big data giant Acxiom for its recent announcement that it plans to allow consumers to see what information the company holds about them, but she noted that this access is of limited use when consumers have no way of knowing who the data brokers are or how their information is being used. Brill also highlighted a questionable use of consumer data by national retailer Target: the somewhat funny yet disturbing news story about Target identifying a teen’s pregnancy from her purchase history. It is a classic example of why consumers ought to have notice of what data is being collected about them and how that information is being used.
Consumers also need to have, as Brill suggested, the opportunity to correct information about themselves. This makes sense. Data collection is imperfect: different individuals’ information may be inaccurately combined, someone’s information may have been hacked, someone could be the victim of cyber-bullying, and other mishaps and errors can occur. Consumers should be able to review and correct information for errors. Finally, Brill highlighted concerns that current efforts to scrub consumer data may be ineffective, as companies get better at combining disparate data points to accurately re-identify individuals. “Scrubbed” data in the wrong hands could be as harmful as a direct security breach.
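The re-identification concern is easy to see in miniature. Even after names and account numbers are removed, a handful of retained attributes (ZIP code, birth year, gender) can single out one person. The sketch below uses small, entirely hypothetical records to illustrate the general point:

```python
from collections import Counter

# Hypothetical "scrubbed" records: names and SSNs removed, but
# quasi-identifiers (ZIP code, birth year, gender) retained.
records = [
    ("10001", 1985, "F"), ("10001", 1985, "F"),
    ("10001", 1990, "M"), ("60614", 1972, "F"),
]

counts = Counter(records)
# A profile appearing exactly once corresponds to a single individual,
# who can then be re-identified by joining against any other dataset
# (voter rolls, public records) that carries the same attributes.
unique = [profile for profile, n in counts.items() if n == 1]
print(f"{len(unique)} of {len(counts)} distinct profiles match exactly one person")
```

In this toy dataset, two of the three distinct profiles point to exactly one individual, despite the absence of any direct identifier, which is precisely why “scrubbing” alone may fall short.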
Brill encouraged companies to follow the “privacy by design” recommendations issued by the FTC in order to build more protections into their products and services. She further emphasized her “Reclaim Your Name” initiative, which aims to promote consumer knowledge of, and access to, the data collected about them. Companies that are in the business of data collection, mining and analytics should take note of the FTC’s efforts to empower the consumer against the overuse or misuse of consumer data. If you want to stay on the good side of the FTC – and on the good side of the informed consumer – work with the consumer, and provide meaningful notice, choice and consent.
Beta testing is underway for Google Glass, a new technology that provides the functionality of a smartphone in a headset worn like glasses. Much like a smartphone, the Glass headset is able to exchange messages with other mobile devices, take pictures, record videos, and access search engines to respond to user queries. But unlike a smartphone, the device’s optical head-mounted display, voice recognition, and front-facing camera give users hands-free access to its features, including the ability to capture photographs and video recordings of the world in front of them.
For now, Glass is only available to developers and a small group of test users known as Google Explorers. The device is expected to go on sale to the public in 2014. In the meantime, public speculation swells, and the blogosphere is full of conflicting reports about what the device can and cannot do. Some suggest that the device will utilize facial recognition and eye-tracking software to show icons and statistics above people whom the user recognizes. A more common concern is that the device will be able to photograph and record what the user sees and then share that data with third parties without permission from the user or those whose likenesses are being captured.
Because of this lack of clarity, lawmakers around the world are urging Google to affirmatively address the myriad of privacy concerns raised by this new technology. Last month, an international group of privacy regulators – including representatives from Australia, Canada, Israel, Mexico, New Zealand, Switzerland, and a European Commission panel – signed off on a letter to Google’s CEO Larry Page asking for more information regarding the company’s plans to ensure compliance with their data protection laws.
Here in the United States, the House Bipartisan Privacy Caucus issued a similar letter of its own. In addition to a variety of questions regarding the device’s capabilities, the letters reference some of Google’s recent data privacy mishaps and ask whether Google intends to take proactive steps to ensure the protection of user and nonuser privacy.
Google’s Vice President of Public Policy and Governmental Relations (and former New York Congresswoman) Susan Molinari issued a formal response to the House Bipartisan Privacy Caucus. According to Molinari, Google “recognize[s] that new technology is going to bring up new types of questions, so [they] have been thinking carefully about how [they] design Glass from its inception.”
To address concerns about the picture and video capabilities, Molinari highlighted several features designed to “give users control” and “help other people understand what Glass users are doing.” For example, specific user commands are required to search the Internet or find directions, and the user must either press a button on the arm of the Glass or say “Take a photo” or “Record a video” in order to access those features.
Over the past decade the Federal Trade Commission has brought cybersecurity enforcement actions against various private companies, imposing tens of millions of dollars in monetary penalties and requiring companies to maintain more stringent data-security practices. No company has ever challenged the FTC’s authority to regulate cybersecurity in this way in court – until now. On June 17, 2013, a federal court will finally get a chance to weigh in on whether the scope of the FTC’s regulatory jurisdiction is so broad as to include setting standards for cybersecurity.
In FTC v. Wyndham Worldwide Corporation, et al., the FTC launched a civil action against the parent company of the Wyndham hotels and three of its subsidiaries for data security failures that led to three major data breaches in less than two years. The Commission’s complaint charges that Wyndham’s security practices were unfair and deceptive in violation of the FTC Act.
Unlike many other data-security FTC enforcement actions, in which the defendant has chosen to cut its losses and settle out of court, Wyndham has decided to stand and fight with a motion to dismiss. Judge Esther Salas of the U.S. District Court for the District of New Jersey is expected to rule on Wyndham’s motion on June 17.
With respect to the FTC’s unfairness claim, Wyndham’s motion asserts that the FTC is attempting to circumvent the legislative process by acting as if “it has the statutory authority to do that which Congress has refused: establish data-security standards for the private sector and enforce those standards in federal court.”
According to Wyndham, “on multiple occasions in the 1990s and early 2000s the FTC publicly acknowledged that it lacked authority to prescribe substantive data-security standards under the [FTC Act]. For that very reason, the FTC has repeatedly asked Congress to enact legislation giving it such authority.” Further, Wyndham highlights the Senate’s failure to pass the Cybersecurity Act of 2012, which sought to address the need for specific data-security standards for the private sector, and President Obama’s February 2013 Executive Order on cybersecurity that was issued in response to the Congressional stalemate.
On its face, Wyndham’s motion to dismiss seems quite strong. However, the facts that the FTC is alleging do not cut in Wyndham’s favor. The Commission’s complaint alleges that Wyndham’s failure to “adequately limit access between and among the Wyndham-branded hotels’ property management systems, [Wyndham] Hotels and Resorts’ corporate network, and the Internet” allowed intruders to use weak access points (e.g., a single hotel’s local computer network) to hack into the entire Wyndham Hotels and Resorts’ corporate network. From there, the intruders were able to gain access to the payment management systems of scores of Wyndham-branded hotels.
According to the FTC, Wyndham failed to remedy known security vulnerabilities, employ reasonable measures to detect unauthorized access, and follow proper incident response procedures following the first breach in April 2008. Thus, the corporation remained vulnerable to attacks that took place the following year. All told, the intruders compromised over 600,000 consumer payment card accounts, exported hundreds of thousands of payment card account numbers to a domain registered in Russia, and used them to make over $10.6 million in fraudulent purchases.
Unfortunately – as Wyndham notes in its motion to dismiss – hacking has become an endemic problem. There has been no shortage of stories about major cyber-attacks on private companies and governmental entities alike: from Google and Microsoft to NASA and the FBI. And the FTC has not been shy about bringing enforcement actions against private companies with inadequate security measures.
If Wyndham prevails, the case could usher in a major reduction in FTC enforcement efforts. However, if the court sides with the FTC, the commission will be further empowered to regulate data security practices. With such high stakes on both sides, any decision is likely to result in an appeal. In the meantime, companies in various industry sectors that maintain personal consumer information are awaiting next week’s decision.
The Federal Trade Commission has made it quite clear that it is serious about advising mobile app developers that the rules of the road will be changing very soon. Since 2011, the Commission has been working to update the rules governing the collection of children’s personal information by mobile apps. The relevant law is the Children’s Online Privacy Protection Act (COPPA), and the rules are set to change in just over a month, on July 1.
As part of its effort to encourage compliance, the Commission recently issued more than 90 warning letters to app developers, both foreign and domestic, whose online services appear to collect data from children under the age of 13. The letters alert the recipients about the upcoming COPPA rule change and encourage them to review their apps, policies, and procedures for compliance. According to the letters, the Commission did not evaluate whether the recipients’ apps or company practices are in compliance. Therefore, we view this move as a public warning to all app developers that may be collecting personal information from children.
Until now, COPPA, which was originally enacted in 1998, defined “personal information” to include only basics such as a child’s name, contact information, and Social Security number. Over the past decade, it has been rendered antiquated by the development of mobile apps and other technological advances affecting data collection. Unfortunately but understandably, COPPA’s original incarnation failed to account for the proclivities of today’s children, who – reared in the age of smartphones, Facebook, and Google-everything – routinely use mobile apps to share their lives with their friends, their family, and the world.
The FTC has expressed major concerns that, unbeknownst to many users, mobile app developers also collect and disseminate their users’ persistent identifiers (things such as cookies, IP addresses, and mobile device IDs). This information, which can recognize users over time and across different websites and online services, is often used by developers and third parties to market products to children based on each child’s specific online behavior. Come July 1, this practice will be illegal.
Under the revised rule, the definition of “personal information” has been expanded to include persistent identifiers, photos and videos with a child’s image, and recordings of a child’s voice. Additionally, developers of apps directed to children under 13 – or that knowingly collect personal information from children under 13 – will be required to post accurate privacy policies, provide notice, and obtain verifiable parental consent before collecting, using, or disclosing such information. However, there are some exceptions for developers that use the information only to support internal operations (e.g., analyzing the app’s functionality or authenticating app users).
Protecting children’s privacy continues to be one of the Commission’s major initiatives, and the FTC has levied some hefty penalties for COPPA violations over the past year. That said, the Commission has indicated that it may be more lenient in cases where a small business has violated the rule despite well-intentioned attempts to comply. As we mentioned back in February, developers should beware of increased data privacy enforcement on the state level, as well. We encourage all mobile app developers to be proactive and review/update their policies to ensure compliance and avoid costly penalties.
On April 3, 2013, the Federal Trade Commission issued a press release that marks yet another step in its continuing trend of actions involving data brokers and data providers. As we have noted in earlier blog posts, the agency is making a concerted effort on a number of fronts to enforce the laws that protect consumer data and privacy.
The FTC’s current action involves a letter that it sent to a number of data brokerage companies that provide tenants’ rental histories to landlords. The letter is simply a notification to the companies that they may be considered credit reporting agencies under the Fair Credit Reporting Act (FCRA) and that they thus may be required to ensure that their websites and practices comply with that law.
The FTC letter also listed some of the obligations of credit reporting agencies to take reasonable steps to ensure the fairness, accuracy, and confidentiality of their reports — such as (1) ensuring that landlords are actually using the report for tenant screening purposes and not as a pretext, (2) ensuring the maximum possible accuracy of the information in the tenant reports, (3) if the company is a nationwide provider, providing consumers with a free copy of their report annually, and (4) ensuring that all obligations are met concerning notifications to landlords (e.g., letting consumers know about a denial based on a tenant report, the right to dispute information in the report, and the right to get a free copy of the report).
The FTC letter specifically noted that the agency has not evaluated whether the company receiving the letter is in compliance with the FCRA but that “we encourage you to review your websites and your policies and procedures for compliance.”
We have discussed FTC actions against data brokers before. In March, we discussed the FTC’s announcement of a settlement with Compete, Inc., a web analytics company. Compete sells reports on consumer browsing behavior to clients looking to drive more traffic to their websites and increase sales. Compete obtained the information by getting consumers to install the company’s web-tracking software on their computers. The FTC alleged that the company’s business practices were unfair and deceptive because the company did not sufficiently describe the types of information it was collecting from its users.
We are confident that the companies that received the letter regarding tenant information are reviewing their websites and policies, as encouraged by the FTC. However, what really intrigues us is the motivation behind the FTC sending the letters to the companies.
Of course, part of that motivation is to help ensure that the companies follow rules for privacy protection. Nonetheless, it is also interesting to note a significant consequence under the FCRA – namely, individuals are permitted to seek punitive damages for deliberate violations of the FCRA. Thus, the letter arguably puts the companies on notice to become compliant immediately, since future violations may be considered deliberate breaches that warrant punitive damages.
The increasing difficulties faced by internet providers and data gatherers in the international realm have yet again come to the fore. Privacy regulators in France, Germany, Spain, the Netherlands, the United Kingdom and Italy have banded together to investigate whether to fine Google for what they perceive to be violations of European Union privacy laws.
The background is that in March 2012, Google replaced its disparate privacy policies applicable to its various products (such as Gmail and YouTube) with a single policy that applied to all of its services.
However, as part of a report issued in October 2012, the EU’s Article 29 Data Protection Working Party then declared that Google’s unified policy did not comply with EU data protection laws. The EU’s primary, but not only, quibble with Google’s new policy involved the sharing of personal data across multiple Google services and platforms. At that time, the president of the French regulatory body, the CNIL, indicated that litigation would be initiated if Google did not implement the Working Party’s recommendations within three to four months.
As a result, Google now faces the time and costs of substantial regulatory oversight and investigation, as well as potential fines, from multiple national privacy protection watchdogs. In fairness, the EU privacy regulators have tended to be rather expansive in their interpretation of what the law requires. This is unfair to Google and to other companies that comply with what they believe to be the letter and spirit of the law, only to have regulators reinterpret the law to move the goal posts. But this is typical in the EU regulatory realm.
Google’s predicament sends a stern warning to all internet providers that gather personal data. Any provider’s natural inclination is to focus on complying with the privacy rules applicable in the country where the provider is located. But the internet is borderless, subjecting providers to multiple laws in multiple jurisdictions. This creates the need for each provider to carefully analyze its privacy policies to ensure as best as possible that it complies with the rules applicable across the globe. EU regulators and others are no longer content to allow the United States to set the guidelines for privacy and other rights, creating new challenges for privacy compliance in the United States and abroad.
Earlier this month, the Federal Trade Commission released a staff report outlining key issues facing consumers and companies as they adopt mobile payment services, entitled “Paper, Plastic . . . or Mobile? An FTC Workshop on Mobile Payments.” The report is based on a workshop held by the FTC in 2012 to examine the mobile payment industry.
Consumer use of mobile payment services continues to grow quickly. Mobile payment systems have the potential to benefit both companies and consumers. However, they raise many issues regarding fraud, privacy, and security, and the FTC is looking to the industry to take the lead in establishing sound policies.
The FTC encourages companies that use mobile payment systems to develop clear policies on the resolution of disputes regarding unauthorized or fraudulent charges. Consumers fund their mobile purchases from a variety of sources (e.g., credit cards, bank accounts, mobile phone bills), and under current regulations each method of funding has a different process for consumers to dispute an unauthorized or fraudulent charge. The FTC wants to create a clearer, streamlined process for consumers if an issue were to arise regarding a disputed charge. The FTC is planning to hold a separate roundtable on this issue in May.
The report highlights the problems associated with “cramming,” which involves placing unauthorized charges on a consumer’s phone bill. The FTC suggests that mobile carriers should perform some due diligence on companies from which they accept charges.
The report also discusses the idea of “privacy by design,” which involves strong privacy policies and transparency for consumers from inception of a company’s offerings. Consumers understand that they will need to provide some information to access a company’s services, but consumers may want to control how that information is stored and shared. The FTC and the industry realize that mobile payment systems can be an efficient, favored payment method. However, companies offering mobile payments need to be clear to consumers about how their data is being collected, maintained and used. Privacy issues are of paramount concern when using mobile payment systems because of the enormous amount of data available on smartphones.
The report also notes the potential privacy issues that can occur in the mobile payment process. Because mobile payment providers hold both the financial information and the contact information of the payer, any lapse on their part could result in a serious privacy breach. The report suggests that companies consider privacy throughout the development process, be transparent regarding data practices, and give consumers options regarding how their information is collected.
The report also encourages the industry to adopt measures to ensure that the entire mobile payment process is secure since financial information could potentially be disclosed. The FTC notes that there is technology available to make the protection of payment information more secure and suggests that financial information should be encrypted at all points in the transaction.
Companies should take note of the FTC’s report and adjust their practices accordingly. The FTC has put companies on notice about its expectations for mobile payments, and it would not surprise us to see enforcement actions in this area in the future. In particular, companies should make clear their policies for explaining charges and for how charges can be authorized. The more support a company has to show that a charge was justified, the easier that charge will be to defend. This kind of specificity may also help dissuade authorities from bringing charges in the first place. When offering mobile payment services, best practices such as opt-in screens requiring a click or a password before a charge is made, and ensuring that the network is secure, may save an organization from being on the receiving end of an enforcement action.