
Press Release: DataEthics4All’s Call for Comments on FTC’s Regulation Rule on Commercial Surveillance and Data Security


Federal Trade Commission Commercial Surveillance and Data Security Public Forum

FTC’s Regulation Rule on Commercial Surveillance and Data Security: DataEthics4All’s Call for Comments from our Community

The Federal Trade Commission is hosting a public forum regarding its Advance Notice of Proposed Rulemaking (ANPR) on commercial surveillance and data security practices that harm consumers and competition.

SUMMARY: The Federal Trade Commission (“FTC”) is publishing this advance notice of proposed rulemaking (“ANPR”) to request public comment on the prevalence of commercial surveillance and data security practices that harm consumers. Specifically, the Commission invites comment on whether it should implement new trade regulation rules or other regulatory alternatives concerning the ways in which companies (1) collect, aggregate, protect, use, analyze, and retain consumer data, as well as (2) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive.

DATES: Comments must be received on or before [60 DAYS AFTER DATE OF PUBLICATION IN THE FEDERAL REGISTER]. The Public Forum will be held virtually on Thursday, September 8, 2022, from 2 p.m. until 7:30 p.m.

ADDRESSES: Interested parties may file a comment online or on paper by following the instructions in the Comment Submissions part of the SUPPLEMENTARY INFORMATION section below. Write “Commercial Surveillance ANPR, R111004” on your comment, and file your comment online at https://www.regulations.gov. If you prefer to file your comment on paper, mail your comment to the following address: Federal Trade Commission, Office of the Secretary, 600 Pennsylvania Avenue, NW, Suite CC-5610 (Annex B), Washington, DC 20580.

FOR FURTHER INFORMATION CONTACT: James Trilling, 202-326-3497; Peder Magee, 202-326-3538; Olivier Sylvain, 202-326-3046; or commercialsurveillancerm@ftc.gov.

Overview  

Whether they know it or not, most Americans today surrender their personal information to engage in the most basic aspects of modern life. When they buy groceries, do homework, or apply for car insurance, for example, consumers today likely give a wide range of personal information about themselves to companies, including their movements, prayers, friends, menstrual cycles, web-browsing, and faces, among other basic aspects of their lives.

Companies, meanwhile, develop and market products and services to collect and monetize this data. An elaborate and lucrative market for the collection, retention, aggregation, analysis, and onward disclosure of consumer data incentivizes many of the services and products on which people have come to rely. Businesses reportedly use this information to target services—namely, to set prices, curate news feeds, serve advertisements, and conduct research on people’s behavior, among other things. While, in theory, these personalization practices have the potential to benefit consumers, reports note that they have facilitated consumer harms that can be difficult if not impossible for any one person to avoid.

Some companies, moreover, reportedly claim to collect consumer data for one stated purpose but then also use it for other purposes. Many such firms, for example, sell or otherwise monetize such information or compilations of it in their dealings with advertisers, data brokers, and other third parties. These practices also appear to exist outside of the retail consumer setting. Some employers, for example, reportedly collect an assortment of worker data to evaluate productivity, among other reasons—a practice that has become far more pervasive since the onset of the COVID-19 pandemic.

Many companies engage in these practices pursuant to the ostensible consent that they obtain from their consumers. But, as networked devices and online services become essential to navigating daily life, consumers may have little choice but to accept the terms that firms offer. Reports suggest that consumers have become resigned to the ways in which companies collect and monetize their information, largely because consumers have little to no actual control over what happens to their information once companies collect it.

In any event, the permissions that consumers give may not always be meaningful or informed. Studies have shown that most people do not generally understand the market for consumer data that operates beyond their monitors and displays. Most consumers, for example, know little about the data brokers and third parties who collect and trade consumer data or build consumer profiles that can expose intimate details about their lives and, in the wrong hands, could expose unsuspecting people to future harm. Many privacy notices that acknowledge such risks are reportedly not readable to the average consumer. Many consumers do not have the time to review lengthy privacy notices for each of their devices, applications, websites, or services, let alone the periodic updates to them. If consumers do not have meaningful access to this information, they cannot make informed decisions about the costs and benefits of using different services.

This information asymmetry between companies and consumers runs even deeper. Companies can use the information that they collect to direct consumers’ online experiences in ways that are rarely apparent—and in ways that go well beyond merely providing the products or services for which consumers believe they sign up. The Commission’s enforcement actions have targeted several pernicious dark pattern practices, including burying privacy settings behind multiple layers of the user interface and making misleading representations to “trick or trap” consumers into providing personal information. In other instances, firms may misrepresent or fail to communicate clearly how they use and protect people’s data. Given the reported scale and pervasiveness of such practices, individual consumer consent may be irrelevant.

The material harms of these commercial surveillance practices may be substantial, moreover, given that they may increase the risks of cyberattack by hackers, data thieves, and other bad actors. Companies’ lax data security practices may impose enormous financial and human costs. Fraud and identity theft cost both businesses and consumers billions of dollars, and consumer complaints are on the rise. For some kinds of fraud, consumers have historically spent an average of 60 hours per victim trying to resolve the issue. Even the nation’s critical infrastructure is at stake, as evidenced by the recent attacks on the largest fuel pipeline, meatpacking plants, and water treatment facilities in the United States.

Companies’ collection and use of data have significant consequences for consumers’ wallets, safety, and mental health. Sophisticated digital advertising systems reportedly automate the targeting of fraudulent products and services to the most vulnerable consumers. Stalking apps continue to endanger people. Children and teenagers remain vulnerable to cyberbullying, cyberstalking, and the distribution of child sexual abuse material. Peer-reviewed research has linked social media use with depression, anxiety, eating disorders, and suicidal ideation among kids and teens.

Finally, companies’ growing reliance on automated systems is creating new forms and mechanisms for discrimination based on statutorily protected categories, including in critical areas such as housing, employment, and healthcare. For example, some employers’ automated systems have reportedly learned to prefer men over women. Meanwhile, a recent investigation suggested that lenders’ use of educational attainment in credit underwriting might disadvantage students who attended historically Black colleges and universities. And the Department of Justice recently settled its first case challenging algorithmic discrimination under the Fair Housing Act for a social media advertising delivery system that unlawfully discriminated based on protected categories. Critically, these kinds of disparate outcomes may arise even when automated systems consider only unprotected consumer traits.

The Commission is issuing this ANPR pursuant to Section 18 of the Federal Trade Commission Act (“FTC Act”) and the Commission’s Rules of Practice because recent Commission actions, news reporting, and public research suggest that harmful commercial surveillance and lax data security practices may be prevalent and increasingly unavoidable. These developments suggest that trade regulation rules reflecting these current realities may be needed to ensure Americans are protected from unfair or deceptive acts or practices.

New rules could also foster a greater sense of predictability for companies and consumers and minimize the uncertainty that case-by-case enforcement may engender.

Countries around the world and states across the nation have been alert to these concerns. Many accordingly have enacted laws and regulations that impose restrictions on companies’ collection, use, analysis, retention, transfer, sharing, and sale or other monetization of consumer data.

In recognition of the complexity and opacity of commercial surveillance practices today, such laws have reduced the emphasis on providing notice and obtaining consent and have instead stressed additional privacy “defaults” as well as increased accountability for businesses and restrictions on certain practices.

For example, European Union (“EU”) member countries enforce the EU’s General Data Protection Regulation (“GDPR”), which, among other things, limits the processing of personal data to six lawful bases and provides consumers with certain rights to access, delete, correct, and port such data. Canada’s Personal Information Protection and Electronic Documents Act and Brazil’s General Law for the Protection of Personal Data contain some similar rights. Laws in California, Virginia, Colorado, Utah, and Connecticut, moreover, include some comparable rights, and numerous state legislatures are considering similar laws. Alabama, Colorado, and Illinois, meanwhile, have enacted laws related to the development and use of artificial intelligence. Other states, including Illinois, Texas, and Washington, have enacted laws governing the use of biometric data. All fifty U.S. states have laws that require businesses to notify consumers of certain breaches of consumers’ data. And numerous states require businesses to take reasonable steps to secure consumers’ data.

Through this ANPR, the Commission is beginning to consider the potential need for rules and requirements regarding commercial surveillance and lax data security practices.

Section 18 of the FTC Act authorizes the Commission to promulgate, modify, and repeal trade regulation rules that define with specificity acts or practices that are unfair or deceptive in or affecting commerce within the meaning of Section 5(a)(1) of the FTC Act. Through this ANPR, the Commission aims to generate a public record about prevalent commercial surveillance practices or lax data security practices that are unfair or deceptive, as well as about efficient, effective, and adaptive regulatory responses. These comments will help to sharpen the Commission’s enforcement work and may inform reform by Congress or other policymakers, even if the Commission does not ultimately promulgate new trade regulation rules.

The term “data security” in this ANPR refers to breach risk mitigation, data management and retention, data minimization, and breach notification and disclosure practices.

For the purposes of this ANPR, “commercial surveillance” refers to the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information. These data include both information that consumers actively provide—say, when they affirmatively register for a service or make a purchase—as well as personal identifiers and other information that companies collect, for example, when a consumer casually browses the web or opens an app. This latter category is far broader than the first.

The term “consumer” as used in this ANPR includes businesses and workers, not just individuals who buy or exchange data for retail goods and services. This approach is consistent with the Commission’s longstanding practice of bringing enforcement actions against firms that harm companies as well as workers of all kinds. The FTC has frequently used Section 5 of the FTC Act to protect small businesses or individuals in contexts involving their employment or independent contractor status.

     

Shilpi Agarwal, Founder and CEO of the DataEthics4All Foundation, will be testifying on September 8th, and DataEthics4All invites its members and the public to submit their views.