The Scoring of America: Op-Ed for IAPP & FTC Alternate Scoring Conference | World Privacy Forum


Pam Dixon, Executive Director, World Privacy Forum

This op-ed was originally published Wednesday, March 19, 2014, in IAPP for the FTC Alternate Scoring Conference. WPF will publish a major report on scoring the week of March 31.

To score is human. Ranking people by grades and other performance numbers is as old as time itself. Consumer scores, numbers given to people to describe their characteristics, habits, or predilections, are a modern-day numeric shorthand that ranks, separates, sifts, and otherwise categorizes people, and predicts their potential future actions.

In our modern sea of data, examining all relevant information before making a decision is no longer feasible, so we use shortcuts. Consumer scores built with predictive analytics and fed by large datasets are the modern-day shortcuts to understanding individual consumer behavior. That is why new and unregulated consumer scores abound. They are used widely today to predict consumers' behavior, spending, health, fraud, profitability, and much more. These scores rely on petabytes of information coming from newly available data streams, and some old ones.

Not all scores merit concern. Some, especially those that do not identify any individual consumer, are unobjectionable. Others, though, do merit concern and attention. The scores I am most concerned about are those that incorporate as underlying factors attributes such as health and medical information, sexual orientation, race, national origin, religion, gender, and other information widely considered protected or sensitive. I am also concerned about scores that target and predict the behavior of vulnerable groups.

Protecting vulnerable people from any deleterious impacts of scoring is a necessity, not an option. People who suffer from a terminal or significant health or medical condition, who are financially vulnerable, who are elderly or minors, or who are unemployed all require the strongest protections.
If a score identifies or targets these populations, the score factors, score models, and score uses need to be unfailingly beneficial and to have meaningful transparency, oversight, and consumer controls.

A major issue with scores today is that many of them are secret, with hundreds or even thousands of factors that are also kept secret. Secret scores can hide discrimination, unfairness, and bias, both in the score itself and in its use. Trade secrets have a justifiable place, but secrecy that hides racism, denies due process, undermines privacy rights, or prevents justice does not belong anywhere.

Limits on secrecy for some consumer scores may offer a middle ground. Knowing the elements, but not the weights, of a scoring system provides a partial degree of transparency and reassurance. Knowing that there is a scoring system and how and when it is used helps. Knowing the source and reliability of the information and the scoring model used to make a score helps. Being able to challenge a score and correct the data on which it is based helps. Knowing that some types of information will not be used for scoring helps. Knowing that data collected for one purpose will not be used for another, or in violation of law, helps. Knowing that the person or entity running the scoring system is accountable in a meaningful way helps.

For various reasons, laws governing credit scores do not typically extend protection to the new consumer scores. We must build a useful continuum of which scores are unobjectionable and which scores require consumer protections. Among the scores I see as most problematic are aggregate credit scores, any unregulated consumer risk scores used for or impacting eligibility, and any score that includes as a factor either information that would otherwise be covered under the Equal Credit Opportunity Act or any health information.
Determining where new consumer protections are needed can only be done with meaningful transparency and cooperation from industry, and any scoring best practices need to be developed with full consumer participation in the dialogue. The credit score was kept secret from consumers for decades. We can all do better this time around and ensure that new consumer scores are themselves scored for fairness, accuracy, and transparency.

Posted March 24, 2014 in Data Brokers, Op-Ed, Predictive Analytics, Privacy Ethics, Report: The Scoring of America, Statistical Parity, The Scoring of America
