The Scoreboards Where You Can't See Your Score - NYTimes.com
www.nytimes.com/...e-you-cant-see-your-score.html
privacy data mining corporations profiling reputation transparency ranking
- The characters in Gary Shteyngart’s novel “Super Sad True Love Story” inhabit a continuously surveilled and scored society.
- Consider the protagonist, Lenny Abramov, age 39. A digital dossier about him accumulates his every health condition (high cholesterol, depression), liability (mortgage: $560,330), purchase (“bound, printed, nonstreaming media artifact”), tendency (“heterosexual, nonathletic, nonautomotive, nonreligious”) and probability (“life span estimated at 83”). And that profile is available for perusal by employers, friends and even strangers in bars.
- Even before the appearance of these books, a report called “The Scoring of America” by the World Privacy Forum showed how analytics companies now offer categorization services like “churn scores,” which aim to predict which customers are likely to forsake their mobile phone carrier or cable TV provider for another company; “job security scores,” which factor a person’s risk of unemployment into calculations of his or her ability to pay back a loan; “charitable donor scores,” which foundations use to identify the households likeliest to make large donations; and “frailty scores,” which are typically used to predict the risk of medical complications and death in elderly patients who have surgery.
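To make the idea of a “churn score” concrete, here is a minimal sketch of how such a score could be computed. Everything in it is an assumption for illustration: the feature names, the weights, and the logistic form are invented, not taken from any real scoring product the article or the World Privacy Forum report describes.

```python
import math

# Hypothetical model: feature names and weights are invented for
# illustration, not drawn from any actual commercial scoring system.
CHURN_WEIGHTS = {
    "support_calls_last_90d": 0.40,   # frequent complaints raise churn risk
    "months_as_customer": -0.03,      # long tenure lowers it
    "competitor_site_visits": 0.25,   # browsing rival carriers raises it
}
BIAS = -1.5

def churn_score(profile: dict) -> float:
    """Return a 0-1 'likelihood to defect' via a simple logistic model."""
    z = BIAS + sum(w * profile.get(f, 0.0) for f, w in CHURN_WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

# A made-up customer profile (the fields echo nothing real):
customer = {
    "support_calls_last_90d": 4,
    "months_as_customer": 36,
    "competitor_site_visits": 2,
}
print(round(churn_score(customer), 3))  # → 0.382
```

The article’s point is that a consumer never sees numbers like this one, even though a carrier might use them to decide who gets a retention discount and who does not.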
- In two nonfiction books, scheduled to be published in January, technology experts examine similar consumer-ranking techniques already in widespread use.
- While a federal law called the Fair Credit Reporting Act requires consumer reporting agencies to provide individuals with copies of their credit reports on request, many other companies are free to keep their proprietary consumer scores to themselves.
- Befitting the founder of a firm that markets reputation management, Mr. Fertik contends that individuals have some power to influence commercial scoring systems.
- “This will happen whether or not you want to participate, and these scores will be used by others to make major decisions about your life, such as whether to hire, insure, or even date you,”
- “Important corporate actors have unprecedented knowledge of the minutiae of our daily lives,” he writes in “The Black Box Society: The Secret Algorithms That Control Money and Information” (Harvard University Press), “while we know little to nothing about how they use this knowledge to influence important decisions that we — and they — make.”
- Data brokers amass dossiers with thousands of details about individual consumers, like age, religion, ethnicity, profession, mortgage size, social networks, estimated income and health concerns such as impotence and irritable bowel syndrome. Then analytics engines can compare patterns in those variables against computer forecasting models. Algorithms are used to assign consumers scores — and to recommend offering, or withholding, particular products, services or fees — based on predictions about their behavior.
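The dossier-to-decision pipeline described above can be sketched in a few lines. This is a toy under loudly stated assumptions: the dossier fields, the scoring formula, and the 600-point cutoff are all hypothetical, chosen only to show the shape of score-then-decide logic, not any actual broker’s model.

```python
# Hypothetical dossier; the numeric details below echo the novel's
# Lenny Abramov, but the fields and model are invented for illustration.
dossier = {
    "mortgage": 560_330,
    "estimated_income": 85_000,
    "job_security_score": 0.62,   # assumed broker-supplied sub-score
}

def creditworthiness(d: dict) -> float:
    """Toy credit-style score built from a few dossier variables."""
    base = 500.0
    base += 100 * d["job_security_score"]            # stable job helps
    base += min(d["estimated_income"] / 1_000, 150)  # income, capped
    base -= min(d["mortgage"] / 10_000, 60)          # debt load hurts
    return base

def decide(score: float) -> str:
    # The consumer sees neither the score nor the cutoff -- the
    # opacity the article is describing.
    return "offer premium card" if score >= 600 else "withhold offer; add fee"

score = creditworthiness(dossier)
print(round(score, 1), "->", decide(score))
```

Even in this toy, a consumer denied the offer would have no way to learn which variable tipped the result, which is exactly the transparency gap the Fair Credit Reporting Act addresses for credit reports but not for proprietary scores.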
- It’s a fictional forecast of a data-deterministic culture in which computer algorithms constantly analyze consumers’ profiles, issuing individuals numeric rankings that may benefit or hinder them.
- Think of this technique as reputation engine optimization. If an algorithm incorrectly pegs you as physically unfit, for instance, the book suggests that you can try to mitigate the wrong. You can buy a Fitbit fitness tracker, for instance, and upload the exercise data to a public profile — or even “snap that Fitbit to your dog” and “you’ll quickly be the fittest person in your town.”
- Professor Pasquale offers a more downbeat reading. Companies, he says, are using such a wide variety of numerical rating systems that it would be impossible for average people to significantly influence their scores.
- “Corporations depend on automated judgments that may be wrong, biased or destructive,” Professor Pasquale writes. “Faulty data, invalid assumptions and defective models can’t be corrected when they are hidden.”
- Moreover, trying to influence scoring systems could backfire. If a person attached a fitness device to a dog and tried to claim the resulting exercise log, he suggests, an algorithm might be able to tell the difference and issue that person a high score for propensity toward fraudulent activity.
- “People shouldn’t think they can outwit corporations with hundreds of millions of dollars,” Professor Pasquale said in a phone interview. Consumers would have more control, he argues, if Congress extended the right to see and correct credit reports to other kinds of rankings.