Wednesday, April 24, 2019

Insurance, privacy, surveillance, algorithms, and repugnance

The NY Times is on the case:

Insurers Want to Know How Many Steps You Took Today
The cutting edge of the insurance industry involves adjusting premiums and policies based on new forms of surveillance.
By Sarah Jeong

"Last year, the life insurance company John Hancock began to offer its customers the option to wear a fitness tracker — a wearable device that can collect information about how active you are, how many calories you burn, and how much you sleep. The idea is that your Fitbit or Apple Watch can tell whether or not you’re living the good, healthy life — and if you are, your insurance premium will go down.
...
"artificial intelligence is known to reproduce biases that aren’t explicitly coded into it. In the field of insurance, this turns into “proxy discrimination.” For example, an algorithm might (correctly) conclude that joining a Facebook group for a BRCA1 mutation is an indicator of high risk for a health insurance company. Even though actual genetic information — which is illegal to use — is never put into the system, the algorithmic black box ends up reproducing genetic discrimination.

"A ZIP code might become a proxy for race; a choice of wording in a résumé might become a proxy for gender; a credit card purchase history can become a proxy for pregnancy status. Legal oversight of insurance companies, which are typically regulated by states, mostly looks at discrimination deemed to be irrational: bias based on race, sex, poverty or genetics. It’s not so clear what can be done about rational indicators that are little but proxies for factors that would be illegal to consider.
...
"A. I. research should march on. But when it comes to insurance in particular, there are unanswered questions about the kind of biases that are acceptable. Discrimination based on genetics has already been deemed repugnant, even if it’s perfectly rational. Poverty might be a rational indicator of risk, but should society allow companies to penalize the poor? Perhaps for now, A.I.’s more dubious consumer applications are better left in a laboratory."

HT: Julio Elias
