From: Ethical perspectives on recommending digital technology for patients with mental illness
Area | Goal of automation | Negative consequence |
---|---|---|
Criminal justice | Predict involvement in violent crime | Automated predictions of future bad behavior or guilt by association, in high-crime areas (Robinson et al. 2014) |
Employment | Display job openings based on user profiles | Job opportunities withheld in line with traditional biases (Sweeney 2013; Savage 2016) |
Employment | Automate job applicant screening^a | Individuals flagged as potentially having a stigmatized or expensive disease based on an algorithm (Rosenblat et al. 2014) |
Employment | Employer-sponsored wellness programs include fitness trackers^a | Preferential treatment and promotions for those who participate (Rosenblat et al. 2014; Christovich 2016) |
Financial | Include health and lifestyle habits in non-traditional, credit-related scoring algorithms | Decreased credit or higher interest rates on credit cards for the sick (Dixon and Gellman 2014; Robinson et al. 2014) |
Higher education | Predict good candidates for higher education | Opportunities withheld in line with traditional biases (FTC 2016a) |
Insurance | Determine health status without physicals | Higher life insurance rates for those at higher risk (Batty et al. 2010; Robinson et al. 2014) |
Online commerce | Conditional (dynamic) pricing based on user profiles | Higher prices for those living in poor areas with less retail competition (Valentino-Devries et al. 2012; Acquisti and Varian 2005). Mac users shown more expensive goods than PC users (Mattioli 2012) |
Online commerce | Offer credit online based on user profiles | No credit offers from leading institutions to those with poor credit (Fertik 2013) |
Online information seeking | Provide news and information based on user profile | Reinforce prejudices and increase insularity (Pariser 2011) |