
Consumer Reports applauds New York Department of Financial Services’ guidance to insurers to protect consumers from algorithmic discrimination

Insurers must seek out less discriminatory alternatives to biased algorithms used to underwrite and price insurance

YONKERS, NY – Consumer Reports praised the New York Department of Financial Services for issuing guidance today outlining the steps it expects insurance companies to take to protect consumers from algorithmic discrimination in underwriting and pricing. New York’s action marks the first time a state has made explicit that insurers have an obligation to search for less discriminatory alternative models to mitigate the discriminatory impact of algorithms used to underwrite and price insurance for consumers.

The Department’s guidance makes clear that insurers should not use external consumer data and information sources and artificial intelligence systems unless they can establish through a comprehensive assessment that their underwriting or pricing guidelines are not unfairly or unlawfully discriminatory. If the insurer’s comprehensive assessment finds a disproportionate adverse effect, it must seek out a “less discriminatory alternative” variable or methodology that reasonably meets its legitimate business needs.
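To make that assessment concrete, here is a minimal sketch of one form a disproportionate adverse effect test could take, comparing approval rates across groups. The data, group labels, and the 0.8 cutoff (borrowed from the familiar “four-fifths rule” purely as an illustrative benchmark) are assumptions; the Department’s guidance does not prescribe a specific metric or threshold.

```python
# A minimal sketch of a disparate impact screen. All data and the 0.8
# threshold are illustrative assumptions, not requirements of the guidance.
import pandas as pd

def adverse_impact_ratios(df: pd.DataFrame, group_col: str,
                          reference_group: str, approved_col: str) -> dict:
    """Approval rate of each group divided by the reference group's rate."""
    rates = df.groupby(group_col)[approved_col].mean()
    return {group: rate / rates[reference_group] for group, rate in rates.items()}

# Hypothetical underwriting outcomes for two groups.
outcomes = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

for group, ratio in adverse_impact_ratios(outcomes, "group", "A", "approved").items():
    flag = "possible disproportionate adverse effect" if ratio < 0.8 else "ok"
    print(f"group {group}: ratio {ratio:.2f} -> {flag}")
```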

“While AI provides benefits for both insurers and consumers, these technologies also pose some serious risks, including the potential for discriminatory pricing and underwriting decisions that harm consumers,” said Jennifer Chien, senior financial fairness policy counsel for Consumer Reports. “As insurers continue to use AI to make decisions about consumers, we need to make sure these systems are designed to minimize bias and promote positive outcomes for everyone. New York’s action positions the state as a leader in efforts to ensure transparency, accountability and fairness in insurance underwriting and pricing.”

CR and the Consumer Federation of America recently urged the Consumer Financial Protection Bureau to issue similar guidance making it clear that banks and other financial firms have an obligation to mitigate the discriminatory impact of algorithms they use to underwrite and price credit for consumers.

The risk of algorithmic discrimination is well established and can occur when an automated decision system repeatedly creates unfair or inaccurate outcomes for a particular group. While these risks exist with traditional models, they are exacerbated by machine learning techniques for automated decision-making that rely on processing vast amounts of data using often opaque models. However, more advanced tools and techniques are now emerging that can and should be employed to more effectively mitigate the risk of discrimination.
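As one illustration of what such mitigation can look like in practice, the sketch below runs a toy “less discriminatory alternative” search: it refits a model with a suspect variable removed and weighs the change in predictive accuracy against the change in the approval-rate gap between groups. The synthetic data, feature names, and choice of logistic regression are assumptions made for brevity, not a method prescribed by the Department.

```python
# A toy "less discriminatory alternative" search on synthetic data. The
# historical approvals are deliberately tainted by group membership, so a
# proxy variable reproduces that bias; dropping it trades a little accuracy
# for a much smaller approval-rate gap. All names and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)          # protected attribute; never fed to the model
risk = rng.normal(0, 1, n)             # legitimate risk signal
proxy = group + rng.normal(0, 0.5, n)  # variable that tracks group membership
# Past approvals tainted by discrimination against group 1.
approved = (risk - 0.8 * group + rng.normal(0, 1, n) > 0).astype(int)

features = {"proxy": proxy, "risk": risk}

def evaluate(names):
    """Fit on the named features; return accuracy and the approval-rate gap."""
    X = np.column_stack([features[f] for f in names])
    pred = LogisticRegression().fit(X, approved).predict(X)
    accuracy = (pred == approved).mean()
    gap = abs(pred[group == 0].mean() - pred[group == 1].mean())
    return accuracy, gap

for candidate in (["proxy", "risk"], ["risk"]):
    acc, gap = evaluate(candidate)
    print(f"features={candidate}: accuracy={acc:.3f}, approval-rate gap={gap:.3f}")
```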

Biased results can arise from a number of sources, including the underlying data and the model’s design. Unrepresentative, incorrect, or incomplete training data, as well as biased data collection methods, can lead to poor outcomes in algorithmic decision-making for certain groups. Data may also be tainted by past discriminatory practices. Biases can likewise be embedded into models through the design process, such as through the improper use of protected characteristics, whether directly or through proxies.
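One simple way to screen for such proxies, sketched below using assumed variable names and synthetic data, is to measure how strongly each input variable tracks a protected characteristic; real reviews would use more rigorous tests, such as trying to predict the protected attribute from the inputs.

```python
# A minimal proxy screen: flag input variables that correlate strongly with a
# protected characteristic. Variable names, data, and the 0.3 cutoff are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
protected = rng.integers(0, 2, n).astype(float)

variables = {
    "zip_density": protected + rng.normal(0, 0.4, n),  # tracks group membership
    "vehicle_age": rng.normal(5, 2, n),                # unrelated to group
}

for name, values in variables.items():
    r = np.corrcoef(values, protected)[0, 1]
    note = "possible proxy -- investigate" if abs(r) > 0.3 else "low correlation"
    print(f"{name}: r={r:+.2f} ({note})")
```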

In a comment letter filed with the Department earlier this year, CR pointed out a number of examples to illustrate the potential risk. A fraud monitoring algorithm may systematically flag consumers on the basis of race or proxies for race, as illustrated in the recent lawsuit against State Farm claiming that its fraud detection software has a disparate impact on Black consumers. Telematics programs that obtain consumer-generated driving data for insurance pricing may result in unintended bias and disparate impacts. A joint investigation by CR and The Markup found that an advanced algorithm proposed by Allstate seemed to charge prices based on a consumer’s willingness to pay rather than actual risk.

Michael McCauley, michael.mccauley@consumer.org