The legal community in Minnesota is keeping an eye on the Federal Trade Commission’s warnings about unfair AI. The Commission warns that health systems and businesses may be using discriminatory AI in ways that violate consumer protection laws, and it continues to monitor how those laws apply to the technology.
Detecting and reporting bias
Bias can trigger violations of the Fair Credit Reporting Act, which comes into play when an algorithm is used to deny people employment, housing, credit, insurance or other benefits. The Equal Credit Opportunity Act makes it illegal for organizations to use biased algorithms that discriminate on the basis of race, color, religion, sex, national origin, age, marital status or receipt of public assistance.
Discrimination in health care
The algorithms behind AI decisions may overlook discrimination or even amplify it. The FTC notes that predictions used by health systems can reflect data steeped in racial bias. AI systems that should have benefited all patients instead made decisions influenced by a patient’s race.
One widely cited study revealed another problematic algorithm used in health care. Built to determine which patients should receive extra help, it returned results that favored white patients over Black patients, even when the Black patients had greater health needs.
The AI used past health care costs to identify patients who would most benefit from extra care, such as those requiring extended medication management or help staying out of the hospital. Unfortunately, because disproportionate levels of poverty mean Black patients tend to spend less on health care than white patients, the AI concluded that Black patients needed less care.
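The mechanism is easy to see in a simplified sketch. The short Python example below is purely hypothetical; the patient data and the two-line scoring rule are assumptions for illustration, not the actual system studied. It shows how ranking patients by prior spending, rather than by an underlying measure of health, changes who is flagged for extra care.

```python
# Hypothetical illustration only: the names, figures and ranking rule are
# assumptions for demonstration, not the algorithm examined by the FTC or researchers.

# Each record: (patient, chronic conditions as a rough stand-in for health need,
#               prior-year health care spending in dollars)
patients = [
    ("Patient A", 4, 3200),   # high need, low spending
    ("Patient B", 2, 9500),   # moderate need, high spending
    ("Patient C", 5, 4100),   # high need, low spending
    ("Patient D", 1, 8700),   # low need, high spending
]

# Proxy-based ranking: prior spending stands in for "risk".
by_cost = sorted(patients, key=lambda p: p[2], reverse=True)

# Need-based ranking: the underlying health measure is used directly.
by_need = sorted(patients, key=lambda p: p[1], reverse=True)

print("Extra care, ranked by cost proxy: ", [p[0] for p in by_cost[:2]])
print("Extra care, ranked by health need:", [p[0] for p in by_need[:2]])
# If one group systematically spends less for the same level of illness,
# the cost-based ranking will systematically place that group lower.
```

In this sketch the cost-based ranking selects the two highest spenders, while the need-based ranking selects the two sickest patients, which is the gap the study identified.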
Gross discrepancies like these have prompted the FTC to scrutinize how consumer protection law applies to these tools. Its warnings against discriminatory AI that compromises access to health care could signal stricter enforcement ahead.