Wyden, Booker Demand Answers on Biased Health Care Algorithms
Letters to FTC, CMS, health care companies follow disturbing revelations of flawed algorithms impacting care for black patients
Washington, D.C. – U.S. Senators Ron Wyden, D-Ore., and Cory Booker, D-N.J., today called on the Federal Trade Commission (FTC), the Centers for Medicare and Medicaid Services (CMS) and five of the largest health care companies in the nation to provide more information on how they are addressing bias in algorithms used in many health care systems. The series of letters follows recent revelations published in the journal Science that a widely used software program severely underestimated the health care needs of black patients because of its racially biased algorithm.
In the letters to CMS Administrator Seema Verma, FTC Chairman Joseph Simons and top executives of UnitedHealth Group, Blue Cross Blue Shield, Cigna Corporation, Humana and Aetna, the lawmakers highlighted the profound threat biased algorithms can pose to marginalized communities by systematically denying them care.
“In using algorithms, organizations often attempt to remove human flaws and biases from the process,” the lawmakers wrote. “Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in. Without very careful consideration, the algorithms they subsequently create can further perpetuate those very biases.”
The lawmakers asked the FTC and CMS a series of questions about the steps they are taking to address algorithmic bias in the health care system, how well-equipped their current enforcement mechanisms are to handle algorithmic bias and the scope of the challenge. They also asked the FTC to commit to investigating the ways that algorithms unfairly discriminate against marginalized communities.
In their letter to the executives, the lawmakers asked for specific details about the algorithms their companies use to improve patient care and what safeguards the companies have put in place to prevent bias.
Earlier this year, Wyden and Booker also introduced the Algorithmic Accountability Act, which would require companies to fix discriminatory algorithms and outlines methods the federal government can use to mitigate the impacts of such algorithms.