newdarkage:
“When similar health care systems have been automated, they have not always performed flawlessly, and their errors can be difficult to correct. The scholar Danielle Keats Citron cites the example of Colorado, where coders placed more than 900 incorrect rules into its public benefits system in the mid-2000s, resulting in problems like pregnant women being denied Medicaid. Similar issues in California, Citron writes in a paper, led to “overpayments, underpayments, and improper terminations of public benefits,” as foster children were incorrectly denied Medicaid. Citron writes about the need for “technological due process” — the importance of both understanding what’s happening in automated systems and being given meaningful ways to challenge them.

Critics point out that, when designing these programs, incentives are not always aligned with easy interfaces and intelligible processes. Virginia Eubanks, the author of Automating Inequality, says many programs in the United States are “premised on the idea that their first job is diversion,” increasing barriers to services and at times making the process so difficult to navigate “that it just means that people who really need these services aren’t able to get them.”

One of the most bizarre cases happened in Idaho, where the state made an attempt, like Arkansas, to institute an algorithm for allocating home care and community integration funds, but built it in-house. The state’s home care program calculated what it would cost to care for severely disabled people, then allotted funds to pay for help. But around 2011, when a new formula was instituted, those funds suddenly dropped precipitously for many people, by as much as 42 percent. When the people whose benefits were cut tried to learn how their benefits were determined, the state declined to disclose the formula it was using, saying that its math qualified as a trade secret.”

A healthcare algorithm started cutting care, and no one knew why – The Verge