Techtonic with Mark Hurst is a weekly radio show from WFMU about technology, how it's affecting us, and what we can do about it.

Dec 9, 2024: Arvind Narayanan, author, "AI Snake Oil"

AI isn't living up to its hype. In healthcare, criminal justice, finance, and social media moderation, AI is repeatedly failing to deliver the benefits promised by Big Tech. Author and Princeton computer science professor Arvind Narayanan explains these failures, and what we can do about them, in his new book "AI Snake Oil."

Show Notes

AI Snake Oil: What Artificial Intelligence Can Do, What It Can't, and How to Tell the Difference, by Arvind Narayanan and Sayash Kapoor

AISnakeOil.com, the newsletter from Narayanan and Kapoor

PDF: The Princeton Web Transparency and Accountability Project (2017)

UnitedHealth uses AI model with 90% error rate to deny care, lawsuit alleges (Ars Technica, Nov 16, 2023): "For the largest health insurer in the US, AI's error rate is like a feature, not a bug." Excerpt:
The lawsuit argues that UnitedHealth should have been well aware of the "blatant inaccuracy" of nH Predict's estimates based on its error rate. Though few patients appeal coverage denials generally, when UnitedHealth members appeal denials based on nH Predict estimates—through internal appeals processes or through the federal Administrative Law Judge proceedings—over 90 percent of the denials are reversed, the lawsuit claims. This makes it obvious that the algorithm is wrongly denying coverage, it argues.

But, instead of changing course, over the last two years, NaviHealth employees have been told to hew closer and closer to the algorithm's predictions. In 2022, case managers were told to keep patients' stays in nursing homes to within 3 percent of the days projected by the algorithm, according to documents obtained by Stat. In 2023, the target was narrowed to 1 percent.
Playlist & Comments at WFMU
Aired: Dec 9, 2024