this post was submitted on 30 Jul 2024
808 points (98.1% liked)

Chronic Illness


A community for chronically ill people.

Rules

  1. Be excellent to each other
  2. Absolutely no ableism. Good-faith questions that take an ableist stance may be left up at moderator discretion.
  3. No quackery. Does an up-to-date major review in a big journal or a major government guideline come to the conclusion you’re claiming is fact? No? Then don’t claim it’s fact. This applies to potential treatments and disease mechanisms.

founded 3 months ago

discharge = discharge from hospital

[–] tacticalsugar@lemmy.blahaj.zone 3 points 3 months ago* (last edited 3 months ago) (1 children)

Honestly, that just leads to automation with built-in bias, and now you can't even threaten a doctor with a malpractice suit because you can't talk to a person, or the only person you can talk to says "sorry, the computer won't let me."

You can't use technology to fix social issues. People keep trying, and every time it just hurts chronically ill and disabled people even more. Have you ever heard of NarxCare?

NarxCare is a prescription drug monitoring program (PDMP) run by Bamboo Health. Bamboo Health was formerly known as Appriss. It is widely used across the United States by pharmacies including Rite Aid as well as those at Walmart and Sam’s Club. The NarxCare software allows doctors to view data about a patient, combining data from the prescription registries of various U.S. states to make the registries interoperable nationally. It also uses machine learning to generate an "Overdose Risk Score" that potentially includes EMS and criminal justice data; these scores have been criticized by researchers and patient advocates for the lack of transparency in the process as well as the potential for disparate treatment of women and minority groups.
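As a purely illustrative aside, the sketch below shows the general shape of a registry-driven risk score like the one described above. The feature names, weights, and scoring logic are hypothetical stand-ins, not Bamboo Health's actual model, which is proprietary and undisclosed; the point is only that the person on the receiving end sees a single opaque number.

```python
# Hypothetical sketch of a registry-driven risk score. All feature names
# and weights are invented for illustration; this is NOT NarxCare's model.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    prescriber_count: int     # distinct prescribers seen across state registries
    pharmacy_count: int       # distinct dispensing pharmacies
    overlapping_scripts: int  # overlapping controlled-substance prescriptions
    ems_events: int           # hypothetical EMS encounters, if such data is included
    justice_flags: int        # hypothetical criminal-justice records, if included

def overdose_risk_score(record: PatientRecord) -> int:
    """Toy stand-in for the ML model: returns an opaque 0-999 score.

    A real system would use a trained model; the point here is that the
    caller only ever sees the final number, never the reasoning behind it.
    """
    raw = (
        35 * record.prescriber_count
        + 30 * record.pharmacy_count
        + 50 * record.overlapping_scripts
        + 80 * record.ems_events
        + 120 * record.justice_flags
    )
    return min(raw, 999)

# A chronically ill patient who legitimately sees several specialists can
# end up with a high score that no one at the pharmacy counter can explain.
print(overdose_risk_score(PatientRecord(4, 3, 2, 1, 0)))  # 410
```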

[–] Xanjis@lemmy.zip 1 points 3 months ago (1 children)

Sure, you still have innate/learned biases, but eliminating situational (recent divorce) and bodily (hungry/sleepy/horny/sick) bias entirely is still a massive reduction in the total amount of bias you face day to day. If anything, being able to see the biases in the data going into something like NarxCare is a good thing, because now you have a paper trail for improvements. You can't just grab a hundred doctors and ask them "have you ever denied care due to your biases against women?" because the bad ones will either lie or not realize what they have done.
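For what that "paper trail" could mean in practice, here is a minimal sketch, assuming a hypothetical log of scoring decisions with `patient_group` and `care_denied` columns (names invented for illustration): with every decision logged, disparities across groups can be measured directly instead of relying on doctors' self-reports.

```python
# Minimal audit sketch over a hypothetical decision log (scoring_log.csv);
# the file and column names are assumptions for illustration only.
import csv
from collections import defaultdict

def denial_rate_by_group(path: str) -> dict[str, float]:
    """Read a log of scoring decisions and compute the denial rate per group."""
    totals: dict[str, int] = defaultdict(int)
    denials: dict[str, int] = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            group = row["patient_group"]           # e.g. a sex or race category
            totals[group] += 1
            denials[group] += row["care_denied"] == "1"
    return {g: denials[g] / totals[g] for g in totals}

# If one group is denied far more often for the same clinical picture,
# the logged data makes that disparity visible and contestable.
print(denial_rate_by_group("scoring_log.csv"))
```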

[–] tacticalsugar@lemmy.blahaj.zone 2 points 3 months ago

I would genuinely rather work with a doctor who just got divorced than have to fight the invisible AI black box that calls me a drug addict for being chronically ill.

> You can’t just grab a hundred doctors and ask them “have you ever denied care due to your biases against women?” because the bad ones will either lie or not realize what they have done.

Unlike NarxCare, which just denies care due to biases and won't tell you why, because it's a machine-learning black box. There is no "paper trail" for NarxCare, because denying care to patients is the point. I can at least argue with doctors, or request a new one.

You can't fix social issues with technology, and every attempt will just make things worse for the affected people.