cross-posted from: https://lemmy.world/post/43640522
If ChatGPT wants to replace health professionals, it should be held liable for the “advice” it gives.
In 51.6% of cases where someone needed to go to the hospital immediately, the platform said to stay home or book a routine medical appointment.
So it performs slightly worse than a coin flip…
In one of the simulations, more than eight times out of 10 (84%), the platform sent a suffocating woman to a future appointment she would not live to see.
Holy shit! That’s a lot worse than a coin flip.
Meanwhile, 64.8% of completely safe individuals were told to seek immediate medical care.
And there are real people out there who actually trust this tech to make real decisions for them. It literally performs significantly worse than a coin flip with regard to both false positives and false negatives. You are literally better off flipping a coin or rolling a die than asking this thing what to do.
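For what it's worth, the coin-flip comparison holds up on the numbers quoted above. A throwaway Python sketch (the rates are just the figures cited in this thread, nothing more):

```python
# Compare the quoted error rates against a fair coin that recommends
# "go to the ER" with probability 0.5, regardless of the patient's input.
# A fair coin is wrong 50% of the time in each direction.
coin_error_rate = 0.5

# Figures reported upthread.
reported = {
    "urgent cases told to stay home (false negatives)": 0.516,
    "safe cases sent to the ER (false positives)": 0.648,
}

for label, rate in reported.items():
    verdict = "worse" if rate > coin_error_rate else "not worse"
    print(f"{label}: {rate:.1%} vs. coin {coin_error_rate:.0%} -> {verdict} than a coin flip")
```

Both quoted rates exceed 50%, so on these two figures a coin really would make fewer mistakes in each direction.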
Even better than a coin flip is asking this what to do then doing the opposite!
Holy shit, TIL there’s a ChatGPT Health!? How is this not unauthorized practice of medicine?
Past that, how is it HIPAA compliant?
There is no fucking way I believe that OpenAI is not skimming these interactions for training.
For the love of gawd stop putting the bullshit machine in everything.