The doctor, and their steno-bot, will see you now.
- AlgonquinHawk ( @AlgonquinHawk@lemmy.ml ) · English · 10 points · 1 year ago
What’s the privacy policy for something like this? I don’t see any info listed. The idea of Amazon having that kind of data worries me.
- Gaywallet (they/it) ( @Gaywallet@beehaw.org ) · English · 4 points · 1 year ago
HIPAA holds everyone who has access to the data accountable. A single violation can be punished by fines of up to $50,000, plus possible prison time. If multiple records were accessed improperly, the fines can be absolutely outrageous, and this has happened! It's perhaps one of the few regulatory areas the government takes seriously, and there are a lot of tech security jobs in healthcare because of it.
Increased efficiency with diminished results and questionable repercussions for privacy. It’s certainly not ideal…
- Gaywallet (they/it) ( @Gaywallet@beehaw.org ) · 4 points · 1 year ago
Carbon Health specializes in urgent care, not emergency care, so it's relatively low acuity. I've seen GPT-4 use cases like this assessed by doctors (read a recent paper on this, actually), and when it's employed in this kind of fashion most doctors are in favor of it. They have to review the notes anyway, so it doesn't feel particularly different to them than reviewing what a nurse or scribe wrote up (or what they wrote themselves), and often the LLM does a better job putting it all into plain English, so it's readable by both patient and doctor.
- monerobull ( @monerobull@monero.house ) · 5 points · 1 year ago
GPT-3.5 and GPT-4 are pretty good at writing standardized forms for you, but giving OpenAI medical data like this is an obvious no-go.