At least the AI saw personal medical info and Nope!'d out of that?
Come to think of it, I wonder if using ChatGPT violates HIPAA because it sends the patient data to OpenAI?
I smell a lawsuit.
I don’t think HIPAA applies in Jerusalem.
You’re right about that, but other countries have similar protections. E.g. our board equivalent here in Germany would tear you a new one for that. And the GDPR is gonna finish the job.
What patient data?
Typically for the AI to do anything useful you’d copy and paste the medical records into it, which would be patient data.
Technically you could expunge enough data to keep it in line with HIPAA, but if people are careless enough not to proofread their own paper, I doubt they’d prep the data correctly.
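To give a sense of why “prepping the data correctly” is harder than it sounds, here’s a toy Python sketch of regex-based scrubbing. The patterns and placeholder names are my own illustration, not any real tool: HIPAA’s Safe Harbor method requires removing 18 whole categories of identifiers (names, all dates, geography, MRNs, etc.), and pattern matching like this can’t guarantee that.

```python
import re

# Toy redaction sketch -- NOT real HIPAA de-identification. It only
# catches a few identifier-shaped tokens; free-text names, addresses,
# and oddly formatted dates would sail right through.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),         # SSN-shaped numbers
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),  # slash-style dates
    (re.compile(r"\bMRN[:#]?\s*\d+\b", re.I), "[MRN]"),      # medical record numbers
]

def scrub(text: str) -> str:
    """Replace a few identifier-shaped tokens with placeholders."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

note = "Pt seen 3/14/2023, MRN: 445821, SSN 123-45-6789."
print(scrub(note))  # Pt seen [DATE], [MRN], SSN [SSN].
```

The point being: if someone can’t be bothered to proofread, they’re not going to audit whether their scrubbing actually caught everything before pasting a chart into a cloud LLM.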
ChatGPT has no burden to respect HIPAA in that scenario. The medical provider inputting your PHI into a cloud-based LLM is violating your HIPAA rights in that case.
Just to clarify I am implying the medical provider would be the one sued. I didn’t think ChatGPT would be in the wrong.
ChatGPT has just done a great job revealing how lazy and poorly thought out people are all over.