It’s no secret that people are becoming more comfortable with – and increasingly reliant on – AI assistants like ChatGPT, including for therapy, legal, and medical advice.
But unlike the trained humans in those fields, what you say to ChatGPT can be used against you in a court of law.
In a recent podcast interview, Sam Altman, CEO of OpenAI and creator of ChatGPT, warned of the possible legal ramifications for conversations with ChatGPT:
“Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.
So, if you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, like we could be required to produce that. And I think that’s very screwed up.”
Adding to this concern is the fact that, due to an ongoing lawsuit, OpenAI is currently required to save all chat logs – even deleted conversations. Users need to be aware that their private conversations, deleted or not, could become public, whether through a court subpoena or other means.
Please be very careful with what you share with an AI, and think twice before sharing sensitive business or deeply personal information – especially protected information that could constitute a HIPAA or CMMC violation, information that could fuel corporate espionage, or admissions that could land you in serious legal trouble. Keep in mind: AI is not your friend, your doctor, or your lawyer. It’s not good at keeping secrets.
