At one point, ChatGPT was accidentally leaking the titles of people's chats to other users. Only the titles, not the full conversations, but it's perhaps only a matter of time before someone's bot is hacked or malfunctions badly enough to share more.

It's a scary thought when some people are already effectively using these bots as therapists. It wouldn't surprise me if the more human-like style of conversation they produce, and the 'texting a friend' interface many of them have, are conducive to users sharing far more personal information than they typically would with other online services.