The parents of a teenager who died by suicide have filed a lawsuit against OpenAI and its CEO, Sam Altman, alleging that the company's AI chatbot, ChatGPT, gave the boy advice on ending his life in violation of its own safety safeguards.
Lawsuit Against OpenAI
CNN reports that a lawsuit was filed against OpenAI and its CEO Sam Altman in a California court. The parents of the deceased teenager allege that ChatGPT provided the boy with guidance on self-harm, including suggesting he draft a suicide note. “ChatGPT functioned exactly as designed: it continuously amplified and validated everything their son expressed, including his most harmful and self-destructive thoughts,” the lawsuit states.
OpenAI Reacts to Reports
In a statement cited by CNN, an OpenAI spokesperson expressed condolences to the family and confirmed the company is reviewing the lawsuit. The spokesperson acknowledged that safeguards intended to prevent conversations like the one the teenager had with ChatGPT may not have worked as intended when interactions ran too long.
OpenAI: Developing and Implementing Safeguards
On Tuesday, OpenAI published a blog post noting that users turn to ChatGPT to “make deeply personal decisions.” Since early 2023, the company has trained its models to avoid providing self-harm instructions and to respond instead with supportive, empathetic language. “If someone writes that they want to harm themselves, ChatGPT is trained to refuse and instead acknowledge their emotions and guide them toward help,” the post reads. However, OpenAI concedes that “despite these safeguards, systems occasionally failed to respond appropriately in sensitive situations.” The post continues: “ChatGPT may correctly direct someone to a crisis hotline if they mention suicidal thoughts, but after prolonged interactions, it might ultimately provide a response conflicting with our safeguards. We are actively working to prevent such errors.”
If you are struggling with thoughts of self-harm, or want to help someone at risk, remember that free, 24/7 crisis hotlines are available:
Adult Crisis Support Center: 800-70-2222
Child and Youth Trustline: 116 111
Adult Emotional Support Line: 116 123
For further information on how to assist yourself or others, visit https://zwjr.pl/z-jakiego-powodu-tutaj-jestes to access resources and contact details for organizations supporting individuals in crisis and their loved ones.
If your suicidal thoughts escalate to a life-threatening situation (or you suspect such a risk), dial the emergency number 112 immediately, or visit the psychiatric emergency department of your local hospital or the nearest emergency room (SOR) for urgent intervention.
Source: Gazeta, https://wiadomosci.gazeta.pl/cnn#anchorLink, https://zwjr.pl/z-jakiego-powodu-tutaj-jestes