retired healthcare IT programmer/analyst, supporter of Palestine, Cuban Revolution, women’s rights, FOSS, Linux, Black Lives Matter. Live in Michigan, USA
6 Posts
1 Comment
Joined 1Y ago
Cake day: Aug 23, 2024
I don’t understand your logic here.
Clearly, the kid had problems that were not caused by ChatGPT. And his suicidal thoughts were not started by ChatGPT.
But OpenAI has acknowledged that the longer an engagement continues, the more likely ChatGPT is to go off the rails.
Which is what happened here. At first, ChatGPT was giving the standard correct advice about suicide lines, etc.
Then it started getting darker, telling the kid not to let his mother know how he was feeling.
Then it progressed to actual suicide coaching.
So I don’t think the analogy to video games holds here.
cross-posted from: https://lemmy.ml/post/35349105
> Aug. 26, 2025, 7:40 AM EDT
> By Angela Yang, Laura Jarrett and Fallon Gallagher
>
> [this is a truly scary incident, which shows the incredible dangers of AI without guardrails.]
>
cross-posted from: https://lemmy.ml/post/33720279
> Written by Steven Vaughan-Nichols, Senior Contributing Editor
> July 23, 2025 at 11:31 a.m. PT
>
> >Recently, vibe coding bit Jason Lemkin, trusted advisor to SaaStr, the Software-as-a-Service (SaaS) business community, in the worst possible way. The vibe program, Replit, he said, went "rogue during a code freeze and shutdown and deleted our entire database."
>
> >In a word: Wow. Just wow.
from #TheRegister #Register
Thomas Claburn
Fri 10 Jan 2025 // 13:37 UTC
Sebastian Steck, a software developer based in Germany, has obtained the source code and library installation scripts for his AVM FRITZ!Box 4020 router, thanks to a lawsuit funded by the Software Freedom Conservancy (SFC).
cross-posted from: https://lemmy.ml/post/24704138
> "Donald Trump is returning, artificial intelligence is maturing, the planet is warming, and the global fertility rate is collapsing."
>
> Opinion
> By #EzraKlein
> Jan. 12, 2025
>
> from #NewYorkTimes #NYT
> [gift article - link can be shared. Expires in 30 days.]
from #NewYorkTimes #NYT
[gift article - link can be shared. Expires in 30 days]
By David Streitfeld
#DavidStreitfeld has written about Google since it was a start-up.
Nov. 20, 2024, updated 9:19 a.m. ET