Florida student asks ChatGPT how to kill his friend, ends up in jail: deputies (wfla.com)
8 points by trhway 2 days ago | 2 comments
- quantumcotton 14 hours ago | They probably shouldn't announce this. I get that they're trying to make an example of him. But now kids are gonna know to just download a local LLM and do it there. At least this way you can catch them one at a time.
- higginsniggins 2 days ago | I don't think this is what he had in mind in trying to jailbreak the program...