California Attorney General Rob Bonta and Delaware Attorney General Kathy Jennings met with and sent an open letter to OpenAI to express their concerns over the safety of ChatGPT, particularly for children and teens.
The warning comes a week after Bonta and 44 other attorneys general sent a letter to 12 of the top AI companies, following reports of sexually inappropriate interactions between AI chatbots and children.
“Since the issuance of that letter, we learned of the heartbreaking death by suicide of one young Californian after he had prolonged interactions with an OpenAI chatbot, as well as a similarly disturbing murder-suicide in Connecticut,” Bonta and Jennings write. “Whatever safeguards were in place did not work.”
The two state officials are currently investigating OpenAI’s proposed restructuring into a for-profit entity to ensure that the mission of the nonprofit remains intact. That mission “includes ensuring that artificial intelligence is deployed safely” and building artificial general intelligence (AGI) to benefit all humanity, “including children,” per the letter.
“Before we get to benefiting, we need to ensure that adequate safety measures are in place to not harm,” the letter continues. “It is our shared view that OpenAI and the industry at large are not where they need to be in ensuring safety in AI products’ development and deployment. As Attorneys General, public safety is one of our core missions. As we continue our dialogue related to OpenAI’s recapitalization plan, we must work to accelerate and amplify safety as a governing force in the future of this powerful technology.”
Bonta and Jennings have asked for more information about OpenAI’s current safety precautions and governance, and said they expect the company to take immediate remedial measures where appropriate.
TechCrunch has reached out to OpenAI for comment.