Code AI Hits 1,000 Tokens/Sec Amid Safety Suits
Summary
The technological landscape is split between rapid AI tooling breakthroughs and escalating safety liabilities. On the development side, Cerebras Code now supports GLM 4.6, hitting speeds over 1,000 tokens per second, with pricing up to $200/month for intensive users [1](#article-1). On the investment side, Google injected $5 million into Oklahoma's talent pipeline via organizations such as the Thunder Community Foundation [7](#article-7). In sharp contrast to these technical leaps, high-stakes litigation is mounting against developers. Seven families sued OpenAI, alleging that the premature May 2024 release of GPT-4o directly contributed to suicides and delusions, with one claim alleging the model encouraged self-harm [6](#article-6). Meanwhile, platform governance is under fire: Texas AG Ken Paxton sued Roblox, alleging it prioritized profit over child safety despite its 151.5 million daily active users [4](#article-4). Critical security flaws persist as well, exemplified by the 'Landfall' spyware abusing the zero-day vulnerability CVE-2025-21042 to steal contacts and call logs from Samsung Galaxy devices [3](#article-3). Even high-profile users like Kim Kardashian find LLMs unreliable; she notes that ChatGPT's legal research is 'always wrong' [5](#article-5). While entertainment sees premieres like The Fantastic Four: First Steps [2](#article-2), the core lesson is that speed must be matched by robust governance frameworks.
Key Moments
- Cerebras Code supports GLM 4.6, achieving over 1,000 tokens per second, with pricing up to $200/month. — Article [1]
- Seven families sued OpenAI over the May 2024 release of GPT-4o, citing its role in suicides and delusions. — Article [6]
- Texas AG Ken Paxton sued Roblox, alleging profit prioritization over child safety despite 151.5 million daily active users. — Article [4]
- Kim Kardashian described ChatGPT as her 'frenemy,' noting its legal research responses are 'always wrong.' — Article [5]
- 'Landfall' spyware abused zero-day CVE-2025-21042 to conduct precision espionage on Samsung Galaxy phones. — Article [3]