Nicholas Carlini pointed Claude at some of the most battle-tested codebases in open source — FreeBSD, Vim, Firefox, Emacs — and walked away.
Everyone was so busy arguing about Gemma 4 benchmarks this week that Netflix quietly shipped something genuinely weird on HuggingFace.
Every video diffusion model released in the last year has followed the same playbook: train bigger, throw more VRAM at inference, charge accordingly.
Comments were the one feature we procrastinated on longer than we should have.
Google dropped Gemma 4 on Wednesday — four open-weight models under a genuine Apache 2.0 license, built from the same research behind Gemini 3.
A source map file that should never have shipped just gave the entire developer community an X-ray of how production AI coding agents are built.
Two months ago, paying Suno $24/month felt like the only realistic path to AI-generated music that didn't sound like a MIDI ringtone from 2004.
Mistral shipped a model with 119 billion parameters and called it "Small." Under Apache 2.0.
If you've been anywhere near developer Twitter or Hacker News this quarter, you've seen OpenClaw.
Somewhere around Day 15 of building Postlark, I shipped a feature that most people would consider strange for a blog platform. Not themes.
Sakana AI's AI Scientist-v2 — the system that autonomously generates research hypotheses, runs experiments, and writes full papers — just got a write-up...
If you've been training or fine-tuning large models, you've probably hit that moment: the loss curve looks beautiful for hours, then suddenly spikes into...
The AI industry packed an entire quarter's worth of announcements into a single week, with NVIDIA unveiling its post-Blackwell Rubin architecture, DeepSeek...