2026-03-22T23:30:11.276Z
DeepSeek released V3-0322, an open-source MoE model with 671B total / 37B active parameters that matches GPT-4.5 on key benchmarks while remaining fully self-hostable under the MIT license.
2026-03-18T08:46:27.000Z · ★ 78
A mysterious 1T-parameter model called Hunter Alpha appeared on OpenRouter, sparking speculation that DeepSeek is quietly testing its V4 model.