Articles

4 articles
Tag: moe
2026-04-07
2026-04-07
2026-03-25
2026-03-22
DeepSeek released V3-0322, an open-source MoE model with 671B total / 37B active parameters that matches GPT-4.5 on key benchmarks while remaining fully self-hostable under the MIT license.
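The 671B total / 37B active split comes from MoE routing: each token is sent to only a few experts, so only a fraction of the model's parameters participate in any forward pass. The sketch below is a minimal, illustrative top-k MoE layer with toy sizes and assumed hyperparameters; it is not DeepSeek's actual architecture or routing scheme.

```python
# Toy top-k MoE layer: each token activates only top_k of n_experts expert MLPs,
# so the parameters touched per token are a small fraction of the layer's total.
# Sizes below are illustrative assumptions, not the real model's dimensions.
import torch
import torch.nn.functional as F

d_model, d_ff, n_experts, top_k = 64, 256, 8, 2

experts = [torch.nn.Sequential(
    torch.nn.Linear(d_model, d_ff), torch.nn.GELU(),
    torch.nn.Linear(d_ff, d_model)) for _ in range(n_experts)]
router = torch.nn.Linear(d_model, n_experts)

def moe_layer(x):  # x: (tokens, d_model)
    scores = F.softmax(router(x), dim=-1)       # routing probabilities per expert
    weights, idx = scores.topk(top_k, dim=-1)   # keep only the top_k experts per token
    weights = weights / weights.sum(-1, keepdim=True)
    out = torch.zeros_like(x)
    for e in range(n_experts):                  # dispatch each token to its chosen experts
        mask = (idx == e).any(-1)
        if mask.any():
            w = weights[mask][idx[mask] == e].unsqueeze(-1)
            out[mask] += w * experts[e](x[mask])
    return out

tokens = torch.randn(4, d_model)
print(moe_layer(tokens).shape)  # (4, 64); only 2 of the 8 expert MLPs run per token
```

Scaled up, this is why an MoE model can hold hundreds of billions of parameters while the per-token compute and activation memory correspond to the much smaller active subset.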