DeepSeek Releases V3-0322: Open-Source Model Matching GPT-4.5 on Key Benchmarks
DeepSeek released V3-0322, an open-source Mixture-of-Experts (MoE) model with 671B total / 37B active parameters that matches GPT-4.5 on key benchmarks while remaining fully self-hostable under the MIT License.
Chinese AI lab DeepSeek has released V3-0322, an updated version of its open-source language model that matches or exceeds GPT-4.5 performance on several key benchmarks while remaining fully open-weight and runnable on consumer hardware.
The Model
Key specifications:
- Architecture: Mixture-of-Experts (MoE) with 671 billion total parameters
- Active parameters: 37 billion activated per token (a fraction of the total, keeping inference cost low)
- Context window: 128K tokens
- License: MIT License (fully open)
- Availability: Weights available for download and self-hosting
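The gap between 671B total and 37B active parameters comes from routing: for each token, a small gating network scores all experts but runs only the top-k, so compute scales with the active count rather than the total. The sketch below shows this mechanism with toy sizes (8 experts, top-2 routing, 16-dim tokens); these numbers are illustrative assumptions, not DeepSeek's published configuration:

```python
import numpy as np

# Illustrative top-k expert routing, the mechanism behind MoE models:
# a router scores every expert per token, but only the top-k experts
# actually execute, so active parameters << total parameters.
# All sizes here are toy values, not DeepSeek V3's real configuration.

rng = np.random.default_rng(0)

n_experts = 8   # toy value; production MoE models use many more experts
top_k = 2       # experts activated per token
d_model = 16    # toy hidden size

def route(token, router_weights):
    """Return indices and normalized mixture weights of the top-k experts."""
    scores = token @ router_weights          # one score per expert, shape (n_experts,)
    top = np.argsort(scores)[-top_k:]        # keep only the k best-scoring experts
    w = np.exp(scores[top] - scores[top].max())  # stable softmax over the winners
    return top, w / w.sum()

token = rng.standard_normal(d_model)
router_weights = rng.standard_normal((d_model, n_experts))
experts, weights = route(token, router_weights)

print(len(experts))   # only top_k of the n_experts experts run for this token
print(weights.sum())  # mixture weights over the chosen experts sum to 1
```

Because the non-selected experts never execute, per-token FLOPs track the 37B active parameters, not the 671B total.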
Benchmark Performance
V3-0322 shows significant improvements:
- MMLU: Matches GPT-4.5 on broad knowledge tasks
- HumanEval: Strong coding performance
- Math reasoning: Improved mathematical problem-solving
- Instruction following: Better adherence to complex prompts
- Multilingual: Strong performance in Chinese and English
What Changed
Compared with the previous V3 release:
- Training data: Updated with more recent data through early 2026
- Alignment: Improved RLHF and instruction tuning
- Efficiency: Better inference optimization for consumer hardware
- Safety: Enhanced guardrails and refusal accuracy
Why It Matters
The release has significant implications:
- Open-source competitiveness: Proves open models can match proprietary ones
- Cost disruption: Organizations can run frontier-level AI without API fees
- Geopolitical: Chinese AI labs producing models competitive with US counterparts
- Hardware requirements: Runnable on multi-GPU consumer setups, not just data centers
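The hardware point can be made concrete with back-of-the-envelope arithmetic: only 37B parameters fire per token, which cuts compute, but all 671B must still be resident in memory. The bytes-per-parameter figures below are common quantization levels used as illustrative assumptions, not DeepSeek's official deployment numbers:

```python
# Back-of-the-envelope memory footprint for self-hosting a 671B-parameter
# MoE checkpoint. Bytes-per-parameter values are common quantization
# levels, used here as illustrative assumptions, not official figures.

TOTAL_PARAMS = 671e9   # all experts must be resident in memory
ACTIVE_PARAMS = 37e9   # but only these run per token

def weights_gb(bytes_per_param: float) -> float:
    """Gigabytes needed just to store the model weights."""
    return TOTAL_PARAMS * bytes_per_param / 1e9

for label, bpp in [("FP16", 2), ("FP8", 1), ("INT4", 0.5)]:
    print(f"{label}: ~{weights_gb(bpp):.0f} GB of weights")

# Roughly 5.5% of parameters are active per token, which lowers compute
# cost, but total memory is set by the full parameter count -- hence
# multi-GPU or large-RAM setups rather than a single consumer card.
print(f"active fraction: {ACTIVE_PARAMS / TOTAL_PARAMS:.1%}")
```

Even at aggressive 4-bit quantization the weights alone occupy hundreds of gigabytes, which is why self-hosting targets multi-GPU setups rather than a single card.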
Industry Reaction
The AI community response:
- Developers: Excited about self-hostable frontier capabilities
- Researchers: Access to a high-performing model for academic study
- Companies: Potential to reduce reliance on closed-source AI APIs
Source: DeepSeek Official Announcement