Articles tagged "moe" (4 articles)
HI-MoE: Hierarchical Instance-Conditioned Mixture-of-Experts for Object Detection
AI · 2026-04-07T21:31:49.551Z · Src: 2026-04-07T00:00:00.000Z
Tags: moe, object detection, detr
REAM: Merging Instead of Pruning Mixture-of-Experts Preserves Performance While Cutting Memory
AI · 2026-04-07T16:07:53.324Z · Src: 2026-04-07T00:00:00.000Z
Tags: moe, mixture of experts, llm
Hypura: Run LLMs Larger Than Your Mac's Memory Using Storage-Aware Scheduling
AI · 2026-03-25T04:07:35.527Z · Src: 2026-03-25T00:00:00.000Z
Tags: ai, llm, apple
DeepSeek Releases V3-0322: Open-Source Model Matching GPT-4.5 on Key Benchmarks
AI · 2026-03-22T23:30:11.276Z
DeepSeek released V3-0322, an open-source MoE model with 671B total / 37B active parameters that matches GPT-4.5 on key benchmarks while remaining fully self-hostable under the MIT license; a brief sketch of the total-vs-active split appears below.
Tags: deepseek, open source, gpt-4.5
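The 671B total / 37B active figure reflects Mixture-of-Experts sparse activation: a learned router picks only a few experts per token, so only a small fraction of the model's weights participate in any single forward pass. The following is a minimal, illustrative PyTorch sketch of top-k expert routing; the layer sizes, expert count, and top-k value are hypothetical and are not DeepSeek V3-0322's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Toy MoE layer: route each token to its top-k experts (sizes are illustrative)."""
    def __init__(self, d_model=64, d_ff=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        scores = self.router(x)                            # (n_tokens, n_experts)
        weights, idx = torch.topk(scores, self.k, dim=-1)  # keep only the k best experts
        weights = F.softmax(weights, dim=-1)               # renormalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                   # tokens whose slot-th pick is expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

moe = TopKMoE()
total = sum(p.numel() for p in moe.parameters())
# Per token, only the router plus k (here 2) equally sized experts run:
active = sum(p.numel() for p in moe.router.parameters()) + \
         2 * sum(p.numel() for p in moe.experts[0].parameters())
print(f"total params: {total:,}  ~active per token: {active:,}")
```

Printing total versus per-token active parameters for this toy module shows the same pattern at small scale: the expert pool dominates the total count, while each token only pays for the k experts it is routed to.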