open-source-LLM
2 articles tagged with this topic
Qwen · Qwen3.6-35B-A3B
Qwen3.6-35B-A3B released!
Alibaba's Qwen team releases a 35B-parameter sparse MoE model with only 3B active parameters, under the Apache 2.0 license.
Apr 16 · 3 min read
Horus-1.0 · TokenAI
Egypt's Horus-1.0-4B: First Open-Source LLM Built From Scratch
TokenAI releases Horus-1.0-4B, a 4B-parameter multilingual LLM trained from scratch in Egypt, with an 8K context window and 7 weight variants.
Apr 8 · 3 min read