Alibaba Releases Qwen 3.5 Open-Source Model Claiming Frontier-Level Performance
Open-Source Giant From Alibaba Cloud
Alibaba Cloud released Qwen 3.5 on February 16, 2026, under the Apache 2.0 license. Built on a sparse Mixture-of-Experts (MoE) architecture, the model features 397 billion total parameters with 17 billion active parameters—enabling frontier-class performance at a fraction of the compute cost of dense models.
Key Specifications
- Parameters: 397B total / 17B active (MoE)
- Context Window: 256K tokens (open-source); 1M tokens (hosted)
- Multimodal: Native text, image, and video understanding
- Languages: 201 languages supported
- License: Apache 2.0
Benchmark Claims
Alibaba claims Qwen 3.5 matches leading US proprietary models such as GPT-5.2 and Claude 4.5 on key benchmarks. The model also introduces enhanced agentic capabilities for autonomous task execution in commerce and enterprise scenarios.
Impact on Open-Source AI Competition
Qwen 3.5 underscores China's rapid progress in open-source AI. Its MoE design dramatically reduces active compute per inference while targeting top-tier performance, intensifying competition with Meta's Llama series, Mistral, and Hugging Face ecosystem models.
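The active-compute savings of an MoE design can be illustrated with a toy top-k router. This is a generic sketch of MoE routing, not Qwen's actual implementation: each token's gate scores select only k of E experts, so per-token compute scales with active parameters (17B) rather than total parameters (397B).

```python
import numpy as np

def topk_route(logits, k):
    """Pick the top-k experts per token and softmax-normalize their gates."""
    idx = np.argsort(logits, axis=-1)[:, -k:]          # (tokens, k) chosen expert ids
    gates = np.take_along_axis(logits, idx, axis=-1)   # raw scores of chosen experts
    gates = np.exp(gates - gates.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)         # weights over the k winners
    return idx, gates

rng = np.random.default_rng(0)
idx, gates = topk_route(rng.normal(size=(4, 16)), k=2)  # 4 tokens, 16 experts
# Only k/E of the expert parameters run per token; with Qwen 3.5's
# reported 17B-active / 397B-total split, that is about 4.3%.
print(idx.shape, gates.shape, round(17 / 397 * 100, 1))
```

The expert count and k here are arbitrary; the point is that the unchosen experts' weights never participate in the forward pass, which is where the inference-cost reduction comes from.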
The model is available on GitHub and Hugging Face.
Sources: CNBC, Yahoo Finance
Related Articles
Alibaba launched Qwen 3.5, a 397B-parameter open-weight multimodal model supporting 201 languages. The company claims it outperforms GPT-5.2, Claude Opus 4.5, and Gemini 3 on benchmarks, while costing 60% less than its predecessor.
A widely shared r/LocalLLaMA comparison of Qwen's smallest models across three generations (score: 681) reveals extraordinary efficiency gains. The Qwen 3.5 9B now outperforms the previous-generation 80B on several benchmarks, while the 2B handles video understanding better than many 7B models.
Alibaba released the Qwen 3.5 small model series (0.8B, 4B, 9B). The 9B model achieves performance comparable to gpt-oss 20B–120B, making high-quality local inference accessible to users with modest GPU hardware.
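A rough back-of-envelope shows why these sizes fit modest GPUs. This is an estimate, not a measured figure: it assumes 4-bit quantized weights and an assumed 20% overhead for activations, KV cache, and runtime buffers.

```python
def weight_vram_gb(params_billions, bits_per_weight, overhead=1.2):
    """Estimate VRAM for model weights: params * bits/8 bytes, plus
    ~20% overhead for activations and KV cache (assumption)."""
    return params_billions * bits_per_weight / 8 * overhead

for size in (0.8, 4, 9):
    print(f"{size}B @ 4-bit ≈ {weight_vram_gb(size, 4):.1f} GB")
```

Under these assumptions the 9B model needs roughly 5.4 GB, within reach of common 8 GB consumer GPUs; real-world usage varies with context length and quantization scheme.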