Qwen3-Coder-Next: Alibaba’s Open-Source Model Boosts Efficient Coding

This article was generated by AI and cites original sources.

Alibaba’s Qwen team has released Qwen3-Coder-Next, an 80-billion-parameter open-source model designed for high-performance coding assistance. Available under the Apache 2.0 license, the model uses a hybrid architecture intended to sidestep the scaling bottlenecks of standard Transformer attention.

The core technical advance of Qwen3-Coder-Next is its hybrid of Gated DeltaNet and Gated Attention layers, which the team says delivers roughly 10x higher throughput on repository-level tasks than dense models of similar capacity. The hybrid design lets the model retain strong reasoning capability while keeping deployment costs low.
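To make the DeltaNet side of the hybrid concrete, here is a minimal sketch of the gated delta rule that DeltaNet-style linear-attention layers use: the model keeps a fixed-size state matrix per head and updates it token by token, rather than attending over the full sequence. This is a conceptual illustration of the published gated delta rule, not Qwen's actual implementation; all function names and the scalar-gate simplification are assumptions.

```python
import numpy as np

def gated_delta_step(S, k, v, alpha, beta):
    """One token update of a gated delta rule (conceptual sketch,
    not Qwen3-Coder-Next's actual kernel).

    S     : (d_v, d_k) recurrent state matrix
    k, v  : key / value vectors for the current token
    alpha : scalar decay gate in (0, 1] -- forgets old state
    beta  : scalar write gate in (0, 1] -- strength of the delta update
    """
    k = k / np.linalg.norm(k)  # keys are typically L2-normalized
    # Decay the old state, erase the component stored along k,
    # then write the new key/value association:
    #   S <- alpha * S (I - beta * k k^T) + beta * v k^T
    S = alpha * (S - beta * np.outer(S @ k, k)) + beta * np.outer(v, k)
    return S

def gated_delta_readout(S, q):
    """Linear-attention readout: a state-query product, O(d_v * d_k)
    per token regardless of sequence length."""
    return S @ q
```

Because the state has constant size, cost per token does not grow with context length, which is the intuition behind the throughput claim for repository-scale inputs; the occasional Gated Attention layers in the hybrid restore exact token-to-token lookups where they matter.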

Qwen3-Coder-Next supports 370 programming languages, offers XML-style tool calling, and is trained with an emphasis on repository-level data. Specialized expert models for web development and user experience further extend its capabilities across coding tasks.
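XML-style tool calling means the model emits structured markup that an agent harness parses into a function invocation. The sketch below shows the general pattern with Python's standard library; the tag names and schema here are invented for illustration and are not Qwen3-Coder-Next's documented format.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML-style tool call, roughly as a coding model might emit it.
# The <tool_call>/<name>/<parameter> schema is an assumption for illustration.
raw = """
<tool_call>
  <name>read_file</name>
  <parameter name="path">src/main.py</parameter>
</tool_call>
"""

def parse_tool_call(text: str):
    """Extract the tool name and its parameters from an XML-style call."""
    root = ET.fromstring(text.strip())
    name = root.findtext("name")
    params = {p.get("name"): p.text for p in root.findall("parameter")}
    return name, params
```

An agent loop would feed `parse_tool_call(raw)` into a tool dispatcher, here yielding the tool name `read_file` with `{"path": "src/main.py"}`; XML's explicit closing tags make truncated or malformed calls easy to detect compared with loosely delimited JSON.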

This release challenges closed-source coding models by demonstrating that a lean, efficiency-first design can still support agentic coding. By prioritizing long-context handling and throughput, Qwen3-Coder-Next sets a new bar for cost-effective coding assistance.

Source: VentureBeat