Tongyi Qianwen Unveils Trillion-Parameter Model Qwen3-Max-Preview with Performance Boost but Open-Source Ambiguity
Alibaba Cloud has launched its largest Tongyi Qianwen model to date, Qwen3-Max-Preview, which surpasses 1 trillion parameters and shows improved performance in dialogue, agent tasks, and instruction following. The model is currently available in preview via Qwen Chat and the Alibaba Cloud API, but the core question of whether it will be open-sourced remains unanswered.
The parameter race has reached a new order of magnitude: Tongyi Qianwen has just dropped a trillion-parameter bombshell named Qwen3-Max-Preview. Officially, it outperforms its predecessor Qwen3-235B in dialogue fluency, complex task handling, and instruction comprehension.

Interestingly, the comments section is flooded with two dominant voices: half demanding "When will it be open-source?" while the other half are already experimenting on platforms like AnyCoder. This divide perfectly mirrors the current state of the AI community—eager for cutting-edge breakthroughs yet committed to maintaining open-source ecosystems.
On technical specifics, the official release mentions only benchmark improvements over previous versions and avoids direct comparisons with GPT-5 Pro or Gemini 2.5 Pro. This selective disclosure recalls one netizen's quip: "Trillion parameters are like the numbers on gym weight plates; what really matters is what you can lift with them."
Currently accessible through:
- [Qwen Chat](https://chat.qwen.ai)
- [Alibaba Cloud API](https://modelstudio.console.alibabacloud.com/?tab=doc#/doc/?type=model&url=2840914_2&modelId=qwen3-max-preview) (a minimal call sketch follows this list)
- [AnyCoder integration](https://huggingface.co/spaces/akhaliq/anycoder)
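
For those taking the API route, here is a minimal Python sketch of a chat call. It assumes Alibaba Cloud Model Studio's OpenAI-compatible endpoint and a `DASHSCOPE_API_KEY` environment variable, both standard Model Studio conventions rather than anything confirmed in this announcement; the model ID `qwen3-max-preview` is taken from the API link above.

```python
# Minimal sketch: calling Qwen3-Max-Preview through Alibaba Cloud Model
# Studio's OpenAI-compatible endpoint. Assumptions: the base URL and the
# DASHSCOPE_API_KEY environment variable follow Model Studio conventions;
# the model ID "qwen3-max-preview" comes from the API link above.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DASHSCOPE_API_KEY"],  # your Model Studio API key
    base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
)

response = client.chat.completions.create(
    model="qwen3-max-preview",
    messages=[
        {"role": "user", "content": "In one paragraph, what changes at the trillion-parameter scale?"},
    ],
)
print(response.choices[0].message.content)
```

Mainland-China accounts typically use `https://dashscope.aliyuncs.com/compatible-mode/v1` instead of the international endpoint; in either case, treat this as a sketch and defer to the linked documentation for the authoritative endpoint and model ID.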
Notably, while the Tongyi Qianwen series is renowned for open-source contributions, the Max line has consistently remained proprietary. Faced with relentless queries like "Will it be open-source?", the official account has gone radio silent. This ambiguity may hint at a new balancing act between commercialization and open-source—after all, the electricity bill for training trillion-parameter models isn’t something idealism alone can cover.
(Note: All feature screenshots and performance data are from official sources; actual results may vary by use case.)
Published: 2025-09-05 23:43