QwenLM/Qwen-Agent

Alibaba's production-grade agent framework built on the Qwen model family. Powers the backend of Qwen Chat and supports parallel function calling, MCP integration, sandboxed code execution, and RAG for documents up to 1 million tokens. Works with any OpenAI-compatible API endpoint including vLLM and Ollama — not locked to Qwen models.

ai agent framework
Python

Why It Matters

Qwen-Agent is not a weekend project: it is the production backend behind Qwen Chat, Alibaba's flagship AI assistant. With native MCP support, a Docker-based sandboxed code interpreter, and RAG over documents up to 1M tokens, it competes directly with frameworks such as LangChain and CrewAI. The Apache-2.0 license and compatibility with any OpenAI-compatible endpoint make it a serious choice for teams that want framework flexibility without proprietary lock-in. Trending at +586 stars/day in March 2026.
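The "any OpenAI-compatible endpoint" claim can be sketched as a plain LLM config dict pointed at a local vLLM or Ollama server. The key names (`model`, `model_server`, `api_key`) follow the pattern shown in Qwen-Agent's README; the endpoint URL and model name below are placeholders, and the `Assistant` wiring is shown as a hedged sketch rather than a verified invocation:

```python
# Minimal sketch: configuring Qwen-Agent against a local OpenAI-compatible
# endpoint (e.g. vLLM serving a Qwen model). URL and model are placeholders.
llm_cfg = {
    "model": "Qwen2.5-7B-Instruct",              # whatever model the endpoint serves
    "model_server": "http://localhost:8000/v1",  # vLLM / Ollama OpenAI-compatible URL
    "api_key": "EMPTY",                          # local servers typically ignore the key
}

# With the qwen_agent package installed, this config would plug into an
# agent roughly like so (sketch, not a verified call sequence):
#
#   from qwen_agent.agents import Assistant
#   bot = Assistant(llm=llm_cfg, function_list=["code_interpreter"])
#   for response in bot.run(messages=[{"role": "user", "content": "Plot sin(x)"}]):
#       print(response)

print(llm_cfg["model_server"])
```

Because the config is just a dict targeting a standard `/v1` chat-completions route, swapping Qwen for any other model the endpoint serves is a one-line change.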

Repository Stats

Stars
15.6k
Forks
1.5k
Last Commit
March 4, 2026
