China's Open-Source Models Overtake the US: The Qwen Effect and the Shifting AI Landscape

1. Introduction: A Sudden Reversal in the Global AI Open-Source Map

The year 2025 will be remembered for "The Flip", a historic turning point in artificial intelligence. In mid-2025, the cumulative download volume of Chinese open-source AI models surpassed that of their US counterparts for the first time, decisively redrawing the global AI map. What makes the shift so dramatic is its speed: in November 2023, US models held a 60% market share while China had just 25%; by September 2025, the situation had completely reversed, with China accounting for 65% of new downloads and the US dropping to 30%. The reversal even caught the attention of mainstream media such as The Washington Post, which confirmed in an October 13, 2025 report that "China now leads the U.S. in this key part of the AI race".

In response to this challenge, the United States urgently established the "American Truly Open Models" (ATOM) project in August 2025, which aims to rebuild its open-source AI advantage and has received support from Silicon Valley heavyweights like OpenAI's Chief Strategy Officer Jason Kwon and Hugging Face CEO Clement Delangue. All of this raises a central question: What allowed Chinese models to achieve this exponential growth in such a short time?

2. The Data Explosion: Chinese Models Dominate by the Numbers

The trend is definitive. According to The Atom Project, as of October 2025, cumulative downloads of Chinese open-source models had exceeded 550 million, about 75 million ahead of US models' 475 million, and the gap continues to widen. In comparison, European models, led primarily by Mistral, maintained a stable 5-10% market share, with about 100 million cumulative downloads.

Even more telling is the shift in the share of derivative models (fine-tunes). In October 2025, Alibaba's Qwen series alone accounted for over 40% of new open-source models on Hugging Face, while the former leader, Meta's Llama series, had dropped to 15%. Based on Hugging Face data, Alibaba's download rate had already surpassed Meta's by the end of September 2025, putting it on track to become the leader in the 1B+ parameter model category by the end of November.

3. Alibaba's Qwen: The Catalyst for Critical Mass

At the heart of this shift is Alibaba's Qwen series, which acted as the primary engine driving the transition from incremental progress to market dominance.

3.1 Qwen's Unmatched Ecosystem Expansion

The expansion of Qwen's ecosystem is staggering: by mid-2025, the Qwen family had been downloaded more than 400 million times, with over 140,000 derivative models built on top of it on Hugging Face.

3.2 Performance and Technical Breakthroughs

On key benchmarks, Qwen 2.5-72B achieved an MMLU score of 86.8, edging out Llama 3.1-405B's 86.2 despite having far fewer parameters. This is thanks in part to its 18-trillion-token training corpus, which delivered an 18%+ performance boost over Qwen 2. The Qwen 2.5 series spans 7 parameter sizes from 0.5B to 72B, covering needs from edge devices to cloud deployment.

3.3 Specialized Dominance: Multimodality and Coding

Qwen has also made major breakthroughs in specialized capabilities. On the coding side, Reuters reports that Qwen3-Coder rivals OpenAI's GPT-4 on code tasks; on the multimodal side, models such as Qwen-Omni and Qwen3-VL extend the family beyond text. Both further enhance its appeal to the developer community.

4. The Rise of China's Open-Source AI Ecosystem

Qwen is the flagship, but it sails in a formidable fleet powered by policy, capital, and a diverse range of models.

4.1 Model Quantity and National Share

Of the approximately 3,755 publicly released LLMs worldwide, 1,509 come from China, accounting for about 40%. Although Chinese entities account for only 5.2% of total downloads on Hugging Face due to access restrictions (the "Hugging Face China Paradox"), this data does not reflect true adoption rates. More representatively, Chinese developers contribute 31.8% of models in the large model space (7.5B+ parameters), demonstrating their focus on high-capability systems. Third-party trackers like The Atom Project bypass platform limitations, revealing a more realistic picture of global adoption trends.

4.2 Policy and Ecosystem Strategy: The "AI+" Plan

China's "AI+" plan is the top-level design driving this ecosystem's rise. First proposed in March 2024 and fully approved in July 2025, the plan sets clear and ambitious goals.

4.3 Projected Industry Scale

Behind the "AI+" plan is a massive economic vision. In 2024, the scale of China's core AI industry reached 500 billion RMB (approx. $70B USD), with over 4,300 related enterprises. The national goal is for the core AI industry to exceed 1 trillion RMB (approx. $140B USD) by 2030, with related industries reaching 10 trillion RMB. Morgan Stanley forecasts that the total value of China's AI industry could eventually reach $1.4 trillion, with related investments expected to break even by 2028 and achieve a 52% return on invested capital by 2030.

5. The Rising Second Wave of Innovators

Beyond Alibaba, a powerful "second tier" of Chinese models has emerged, creating a complementary and robust "China Model Matrix."

  • DeepSeek: Extreme Efficiency & Reasoning. Forced to innovate after US chip controls, it developed highly efficient training methods: its R1 model's training cost was only $5-6M, a fraction of GPT-4's. Its API is compatible with OpenAI's, allowing for easy switching.

  • Moonshot (Kimi): Long Context & Agentic Intelligence. The Kimi K2 model boasts 1 trillion parameters (MoE architecture) and excels at code generation and tool use. Its API is also compatible with OpenAI's structure.

  • Zhipu AI (GLM), a startup with a Tsinghua University background: Efficient MoE & Dual-Mode System. The GLM-4.5 series uses a Mixture-of-Experts architecture to save compute, and its "Thinking Mode" is designed for complex reasoning. It offers an extremely low-cost API and uses a permissive MIT license.

  • Baidu (ERNIE), China's search giant, which strategically shifted from closed to open source: Full Open-Source & Ecosystem Building. In June 2025, it officially open-sourced the ERNIE 4.5 series, including 10 models under the permissive Apache 2.0 license.

  • ByteDance (Seed), the parent company of TikTok: Long Context & Flexible Control. The Seed-OSS-36B model offers a 512K native long-context capability, and an innovative "Thinking Budget" feature lets developers balance inference depth and speed.
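The "Thinking Budget" idea can be sketched in a few lines. This is an illustrative stand-in, not ByteDance's actual API: a stub generator plays the role of the model, and the budget simply caps how many reasoning tokens are produced before the answer phase begins.

```python
def generate_with_budget(generate_step, budget: int):
    """Cap 'thinking' at `budget` tokens, then produce the answer uncapped.

    generate_step(phase) -> next token string; a real model would go here.
    """
    thinking, answer = [], []
    for _ in range(budget):
        tok = generate_step("think")
        if tok == "<done>":          # model chose to stop reasoning early
            break
        thinking.append(tok)
    while True:                      # the answer phase is not budget-limited
        tok = generate_step("answer")
        if tok == "<eos>":
            break
        answer.append(tok)
    return thinking, answer


def make_stub():
    """Stub model: would 'think' for 10 steps if allowed, then answer in 2."""
    state = {"think": 0, "answer": 0}
    def step(phase):
        state[phase] += 1
        if phase == "think":
            return "t%d" % state["think"] if state["think"] <= 10 else "<done>"
        return "a%d" % state["answer"] if state["answer"] <= 2 else "<eos>"
    return step

# With budget=4 the stub's reasoning is cut to 4 tokens; the answer is unaffected.
thinking, answer = generate_with_budget(make_stub(), budget=4)
```

A larger budget lets the model reason longer at the cost of latency; a smaller one trades depth for speed, which is exactly the knob such a feature exposes.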

6. Shifting Community Feedback and International Perception

The international developer community and capital markets have taken notice. Martin Casado, a partner at a16z, revealed that 80% of US AI startups pitching to them are using Chinese open-source models. America's hottest AI coding startups, such as Cursor and Cognition, have also been found to be quietly building on top of Chinese LLMs.

On technical forums like Reddit's r/LocalLLaMA, developers now treat Qwen as a peer competitor to Llama, widely discussing its pros and cons. Kai-Fu Lee, CEO of 01.AI, was blunt at the TED AI 2025 conference, stating that the top 10 open-source models all come from China and that America is already losing the AI hardware war. As early as December 2024, Hugging Face CEO Clement Delangue predicted that China would surpass the US in 2025.

7. Future Outlook: The New Geopolitics of Open-Source AI

This trend is still accelerating.

  • Predictions & Trends: Based on current download rates, Alibaba is on track to officially surpass Meta as the download leader in the 1B+ parameter model space by the end of November 2025. Looking ahead, multimodal fusion (e.g., Qwen-Omni and Qwen3-VL), open APIs (with OpenAI-compatible formats as the de facto standard), and cross-platform compatibility are set to become mainstream.

  • Potential Risks: Geopolitics also introduces risks. Regulatory divergence (e.g., differences between the EU's AI Act and China's policies), the fragmentation of model standards, and barriers to international collaboration could all become future challenges. While US chip export controls were intended to be restrictive, they paradoxically spurred technological innovation in China, leading to more efficient architectures and training methods.

Frequently Asked Questions: China's AI Leadership and the Global Competitive Landscape

1. Is China's AI now definitively "better" than America's?

Not necessarily in every aspect, but it has achieved leadership in the critical area of open-source adoption and momentum. By mid-2025, cumulative downloads and fine-tunes on Chinese models surpassed American peers, marking a historic tipping point in open AI dominance. In terms of performance, top Chinese models like Qwen 2.5 have surpassed specific Western counterparts like Llama 3.1 on key benchmarks. The shift is less about a single "best" model and more about China's dominance in building a comprehensive and rapidly growing open-source AI ecosystem.

2. What are the key factors behind Alibaba's Qwen model series being so successful?

Qwen's success is driven by a combination of critical factors:

Top-Tier Performance: It consistently scores at or near the top in major benchmarks. Several versions of Alibaba's Qwen3 models, released in late April, outperform Meta's latest Llama 4 models according to AI model leaderboards LiveBench and Artificial Analysis.

Comprehensive Model Family: It offers a wide range of model sizes, allowing developers to choose the right balance of performance and efficiency for everything from mobile devices to cloud servers.

Ecosystem Dominance: By mid-2025, the Qwen family had been downloaded more than 400 million times, with over 140,000 derivative models built on top, cementing Alibaba's place at the center of open-source AI. With broad community engagement and model diversity, Qwen's open ecosystem mirrors Android's scale, while OpenAI's GPT-OSS plays the constrained "iOS" counterpart.

Strategic Shift to Open-Source: Throughout 2024, Alibaba's balance gradually shifted toward open-source models as they began drawing steady feedback from developers in China and the U.S., startups, academic researchers, and PhD students who started using them to build custom AI systems.

3. How did US chip sanctions impact China's AI progress?

China's achievements in efficiency are no accident. They are a direct response to the escalating export restrictions imposed by the US and its allies. By limiting China's access to advanced AI chips, the US has inadvertently spurred its innovation.

DeepSeek's V3 model cost roughly one-tenth as much to train as OpenAI's GPT-4, whose training was estimated at US$63 million. Key optimizations that reduced reliance on expensive hardware include innovations in model architecture, training frameworks, and algorithms.
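As a quick sanity check, the two cost figures quoted in this article line up: one-tenth of GPT-4's estimated $63M is about $6.3M, the same order of magnitude as the $5-6M training cost cited for R1. In Python:

```python
gpt4_training_cost = 63_000_000    # GPT-4's estimated training cost (USD)
v3_cost = gpt4_training_cost / 10  # DeepSeek V3 cited as roughly one-tenth
# ~$6.3M, consistent with the $5-6M range cited for DeepSeek's R1.
```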

4. Why are American startups, including those backed by top VCs, using Chinese open-source models?

US startups are increasingly adopting Chinese models for pragmatic reasons: performance and cost.

Cost-Effectiveness: DeepSeek's R1 open-source reasoning model was unprecedentedly cheap at only 1 yuan ($0.14) per 1 million tokens. Nvidia CEO Jensen Huang noted that "America wins when models like DeepSeek and Qwen run best on American infrastructure".

High Performance: These models demonstrate leading-edge capabilities, particularly in valuable commercial areas like coding, math, and reasoning. Reuters reports Qwen3-Coder rivals OpenAI's GPT-4 on code tasks.

Ease of Integration: Most major Chinese models offer APIs that are fully compatible with OpenAI's standards, allowing developers to switch seamlessly by changing just a few lines of code.
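Because the request shape is identical across OpenAI-compatible APIs, "switching" really is a matter of changing a base URL and a model name. A minimal stdlib-only sketch of the idea; the endpoint URLs and model names below are illustrative assumptions, so check each provider's documentation for current values.

```python
# "OpenAI-compatible" means the /chat/completions request body is identical;
# only the endpoint and model name differ. URLs/models here are assumptions.
PROVIDERS = {
    "openai":   {"base_url": "https://api.openai.com/v1",
                 "model": "gpt-4o"},
    "deepseek": {"base_url": "https://api.deepseek.com/v1",
                 "model": "deepseek-chat"},
    "qwen":     {"base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1",
                 "model": "qwen-plus"},
}

def build_chat_request(provider: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion request for any listed provider."""
    cfg = PROVIDERS[provider]
    return {
        "url": cfg["base_url"] + "/chat/completions",
        "body": {
            "model": cfg["model"],
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request("deepseek", "Hello")
```

Sending `req["body"]` as JSON to `req["url"]` with a provider API key is all a real client adds, which is why drop-in SDKs can switch providers by overriding a single base-URL parameter.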

5. What are the main risks or challenges associated with this new AI landscape?

The primary risks stem from geopolitical and regulatory fragmentation:

Regulatory Divergence: Different regions are creating their own rules, such as the EU's AI Act and China's "AI+" plan. This could create compliance challenges for developers building global products.

Splintering of Technical Standards: Open models have proved impossible to contain yet strategically essential: training know-how and infrastructure have diffused globally, making model restriction nearly impossible, while openness remains the West's key advantage. Without global coordination, competing definitions of "open-source AI" and divergent technical standards could make it difficult for models and tools from different ecosystems to work together.

Barriers to International Cooperation: Geopolitical tensions, trade restrictions, and concerns over data sovereignty could hinder the cross-border collaboration that has fueled the open-source movement's success.

6. What is the significance of China releasing over 1,500 open-source LLMs?

China has released approximately 1,509 of the world's ~3,755 publicly released LLMs, far more than any other country. This explosion reflects heavy state and industry investment in domestic AI, open licensing (often Apache- or MIT-style), and a strategic pivot by Chinese tech giants and startups toward publicly shared models.

7. How has Alibaba's commitment to open-source changed the AI landscape?

Alibaba CEO Wu remarked during an earnings call that the company is firmly committed to open-source AI. "We believe the full open sourcing of Qwen3 will drive innovation and the new applications by developers, startups and enterprises," he said. This strategic pivot represents a fundamental shift in how AI is being developed globally, with open models becoming central to the evolution of AI as a whole.

8. What does the future hold for the US in open-source AI?

The West needs its own open model strategy. To sustain innovation, reduce power concentration, and project soft influence abroad, investment in domestic open models is urgent and achievable. OpenAI's GPT-OSS and Ai2's OLMo efforts are positive signals, yet underfunded relative to China's scale and insufficient to reclaim leadership.

Expect acceleration in agentic open models, a tightening race between Qwen and GPT-OSS ecosystems, and growing U.S. realization that open infrastructure is critical to maintain global AI influence.

9. Conclusion: From "Catch-Up" to "Leadership"

The rise of China's open-source AI, led by the "Qwen effect," is the result of synergy between corporate innovation and a national open-source strategy. From reversing download trends to surpassing performance benchmarks and innovating on cost efficiency, Chinese models are achieving a symbolic transition from "follower" to "leader" on multiple fronts. By attracting global developers with permissive licenses, China is turning "soft power" into the "hard power" of shaping global technical interfaces and standards.

A new, multipolar AI world order is taking shape, driven by performance, efficiency, and openness—and it's built on code that is open for all to see.
