China – Ant Group, the fintech powerhouse backed by Jack Ma, has unveiled a major step forward in China’s bid for technological sovereignty. According to internal sources and a recently published research paper, the company has successfully trained large AI models using domestic semiconductors—including chips from Huawei and Alibaba—reducing training costs by up to 20 percent compared to traditional Nvidia-based systems.
This development marks a potential inflection point in China’s artificial intelligence (AI) strategy. By relying on local alternatives and adopting the Mixture of Experts (MoE) technique, which routes each token through only a small subset of the model’s parameters and thus cuts the compute required per training step, Ant aims to work around the limitations imposed by U.S. export controls on high-performance chips such as Nvidia’s H800.
While Ant has not abandoned Nvidia entirely, its pivot to domestic and AMD hardware reflects a wider national trend of strategic decoupling. The Chinese AI ecosystem is recalibrating, choosing computational efficiency and cost optimization as tools to advance its capabilities without relying on Western technologies.
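For readers unfamiliar with the approach, the sketch below shows a minimal top-k-gated MoE layer in PyTorch. It is purely illustrative: the expert count, layer sizes, and routing scheme are generic assumptions chosen for clarity, not a reproduction of Ant’s Ling architecture.

```python
# Minimal MoE sketch with top-k gating (illustrative only; not Ant's Ling design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts)  # router that scores experts per token
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Each token is processed by only its top-k experts,
        # so most expert weights stay idle per token -- the source of the compute
        # savings relative to an equally large dense layer.
        scores = F.softmax(self.gate(x), dim=-1)         # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # per-token expert choices
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * self.experts[e](x[mask])
        return out

# Example: 8 experts, but each token activates only 2 of them.
layer = MoELayer(d_model=64, d_hidden=256, n_experts=8, top_k=2)
tokens = torch.randn(16, 64)
print(layer(tokens).shape)  # torch.Size([16, 64])
```

The design choice at work is sparsity: total parameter count can grow with the number of experts while per-token compute stays roughly constant, which is why MoE is attractive when top-tier GPUs are scarce.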
The models, Ling-Plus and Ling-Lite, are Ant’s flagship entries in the escalating race for AI supremacy between China and the United States. Though Ling-Plus’s 290 billion parameters fall far short of GPT-4.5’s estimated 1.8 trillion, the models post strong benchmark results, particularly on Chinese-language tasks.
Ant’s research highlights an ambition to scale models “without premium GPUs,” a stark contrast to Nvidia CEO Jensen Huang’s assertion that computing demand will always justify more powerful (and expensive) chips. Instead, Ant’s approach represents an alternative vision of AI development: more accessible, more economical, and locally produced.
These advances come at a time when China is being strategically boxed out of cutting-edge GPU supply chains. With Washington tightening export restrictions and Taiwan’s semiconductor leverage still critical, companies like Ant are under pressure to innovate within constraints.
According to Ant, training one trillion tokens with its optimized approach cost 5.1 million yuan, down from 6.35 million yuan on high-performance hardware. The company acknowledged stability challenges during training, noting that differences in hardware could trigger spikes in error rates, yet its results point to resilience in resource-constrained development environments.
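As a quick sanity check, the arithmetic below uses only the yuan figures quoted above and shows the reduction works out to just under 20 percent, consistent with the headline claim.

```python
# Back-of-the-envelope check of the reported training-cost figures
# (yuan per one trillion tokens, as quoted in the article).
baseline_cost = 6.35e6   # high-performance (Nvidia-class) hardware
optimized_cost = 5.10e6  # Ant's optimized domestic-chip setup
savings = (baseline_cost - optimized_cost) / baseline_cost
print(f"Savings: {savings:.1%}")  # ~19.7%, i.e. roughly the 'up to 20%' cited
```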
Open-sourcing the Ling models further signals China’s intent to build an open, scalable AI foundation independent of U.S. giants. The models are already being applied across sectors, including healthcare, through Ant’s acquisition of the online medical platform Haodf.com, and finance, through advisory tools like Maxiaocai.
Strategically, Ant’s breakthrough sends a broader signal to Beijing and the world: China is adapting rapidly to geopolitical and technological constraints. The ability to create competitive AI models without premium hardware is not just a technical win—it’s a geoeconomic milestone.