Ant Group has developed AI models using Chinese-made chips from firms like Alibaba and Huawei, reducing training costs by around 20%, according to sources.
The company trained its models using a Mixture of Experts (MoE) approach, reportedly achieving results comparable to those produced with Nvidia’s export-restricted H800 chips.
While Ant still uses some Nvidia and AMD hardware, it is increasingly turning to domestic alternatives.
This shift reflects China’s broader effort to build AI capabilities with local hardware in response to U.S. export controls.
Ant claims its models have even outperformed Meta’s in certain benchmarks, though these results have not been independently verified.
MoE models, which save compute by routing each input to only a small subset of specialized “expert” sub-networks rather than activating the whole model, are also being adopted by major players like Google and DeepSeek, underscoring a growing trend in AI development.
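To illustrate where that efficiency comes from, below is a minimal sketch of top-k expert routing in PyTorch. The layer sizes, expert count, and the class name `ToyMoELayer` are illustrative assumptions for a toy example, not details of Ant’s (or any other company’s) actual architecture.

```python
# A minimal sketch of top-k Mixture-of-Experts routing (toy dimensions,
# hypothetical names); not a reconstruction of any production model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    def __init__(self, d_model=64, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each "expert" is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.ReLU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(num_experts)
        )
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x):                               # x: (tokens, d_model)
        scores = self.router(x)                         # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # Only the selected experts run for each token; the rest stay idle,
        # which is where the per-token compute savings come from.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(16, 64)        # 16 toy "tokens"
print(ToyMoELayer()(tokens).shape)  # torch.Size([16, 64])
```

The design trade-off is that total parameters grow with the number of experts, but each token only pays the cost of the few experts it is routed to, so training and inference FLOPs stay far below those of an equally large dense model.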