Ant touts AI breakthrough built on Chinese chips



SHANGHAI: Jack Ma-backed Ant Group Co has used Chinese-made semiconductors to develop techniques for training artificial intelligence (AI) models that would cut costs by 20%, according to people familiar with the matter.

Ant used domestic chips, including from affiliate Alibaba Group Holding Ltd and Huawei Technologies Co, to train models using the so-called Mixture-of-Experts (MoE) machine learning approach, the people said.

It got results similar to those from Nvidia Corp chips like the H800, they said, asking not to be named as the information isn’t public.

Ant is still using Nvidia for AI development but is now relying mostly on alternatives, including chips from Advanced Micro Devices Inc and Chinese suppliers, for its latest models, one of the people said.

The models mark Ant’s entry into a race between Chinese and US companies that’s accelerated since DeepSeek demonstrated how capable models can be trained for far less than the billions invested by OpenAI and Alphabet Inc’s Google.

It underscores how Chinese companies are trying to use local alternatives to the most advanced Nvidia semiconductors.

While not the most advanced, the H800 is a relatively powerful processor that the United States currently bars from export to China.

The company published a research paper this month claiming its models at times outperformed those of Meta Platforms Inc on certain benchmarks, results Bloomberg News hasn't independently verified.

But if they work as advertised, Ant’s platforms could mark another step forward for Chinese AI development by slashing the cost of inferencing or supporting AI services.

As companies pour significant money into AI, MoE models have emerged as a popular option, gaining recognition for their use by Google and Hangzhou startup DeepSeek, among others.

The technique splits a model into many smaller specialist networks and routes each piece of work to only a few of them, much like a team of specialists who each handle one segment of a job, making training and inference more efficient.
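Neither Ant's paper nor the exact architectures used by DeepSeek or Google are detailed in the report; the snippet below is only a minimal, illustrative sketch of the general MoE idea in PyTorch, with hypothetical sizes (d_model, n_experts, top_k) chosen for demonstration.

```python
# Minimal, illustrative sketch of a Mixture-of-Experts (MoE) layer in PyTorch.
# All sizes (d_model, n_experts, top_k) are hypothetical examples, not values
# from Ant's or DeepSeek's models.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_hidden=2048, n_experts=8, top_k=2):
        super().__init__()
        # Each "expert" is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(n_experts)
        )
        # The router ("gate") scores how relevant each expert is to each token.
        self.gate = nn.Linear(d_model, n_experts)
        self.top_k = top_k

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.gate(x)                               # (batch, seq, n_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)  # keep only the top-k experts
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(x)
        # Only the selected experts run for each token, so most parameters stay
        # idle on any given forward pass -- the source of MoE's efficiency.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., k] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = MoELayer()
    tokens = torch.randn(2, 16, 512)     # a toy batch of token embeddings
    print(layer(tokens).shape)           # torch.Size([2, 16, 512])
```

Because each token activates only a couple of experts rather than the whole network, total parameter count can grow without a proportional rise in compute per token, which is why the approach is attractive when top-end GPUs are scarce or expensive.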

Ant declined to comment in an emailed statement. Training MoE models typically relies on high-performance chips like the graphics processing units (GPUs) Nvidia sells.

The cost has, to date, been prohibitive for many small firms and limited broader adoption.

Ant has been working on ways to train large language models more efficiently and eliminate that constraint.

The paper's title makes that clear, setting out the goal of scaling a model “without premium GPUs”. — Bloomberg
