
US crackdown threat could shake out China’s ‘distillation’ AI copycats: analysts

South China Morning Post | Minxiao Chang, Coco Feng | 2-min read

US plans action against AI ‘distillation’, as analysts warn weaker Chinese start-ups could be forced out within a year

Even among more capable developers, distillation is often used to accelerate AI iteration. Photo: EPA
Minxiao Chang in Shenzhen and Coco Feng in Guangdong | Published: 11:00pm, 24 Apr 2026 | Updated: 11:08pm, 24 Apr 2026

The administration of Donald Trump has threatened action to shield the US artificial intelligence industry from being “distilled” by Chinese rivals, in a move analysts say could weed out weaker players in China’s AI sector within a year.

In a memo released on Thursday, Michael Kratsios, the science and technology adviser to the US president, warned that “surreptitious, unauthorised distillation campaigns” were enabling foreign entities – mainly in China – to release models that appear to match leading systems on select benchmarks at a fraction of the cost.

“Distillation” was a widely used technique in which a smaller “student” model was trained on the outputs of a more advanced “teacher” model, allowing developers to replicate capabilities more cheaply, said Helen Toner, interim executive director at Georgetown University’s Center for Security and Emerging Technology, during testimony before the Senate on Wednesday.
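The mechanism Toner describes can be sketched in a few lines. The example below is a toy illustration only: the linear “teacher” weights, the sample data and the training loop are all invented for exposition and bear no relation to any company’s actual models or pipeline. It shows the core idea, a small “student” trained to reproduce a “teacher’s” soft (probabilistic) outputs rather than ground-truth labels.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical "teacher": a fixed logistic model with known weights (2.0, -1.0).
# In real distillation this would be a large, expensive model queried via its API.
def teacher(x):
    return sigmoid(2.0 * x[0] - 1.0 * x[1])

# Collect inputs and record the teacher's soft outputs (probabilities, not hard labels).
X = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(300)]
soft_targets = [teacher(x) for x in X]

# "Student": a small model trained by gradient descent to minimise
# cross-entropy against the teacher's soft targets.
w = [0.0, 0.0]
lr = 0.3
for _ in range(2000):
    grad = [0.0, 0.0]
    for x, t in zip(X, soft_targets):
        p = sigmoid(w[0] * x[0] + w[1] * x[1])
        for j in range(2):
            grad[j] += (p - t) * x[j]
    for j in range(2):
        w[j] -= lr * grad[j] / len(X)

# The student recovers weights close to the teacher's (2.0, -1.0)
# without ever seeing the original training data or labels.
print(w)
```

The key point the sketch makes concrete is that the student needs only the teacher's *outputs*, which is why large-scale API access is enough to replicate capabilities at a fraction of the original training cost.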


Some Chinese start-ups had claimed to “self-develop” models while relying heavily on distillation, and such firms lacking original research could be “forced out of the game” within six to 12 months, said Zhang Ruiwang, a Beijing-based information systems architect.

Even among more capable developers, distillation is often used to accelerate iteration. Zhang said this could lengthen development cycles: gaps that might previously have been filled within three months could now take a year or more.

Allegations surrounding the practice have surfaced repeatedly in recent months. Toner, a former board member of OpenAI, told a US Senate hearing this week there was “strong evidence” that Chinese AI firms were using distillation techniques to extract capabilities from US models.