Chinese search giant Baidu said on Tuesday that ERNIE Bot, its competitor to ChatGPT, has achieved a 10-fold improvement in inference efficiency just one month after its release, cutting the cost of large language model (LLM) inference to one-tenth of its original level. Inference, the process of running a trained LLM to generate output, mostly takes place on graphics processing units (GPUs).

Why it matters: Baidu isn’t alone in looking to offer lower-cost AI products. Other tech majors in China such as Tencent and Alibaba have announced recently that they are introducing lower-priced AI products due to efficiency gains in the field.

Details: This development enables more small and medium companies to access large model technologies at reduced prices through tech firms’ cloud services. Meanwhile, the efficiency gains allow the tech giants to grab cloud market share at a low cost.

  • Baidu plans to “significantly lower” the threshold for enterprises deploying large models through three services, the company said in a Wednesday statement sent to TechNode. These services include directly using ERNIE Bot’s inference capability, training industry-specific large models on high-quality, accurate business data, and deploying models on Baidu’s cloud service for more stable and efficient operation.
  • E-commerce giant Alibaba is also looking to generate revenue from its newly unveiled model Tongyi Qianwen, announcing on Wednesday that it will launch an AI co-development program for customers in the transportation, petrochemical, and telecommunications industries.
  • On the same day, Alibaba’s cloud unit also announced its largest price cut amid the expansion of China’s cloud computing market. The prices of its core products are set to be reduced by 15% to 50%.
  • Tencent, another tech heavyweight in the country, has rolled out a digital human-production platform that it says can create a digital human within 24 hours while cutting production costs from millions of yuan to thousands.

Context: Since OpenAI’s ChatGPT gained worldwide popularity, numerous Chinese tech companies have declared an intention to enter the field of generative AI based on large models. However, training such models is extremely expensive. According to a report by state-owned financial services company Guosheng Securities, training GPT-3 costs about $1.4 million per training run, while over $2 million is needed to train larger LLMs.

Cheyenne Dong is a tech reporter now based in Shanghai. She covers e-commerce and retail, AI, and blockchain. Connect with her via e-mail: cheyenne.dong[a]