The 2024 landscape of cloud computing and large-scale models

巴比特 (Babbitt)

If a group of people wants to have a meal together but can't agree on what to eat, what would you recommend to them? I believe many friends would say: hot pot.

Hot pot offers a rich, flexible selection of ingredients and a lively atmosphere when a crowd gathers around it. In much the same way, cloud services have become the first choice for thousands of industries adopting AI.

Since 2023, the public cloud market has been swept up in the large model boom, with major Chinese cloud computing companies rolling out AI cloud services, MaaS offerings, and related products in a dazzling array.

But as you may have noticed, even with AI large models folded into cloud services, the commercial revenue growth they bring to cloud vendors has not matched the popularity of large model technology itself. For months the business stayed "well received but poorly sold": heavy investment, limited output.

There are many factors behind this. Many companies experimenting with large models run only small pilot projects, with uncertain willingness to renew; text tasks consume little cloud capacity; AI-native applications have yet to achieve a major breakthrough, and consumer-facing (ToC) business is less profitable than imagined; users worry about privacy and security and are reluctant to put their data on public clouds; GPU computing power is expensive, so the IaaS model struggles to turn a profit; and B-end projects have long cycles, complex requirements, high customization costs, and thin margins…

However, after nearly a year of "simmering over a slow fire," the "hot pot" of "cloud + large models" has finally started to have a spicy and hot taste.

Mainstream cloud vendors such as Alibaba Cloud, Huawei Cloud, Baidu Cloud, Tencent Cloud, JD Cloud, etc., have seen a sharp increase in the number of successful bids for government and industry intelligence projects. Telecom cloud operators like China Telecom Cloud and China Mobile Cloud have also seen remarkable market share growth over the past year.

How should we understand the future pattern of the cloud service market? Let's focus on the boiling "cloud + large model" hot pot and discuss it while tasting.

Cloud + Large Models, a Hot Start in 2024

It is worth explaining first why the "hot pot model" of cloud plus large models is the more appealing route to intelligence for thousands of industries.

To use AI and train large models on its own, each enterprise would need to buy GPUs, build development platforms, and stand up local data centers. That is like buying ingredients in bulk, renting a storefront, and opening a private restaurant that serves only your own people: far too costly. Public clouds, by contrast, are elastic and scalable, supplying computing power continuously, much like a central kitchen that purchases ingredients in bulk and lets diners take what they need. So it is not that a private chef is unaffordable; it is that hot pot offers better value for money.

For most enterprises and users, putting large model training and other heavy workloads on cloud platforms, flexibly drawing on AI computing power, or directly calling cloud vendors' large model APIs is the more cost-effective and faster route to intelligence.
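To make the "directly calling a cloud vendor's large model API" option concrete, here is a minimal sketch of what such a call usually looks like: one authenticated HTTP request and one parsed reply. The endpoint, model name, environment variable, and response layout below are placeholders in the common chat-completion style, not any particular vendor's actual interface, so the real details should be taken from your provider's documentation.

```python
import os
import requests

# Placeholder endpoint, key, and model name; substitute your cloud vendor's
# actual MaaS endpoint, authentication scheme, and request schema.
API_URL = "https://example-cloud.com/v1/chat/completions"
API_KEY = os.environ["EXAMPLE_CLOUD_API_KEY"]


def ask_model(prompt: str) -> str:
    """Send one prompt to a hosted large model and return its text reply."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "example-llm",  # placeholder model identifier
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    # Chat-completion-style response layout; real vendors may differ.
    return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask_model("Summarize this quarter's sales data in three sentences."))
```

Nothing on the customer's side manages GPUs, clusters, or model weights; that pay-per-call pattern is exactly the cost advantage the "hot pot model" promises.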

Although large models set off a "craze" in the cloud market at the beginning of 2023, cloud vendors' large-model-related business in reality still hinges on a tangle of factors: the gradual commercialization of the models themselves, the input-output ratio of computing power infrastructure, integration with existing business systems, the build-out of a full-stack toolchain, customer relationships, and more.

The result was that the revenue growth large models brought to the public cloud market never showed the same fierce momentum as the first half of 2023 and its "battle of large models." Like a hot pot that keeps bubbling while the food is still not cooked enough to eat, this led many to wonder whether "cloud + large models" was just a story told by vendors, a false proposition.

However, as long as the fire under the large models keeps burning, customers pursuing industry intelligence are not walking away, and the "hot pot model" of delivering AI through cloud services follows sound industrial logic, the "cloud + large model" business will sooner or later have its day at the table.

That day has come.

In the fourth quarter of 2023, the cloud market finally showed the long-awaited bustle. Cloud vendors with large-model-related products, such as Alibaba Cloud, Huawei Cloud, Tencent Cloud, Inspur Cloud, Baidu Cloud, and JD Cloud, won bid after bid for government and enterprise digitalization projects. In November, for example, Baidu Intelligent Cloud won the bid for Postal Savings Bank of China's "large pre-trained model financial scenario application system software development procurement project," and Tencent Cloud won the bid for the medical large model platform development project at Ruijin Hospital's digital medical innovation center.

IDC's report "China Public Cloud Services Market (First Half of 2023) Tracker" shows that the telecom operators' clouds, China Telecom Cloud and China Mobile Cloud, posted strong growth in IaaS, seizing the opportunity to become computing power infrastructure providers.

Overall, after a year of simmering, the "cloud + large model" market has had a hot start in 2024. Looking further into the future, the situation will continue to be "hot."

Demand for large models as advanced tools of production will be released on a much larger scale than last year.

On the policy side, the recently held special promotion meeting on AI for central state-owned enterprises, themed "AI Empowering Industry Renewal," stressed the need to consolidate the development foundation, concentrate major resources on the areas of greatest need and advantage, and accelerate the construction of a batch of intelligent computing centers… building large-model-powered industrial tools spanning infrastructure, algorithms, AI platforms, and solutions. With continued policy incentives, some previously cautious government bodies and large state-owned enterprises are entering the AI large model race and becoming customers of cloud vendors.

On the market side, after the slow recovery of 2023, more and more enterprises have realized they cannot simply wait for the market dividend to return. They need to transform proactively, optimize cost structures, and introduce intelligent technologies to cut costs and raise efficiency if they are to stay competitive in a saturated market. Going digital on the cloud and going intelligent with AI is something enterprises can no longer postpone, and more large-model-related projects will follow.

So, it is not difficult to predict that 2024 will be a year of "cloud + large model" business sprinting.

So, how should cloud vendors operate the "hot pot restaurant" of the AI era and help thousands of industries become intelligent? There are already some answers.

Large Models, a "Sharp Knife"

If a hot pot restaurant lacks clear brand recognition and a distinctive flavor, relying instead on a uniform formula and ingredients, it easily slides into homogeneous competition. That is much like the IaaS model cloud vendors hope to move beyond: undifferentiated and low-margin.

To achieve commercial growth through the MaaS (Model as a Service) model, cloud vendors must first have a very distinct "sharp knife" product - large models, to form their own differentiated competitive advantage.

It is easy to see that the cloud vendors with strong market performance and a steady stream of orders all have formidable AI capabilities: their own "sharp knife" large model products, which bring obvious brand endorsement.

For example, Baidu Cloud's Wenxin Yiyan (ERNIE Bot), Huawei Cloud's Pangu large model, Tencent Cloud's Hunyuan large model, and JD Cloud's Yanxi large model, along with the infrastructure behind them, such as Inspur Cloud's AI servers, Baidu's PaddlePaddle framework and Kunlun chips, Huawei Cloud's Ascend AI hardware and software, Tencent Cloud's vector database, and JD Cloud's Cloud Ship Jinghai, are all evidence of cloud vendors' large model capabilities.

Cloud vendors need to hone these "sharp knife" products to attract industry customers, optimizing them along at least three dimensions:

1. Model capability. The large model product itself must keep improving through continuous innovation, developing more advanced and efficient models, backed by clear branding, positioning, and unique selling points, and it must be integrated into solutions tailored to different industries.

2. Systematization. Moving government and enterprise operations onto the cloud and through digital and AI transformation is a complex, systematic undertaking that requires a wide range of supporting technologies and equipment, not simply calling a large model API. AI-enhanced high-definition cameras and the industrial Internet, for example, need cloud + IoT + AI + networking together. To help customers make good use of large models, cloud vendors therefore need to turn themselves into large "hot pot supermarkets" offering a sufficiently rich selection.

3. Service capability. Enterprises in transition vary widely in their level of digitalization and often lack reserves of technical talent, so truly bringing large models to market is a daunting task for the AI-to-B business. The answer for cloud vendors is comprehensive, "Haidilao-style" service: one-stop MaaS offerings spanning data cleansing, model training, and application deployment, plus full-lifecycle services such as cloud migration, updates, and maintenance.

Take Huawei's Ascend AI Cloud Service as an example: beyond raw computing power, it also provides the heterogeneous computing architecture CANN, the all-scenario AI framework MindSpore, the AI development pipeline ModelArts, and a series of other underlying AI development tools and technology platforms.
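For a rough feel of what developing against one layer of that stack looks like, here is a minimal MindSpore sketch: a toy network written in the framework's nn.Cell style. It runs on CPU as written; on Ascend AI Cloud Service the device_target would be "Ascend" (backed by CANN), and in practice ModelArts would wrap training and deployment around code like this. The network itself is purely illustrative and does not reflect any specific Huawei service API.

```python
import numpy as np
import mindspore as ms
from mindspore import nn, Tensor

# "CPU" keeps this sketch runnable anywhere; on Huawei's Ascend AI Cloud
# Service the backend would be "Ascend", with CANN underneath.
ms.set_context(device_target="CPU")


class TinyNet(nn.Cell):
    """A toy two-layer network, only to show MindSpore's nn.Cell idiom."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Dense(8, 16)
        self.relu = nn.ReLU()
        self.fc2 = nn.Dense(16, 2)

    def construct(self, x):
        # MindSpore uses construct() where other frameworks use forward().
        return self.fc2(self.relu(self.fc1(x)))


net = TinyNet()
x = Tensor(np.random.rand(4, 8).astype(np.float32))
print(net(x).shape)  # (4, 2)
```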

In short, the arrival of AI large models has changed the basic rules and business models of the cloud market and given cloud vendors a new point of entry. Only by honing the "large model business" into a sharp knife can they cut themselves a slice of the industrial intelligence cake.

Financial Industry, Leading the Taste of AI

The emergence of large models has brought new opportunities for the intelligent transformation of thousands of industries; arguably, every industry and every scenario is worth integrating with large models. But one key issue is often overlooked: industries differ enormously in their AI foundations, strategies, and resources.

The integration of AI large models with vertical industries cannot follow a one-size-fits-all approach; it must proceed in phases, according to the industry and scenario.

Specifically, the first wave is led by digitally native enterprises with deep technical accumulation, such as the internet, e-commerce, finance, and enterprise services. The second wave includes industries with relatively low levels of digitalization but many generative AI landing scenarios, such as education, telecommunications, entertainment, and government affairs. In addition, AI large models will also begin to explore traditional physical industries such as agriculture, energy, and construction, gradually producing some replicable benchmark cases.

Among them, the financial industry, with its high level of digitalization, strong appetite for intelligence, large business volume, and abundant cash flow, has become a key industry in the first wave and a battleground cloud vendors must contest in the large model race.

Financial intelligence will be the main landing ground for "cloud + large model" in 2024.

So what does a "delicious" large model look like to financial customers?

There are many details, but for commercialization, two points are crucial:

First is security.

The financial industry has strict requirements for risk control, security, and efficiency, and China-US competition has raised questions about the long-term availability and security of AI infrastructure such as NVIDIA GPUs, the TensorFlow development framework, and Oracle databases. As a result, domestic alternatives that are independently developed, can be supplied at scale, and meet the practical needs of financial applications, such as Huawei Cloud's GaussDB, Ascend AI chips, and Tencent Cloud's TDSQL, have become the focus of financial customers' procurement.

Second is precision.

Landing large models requires a deep understanding of the industry's characteristics and AI cloud services carefully tailored to financial customers. Huawei Cloud has dug into financial application scenarios and launched Financial PaaS 3.0 to ensure the high performance and availability of financial workloads. Baidu Intelligent Cloud's OpenMind smart finance solution, built around cognitive financial business scenarios, can distill a vast amount of financial expertise. JD Cloud's Yanxi large model layers JD's financial-industry know-how on top of general knowledge.

These targeted industry solutions are what make the combination of finance and large models a reality.

The demand for cloud-based intelligence in the financial industry is rapidly increasing, and independent and controllable large models and infrastructure are also the trend. In the future, we will see cloud vendors creating AI raw materials from the root technology to impress the taste buds of financial customers and skillfully crafting "cloud + AI" solutions that are more suitable for finance.

Government Cloud, Rising Urban AI Infrastructure

How does the general public benefit from the arrival of AI large models? Urban AI infrastructure, like a hot pot "soup base" blended from all kinds of data, applications, and services, lets people feel the improvement in city life through changes in government services, transportation, parks, entertainment, tourism, education, and more.

Large government and enterprise entities have always been pioneers in digitalization. Currently, the cloud migration needs of government and enterprise customers are gradually entering a deep-water area.

On the one hand, there is an increasing emphasis on AI in government and enterprises.

Several years ago, large government and enterprise entities began to migrate to the cloud and explore AI, creating a series of achievements such as smart cities, city brains, and intelligent transportation. As market saturation increases, the growth rate has begun to slow down.

The arrival of large models, and the natural cost advantage of obtaining them through the cloud, has reactivated demand growth in the government and enterprise cloud market. At the 2023 Baidu Cloud Intelligence Conference, Baidu Intelligent Cloud released its Jiuzhou digital government solution built on large models, and the Zhongshan municipal government worked with Huawei Cloud to build a "digital government 2.0" on top of the Pangu government affairs large model.

Making large models the core capability of urban governance and creating urban AI infrastructure has become a new opportunity for cloud vendors and a key area of competition.

On the other hand, there is a change in the decision factors for deep cloud usage in government and enterprises.

From informatization to digitalization, large government and enterprise entities were able to quickly migrate to the cloud in the early stages. Many places, such as Shanghai and Hunan, experienced a wave of "ten thousand enterprises migrating to the cloud." However, as they enter the deep cloud usage stage, government and enterprise customers have strong demands for cloud data security, independent and controllable infrastructure, and not being tied to a single cloud.

Multi-cloud procurement may eat into the margins of vendors that put "public cloud first," such as Alibaba Cloud, but it creates opportunities for vendors that support "hybrid cloud + large model" deployment. For example, the Suzhou municipal government cloud service project awarded at the end of last year went jointly to China Telecom Cloud, Huawei Cloud, and Inspur Cloud; its service requirements cited a "centralized + distributed" construction principle to ensure the reliable operation of the government cloud platform while promoting information sharing across the suppliers' government cloud platforms.

Urban AI infrastructure is rapidly rising. Whether it can fully meet the demands of deep cloud usage is a crucial question for "cloud + large model" to enter the government and enterprise race track.

The increasingly hot "cloud + large model" market is entering a golden development period and has become a historic development opportunity for public cloud vendors.

In the era of intelligence, thousands of industries should not miss the wave of large models; intelligence should be brought to every table. The "cloud + large model" feast will eventually have us saying, "It's really delicious."

