
Privatization Deployment

As Zhu Xiaohu said, the industry's main expectation for large models is "whether they can solve problems." Privatized deployment will therefore be a key trend for large models in industry: major AI large models will be customized in depth around customers' actual needs, matched with computing clusters and databases of different performance levels, and tailored into corresponding models for each user type to improve work efficiency.

Even before OpenAI's moves, the privatized-deployment segment of the AI large model industry had already matured considerably: Microsoft and Salesforce overseas, and Tencent Cloud domestically, have long focused on this market. Salesforce is already providing GPT-4-based customized chatbot services for industry customers, and vendors that sell customized large model services can also reach more potential users through the application store platform provided by OpenAI.


But whether through privatized deployment or by relying directly on existing general-purpose large model products, matching current artificial intelligence capabilities to users' actual needs depends on the raw material of large models: data. This is also why Tencent Cloud is currently cooperating with industry users such as Shanghai University and CCTV to customize large models.


"It is necessary to ensure the quality and coverage of data, but also to pay attention to the protection and security compliance of sensitive data. These are crucial for enterprise users to use large models." Senior Executive Vice President of Tencent Group and CEO of Cloud and Smart Industry Group said in When introducing big model data, it was believed that computing power and services are the key backgrounds that currently promote the privatized deployment of large models.


The CEO of a start-up company in the data analysis field told reporters that beyond the current acute shortage of large model technical talent, product managers with "experience in using large models" are still considered rare in the industry. Even though many data analysis companies have high expectations for large models, the lack of professional talent and of sufficient corpora for specialized training means that, without professional customization, existing generative chatbots struggle to meet even entry-level industry needs.


Whoever can "deliver the final service" can quickly occupy the largest AI large model


 market at the moment. The MaaS (Model-as-a-Service) model has become the unanimous choice of today's giants when providing large model services.
The popularity of the MaaS model has also spawned many industry-model vendors that do not build general-purpose large models but specialize in the toB market: besides Tencent Cloud, there is Huawei's Pangu large model, whose main direction is AI for Industry (AI empowering industries), covering coal mining, cement, electric power, finance, agriculture, and other sectors.
According to data released by Tencent Cloud, the main implementation scenarios for large models are still concentrated in finance/customer service and education. Overseas, OpenAI cooperates with Khan Academy on customized chatbots that help students prepare review materials for exams while preventing them from using the AI to cheat outright; Shanghai University uses the Tencent Cloud TI platform to explore large model applications in consultation and Q&A scenarios, providing students with consultation, Q&A, and cross-modal retrieval capabilities.


It is true that many people in the large model industry believe that a privatized deployment model that hands all data, computing power, and services to one company effectively "monopolizes" the property rights and source code of user data and models, which may make the development of artificial intelligence overly centralized. But as things stand, there are still few open source large models that can be used commercially. Faced with the high cost of building data centers, reducing the cost of using large models as much as possible is what many companies should prioritize.


In the debate between Fu Sheng and Zhu Xiaohu on WeChat Moments, Fu Sheng argued that OpenAI itself has great commercial value, but if it accounts for 99% (of the value), then a healthy AI ecosystem is out of the question. Unlike the current MaaS model, the large model application store is more like the App Store on today's phones: it will ultimately work with all the applications in it to extend the commercial large model market into more scenarios.


In terms of business model, OpenAI has so far relied on subscriptions (GPT-4) and a paid API to explore a viable profit model, and has also accumulated a large amount of key data, such as corpora, from C-side users. But this does not mean a "large model application store" can succeed again in today's commercial large model field. To make it commercially successful, OpenAI still faces thorny problems: how to resolve the copyright and data issues between different large models, and the more practical security and ethics issues.
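To make the "paid API" side of this revenue model concrete, here is a minimal sketch of how usage-based token metering typically works: each call is billed on prompt and completion tokens at separate per-1,000-token rates. The function name and the rates are illustrative assumptions for this sketch, not OpenAI's actual pricing.

```python
# Sketch of usage-based API billing. Rates here are HYPOTHETICAL
# placeholders, not real prices from any vendor.

def api_call_cost(prompt_tokens: int, completion_tokens: int,
                  prompt_rate: float, completion_rate: float) -> float:
    """Cost of one API call, with rates quoted per 1,000 tokens."""
    return (prompt_tokens / 1000) * prompt_rate \
         + (completion_tokens / 1000) * completion_rate

# Example: 1,200 prompt tokens and 800 completion tokens,
# at assumed rates of $0.03 and $0.06 per 1K tokens.
cost = api_call_cost(1200, 800, 0.03, 0.06)
print(round(cost, 4))  # 0.084
```

Because cost scales linearly with tokens, heavy B-side users have a direct incentive to negotiate custom deployments instead of paying per call, which is part of why privatized deployment competes with the pure API model.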


But there is no doubt that, in the current large model commercialization ecosystem, MaaS has entered the market as a mature application model. Its "out-of-the-box" nature has attracted many industry users to test the waters, and through this it is changing more and more industries.
