
Security + AI ushers in the 2.0 era: has a new competitive trend arrived?
Release time: 2023-07-27
      In 2015, the security industry began to introduce AI technology on a large scale, but the market response was lukewarm and made little splash. Over the following two years, however, after AlphaGo defeated two top human Go players in succession, market attention to AI rose steadily. Only then did AI become truly well known, and public expectations for it kept climbing.
 

      However, as time passed, the patience of both the public and capital towards AI gradually wore thin. The security industry in particular, as the field where AI is most widely applied, has run into development bottlenecks: severe homogenization of AI application scenarios and products, weak growth in enterprise performance, and so on. The industry urgently needs new technologies, new business formats, and new models to break through this bottleneck.
 

      Industry insiders have suggested that large models may officially usher in the AI 2.0 era, bringing enormous potential and opportunity to many industries, while the AI 1.0 era recedes into history. So what exactly are AI 1.0 and AI 2.0? See the introduction below.
 
 

      01. Do big models usher in the AI 2.0 era?
 

      There are still doubts in the market about the proposition that large models will usher in the AI 2.0 era. After all, the term 'large model' has only been in circulation for a relatively short time, and more importantly, its practical applications and commercial value have not yet been clearly demonstrated. Moreover, professionals across industries do not have a thorough grasp of the specialized terminology involved in this field, let alone end customers.
 

      So, as members of the general public, we should first understand the definition and core idea of large models in order to keep pace with the market. Large models are machine learning models with a very large number of parameters (billions or more) that are trained on large-scale datasets and deliver higher predictive power and accuracy in tasks such as image recognition and speech recognition.
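      To make the "parameters" in that definition concrete, here is a minimal sketch (assuming PyTorch; the toy network is purely illustrative and far smaller than any real large model) of how a model's parameter count is tallied:

```python
# A minimal sketch of what "number of parameters" means in practice.
# Assumes PyTorch; the toy model below is illustrative, not a real "large model".
import torch.nn as nn

# A small feed-forward network; real large models stack far wider and deeper layers.
model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
)

# Count every trainable weight and bias.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} trainable parameters")  # ~8.4 million here; large models reach billions
```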
 

      There are also many interpretations in the market of other hot concepts such as GPT and AIGC, but broadly speaking they stand in a relationship of containing and being contained with large models. Large models are a technical tool that can be used to build various types of machine learning models, including generative language models such as GPT, while AIGC is a broad category of generative AI, covering its various applications and technologies.
 

      If that is still hard to grasp, it may be easier to look at it from the perspective of the economic returns generated by investment. From this angle, AI can be divided into two stages: the 1.0 era and the 2.0 era. The AI 1.0 era refers to the wave of deep learning that followed AlphaGo, which entered various industries and created value roughly between 2015 and 2022. The AI 2.0 era began with the large-model craze of 2023, and how long it will last is unknown. What can be stated clearly, however, is that large models address the bottlenecks encountered in AI 1.0.
 

      For example, without a large model, an AI 1.0 application in a given field requires collecting, cleaning, and annotating data for that field and then tuning a model on it. The whole process is complex and expensive, and it was also the main reason capital cooled on AI in the past.
 

      The characteristics of large models make up for exactly this shortcoming: they not only reduce costs but are also simple and convenient to use. When applying one, you only need to make slight adjustments on top of the large model for your niche field, as sketched below. It can be said that large models have ushered in the AI 2.0 era, characterized by big data, big compute, and big models.
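      As a rough illustration of what "slight adjustments on top of the large model" can look like, here is a hedged transfer-learning sketch (assuming PyTorch and torchvision ≥ 0.13; the pretrained ResNet-50 backbone, the five-class niche task, and the dummy batch are stand-ins chosen for the example, not anything prescribed by the article):

```python
# A minimal transfer-learning sketch: reuse a pretrained backbone (a stand-in for the
# "big model") and train only a small task-specific head for a niche field.
# Assumes PyTorch + torchvision >= 0.13; the 5-class task and dummy batch are illustrative.
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pretrained on a large general dataset.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)

# Freeze the pretrained weights so only the new head is updated.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final layer with a head for the niche task (here: 5 classes).
backbone.fc = nn.Linear(backbone.fc.in_features, 5)

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of annotated niche-domain images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 5, (8,))
optimizer.zero_grad()
loss = criterion(backbone(images), labels)
loss.backward()
optimizer.step()
```

      Compared with the AI 1.0 workflow described above, only the small head is trained from scratch on the niche data, which is where the claimed savings in cost and effort come from.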
 

      Of course, from today's perspective, evaluating how good a large model is mainly involves three aspects: algorithms, computing power, and data. The algorithm determines the model's learning ability, computing power affects its learning efficiency, and data determines the effectiveness of training. In the future, the needs of end customers may also come into play: what they want the large model to do, whether they can build their own exclusive large model, and so on.
 

      In other words, at the current stage of AI development, whether building large models will help enterprises seize the market or open up new blue oceans remains to be seen.
 
 

      02. What can a large model do?
 

      As mentioned earlier, large models have ushered in the AI 2.0 era. What can large models do in this era, and what can they bring to the general public? These are the questions everyone wants to ask.
 

      To answer this question, we can start from what AI has changed in the AI 2.0 era. Industry insiders have stated that in the AI 2.0 era, using large-model technology as a basic platform and layering applications on top of it will rewrite every field. In other words, large models change the AI application ecosystem, including fields where AI has already been applied: intelligent healthcare, smart home, intelligent transportation, intelligent customer service, intelligent finance, and so on.
 

      However, some argue that the large models already announced on the market do not actually create much value in real-world applications; most are add-ons derived from digital transformation, aimed at improving quality and efficiency in specific businesses.
 

      For C-end customers, a large model in the enterprise products they buy at this stage is at best icing on the cake; its most practical commercial value has yet to stand out. So in the short term the value brought by large models may be overestimated, while in the long term there is still room for growth.
 

      In addition, anyone who spoke with exhibitors and visitors at the recent 2023 World Artificial Intelligence Conference will generally have come away with the impression that large models greatly help improve the usability of enterprise products, but they are not the whole of those products. For businesses and their audiences, the real goal is to use AI technology to make products digital and intelligent.
 

      Moreover, as an emerging technology appears and spreads rapidly, there will always be uncertain factors constraining the development of large models, such as algorithms that still have room for improvement and data that is not yet complete.
 

      In other words, as the large-model hype continues to build, we need to keep a clear head: do not look only at the large models themselves, but pay more attention to the events around them. Why do companies develop large models or talk about them at every conference venue? Mostly to cater to the market: if an AI company or innovative enterprise never mentions large models or AIGC, it may appear insufficiently professional to the outside world, which could affect its future development.
 

      Overall, large models are a positive technological trend. In theory they can encompass everything, but in practice they cannot yet. Finding a suitable application scenario is not easy for AI large models, and how much value they can create in the scenarios where they are applied is still unknown and will take time to test.

