A brief analysis of McKinsey's Lilli: what development ideas does it offer for the enterprise AI market?

Reprinted from panewslab
05/09/2025

McKinsey's Lilli case offers key development ideas for the enterprise AI market: edge computing plus small models as a potential market opportunity. This AI assistant, which integrates 100,000 internal documents, has not only reached 70% employee adoption but is also used 17 times a week on average; that kind of product stickiness is rare in enterprise tools. A few thoughts:
- Enterprise data security is a real pain point: the core knowledge assets McKinsey has accumulated over a century, and the proprietary data held by many small and medium-sized enterprises, are highly sensitive and cannot be processed on a public cloud. Finding a balance where "data never leaves the premises and AI capability is not compromised" is the real market demand, and edge computing is one direction worth exploring;
- Specialized small models will replace general-purpose large models: what enterprise users need is not a "10-billion-parameter, all-purpose" general model, but a specialized assistant that answers domain-specific questions accurately. There is a natural tension between a large model's generality and its professional depth, which is why small models are often valued more in enterprise scenarios;
- The cost trade-off between self-built AI infra and API calls: although the edge-computing-plus-small-model combination requires a larger upfront investment, it significantly reduces long-term operating costs. Imagine if the AI model used frequently by 45,000 employees were served through API calls: that level of dependency, usage scale, and rising quality requirements would make self-built AI infrastructure the rational choice for large and medium-sized enterprises;
- New opportunities in the edge hardware market: large-model training depends on high-end GPUs, but edge inference calls for completely different hardware. Processors optimized for edge AI from chip makers such as Qualcomm and MediaTek are seeing a genuine market opening. When every enterprise wants to build its own "Lilli", low-power, high-efficiency edge AI chips will become an infrastructure necessity;
- The decentralized web3 AI market gets a boost as well: once enterprise demand around small models for computing power, fine-tuning, and algorithms takes off, resource scheduling becomes the bottleneck, and traditional centralized scheduling will struggle to keep up. That will translate directly into strong market demand for web3 AI offerings such as decentralized small-model fine-tuning networks and decentralized computing-power service platforms;
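The cost trade-off in the third point above can be made concrete with a back-of-envelope calculation. The employee count (45,000) and usage frequency (17 queries/week) come from the article; every dollar figure below is an illustrative assumption, not McKinsey data:

```python
# Back-of-envelope comparison: per-query API billing vs. self-hosted inference.
# Employee count and query frequency are from the article; all unit costs
# below are hypothetical assumptions for illustration only.

EMPLOYEES = 45_000
QUERIES_PER_WEEK = 17           # per employee, per the article
WEEKS_PER_YEAR = 52

API_COST_PER_QUERY = 0.02       # assumed: a few thousand tokens per query
SELF_HOSTED_CAPEX = 2_000_000   # assumed: upfront hardware + integration
SELF_HOSTED_OPEX_PER_YEAR = 400_000  # assumed: power, ops, maintenance

queries_per_year = EMPLOYEES * QUERIES_PER_WEEK * WEEKS_PER_YEAR
api_cost_per_year = queries_per_year * API_COST_PER_QUERY

def cumulative_cost(years: int, api: bool) -> float:
    """Total spend after `years` under each strategy."""
    if api:
        return api_cost_per_year * years
    return SELF_HOSTED_CAPEX + SELF_HOSTED_OPEX_PER_YEAR * years

# First year in which self-hosting is cheaper overall than pay-per-query.
break_even = next(y for y in range(1, 20)
                  if cumulative_cost(y, api=False) < cumulative_cost(y, api=True))

print(f"queries/year:    {queries_per_year:,}")
print(f"API cost/year:   ${api_cost_per_year:,.0f}")
print(f"break-even year: {break_even}")
```

Under these assumed numbers the usage volume alone (tens of millions of queries a year) makes the API bill comparable to a self-hosted cluster within a few years; shifting the assumed unit costs changes the break-even point but not the direction of the argument at this scale.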
While the market is still debating the general capability boundaries of AGI, it is encouraging to see so many enterprise users already mining AI for practical value. Clearly, compared with the past leap-forward race of monopolizing computing power and algorithms, a market focus on edge computing plus small models will unleash far greater vitality.
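As a coda, the "data never leaves the premises" pattern from the first point can be sketched: the client talks to an inference server running inside the corporate network rather than a public cloud API. The endpoint address and model name below are hypothetical placeholders, not details of McKinsey's actual setup:

```python
# Minimal sketch of on-premises inference: requests are addressed to an
# internal server, so prompts and documents never cross the corporate
# boundary. Endpoint URL and model name are hypothetical placeholders.
import json
import urllib.request

LOCAL_ENDPOINT = "http://10.0.0.5:8080/v1/chat/completions"  # internal address (placeholder)

def build_request(question: str,
                  model: str = "internal-small-model") -> urllib.request.Request:
    """Build a chat-completion request targeting the on-prem server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        "temperature": 0.2,  # internal factual Q&A: keep sampling conservative
    }
    return urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Summarize the key findings of our internal retail study.")
# Sending would be urllib.request.urlopen(req); since the endpoint is an
# internal address, no query text leaves the local network.
```

The request shape mirrors the OpenAI-compatible chat API that many self-hosted inference servers expose, which is what makes swapping a cloud endpoint for a local one a one-line change.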