Highlight Report

LLM leader shakes GenAI market with enterprise software licensing


Enterprise leaders want generative AI (GenAI) solutions that offer better security, control, cost management, and integration capabilities. Leading large language model (LLM) provider Inflection AI thinks it has the answer: ditching the industry-standard cost-per-use API model and shifting to enterprise software licensing.

Many firms—particularly those with significant compliance and regulatory risk concerns—want models they can run within their own infrastructure. That has led some to build their own LLMs as part of broader efforts to own and manage their AI systems. While there is merit in this for many, some will likely bump up against performance gaps: home-grown models often fail to keep up with the leaders in producing coherent, contextually accurate responses.

Licensing model offers cutting-edge LLM capabilities with the controls enterprises demand

To meet this challenge, Inflection AI now offers the capabilities of a cutting-edge LLM with the controls such enterprises seek. The offering includes:

  • The option of on-premises deployment: Inflection AI will provide the hardware, software, integration, and consulting services. It is in discussions with potential partners about providing the latter.
  • Complete control over data: Your data stays on-premises or in a private cloud.
  • Customization and integration: Inflection AI will provide tools that make it easier for enterprises to fine-tune the model on their own data, tailored to each firm’s context and needs (see the illustrative sketch after this list). It will also provide integration services to ensure the solution works with existing enterprise systems, including development environments.
  • Service-level agreements (SLAs): Inflection AI offers its technology as enterprise software with licensing agreements—including support and SLAs.
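Inflection AI has not published the details of its fine-tuning tooling, so the sketch below is an illustration only of what on-premises, parameter-efficient fine-tuning on a firm's own data typically looks like, using the open-source Hugging Face transformers, peft, and datasets libraries. The base model name and data file are placeholders, not Inflection AI assets.

# Illustrative sketch of on-premises, parameter-efficient fine-tuning (LoRA).
# Assumptions: an open-weights placeholder model and a local JSONL file of
# company text; this is NOT Inflection AI's tooling, just a generic example.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

BASE_MODEL = "example-org/base-7b"  # placeholder name, not a real vendor model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.pad_token or tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# LoRA adapters: train a small fraction of the weights instead of the full model.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# The training data never leaves the firm's own infrastructure.
data = load_dataset("json", data_files="internal_policies.jsonl")["train"]
data = data.map(lambda row: tokenizer(row["text"], truncation=True, max_length=1024))

Trainer(
    model=model,
    args=TrainingArguments(output_dir="./finetuned", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()

The point of the sketch is the operating pattern, not the specific libraries: the base weights, the tuning data, and the resulting adapters all stay inside the enterprise boundary, which is what the on-premises pitch is selling.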

The firm is committing to continued R&D and product development, funded by software licenses and focused on enterprise needs. The net result for enterprise customers is complete control over their data, predictable and manageable costs versus API-based models, tailored solutions for specific enterprise needs, integration with support, AND access to the AI technology of one of the four leading-edge LLMs—one that is now uniquely focused on enterprise needs.

Only a handful of LLMs can match up to Inflection AI—here’s why

Few LLM providers can match Inflection AI’s capabilities: it belongs to a small group that has cracked the twin challenges of convergence and parameter scaling and builds on compound AI architectures.

The models behind ChatGPT (OpenAI), Gemini (Google DeepMind), Claude (Anthropic), and Inflection AI are all trained on extensive and diverse data sets and comprise billions of parameters; broadly, the more parameters, the greater the accuracy and realism of the output. The complexity of a model and its capacity to handle a wide range of nuanced tasks also rise with the number of parameters. But it takes significant computational power, memory, and investment to train and fine-tune models at this scale, which is why the leading group remains so far ahead.
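To put those resource demands in perspective, here is a rough back-of-envelope calculation. The parameter counts and per-parameter byte costs below are generic rules of thumb, not figures for any named vendor or model.

# Back-of-envelope memory estimate for serving and fully fine-tuning an LLM.
# Assumptions (illustrative only): fp16 weights for inference (~2 bytes/param)
# and ~16 bytes/param for full fine-tuning with a mixed-precision Adam optimizer,
# before activation memory is counted.
BYTES_PER_PARAM_INFERENCE = 2
BYTES_PER_PARAM_TRAINING = 16

for billions in (7, 70, 400):
    serve_gb = billions * BYTES_PER_PARAM_INFERENCE   # 1e9 params x bytes / 1e9 = GB
    train_gb = billions * BYTES_PER_PARAM_TRAINING
    print(f"{billions:>4}B parameters: ~{serve_gb:,} GB to serve, ~{train_gb:,} GB+ to fine-tune")

Even the smaller end of that range calls for data-center-class accelerators, and full training runs multiply the bill further, which is why the leading group's investment advantage is so hard to close.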

This group also leads on convergence, the point in training at which a model’s parameters settle so that its outputs become coherent and reliable.

The leading four also incorporate advanced architectures integrating tools and capabilities such as web search, reasoning, and planning. Models without such ‘compound AI architectures’ are unlikely to compete on functionality and versatility.

The performance of the leading four in our recent high-level review of LLMs versus business needs—How to choose the right LLM—further illustrates their capability advantage.

The business model will only suit the most committed
Exhibit 1: Only the most advanced and committed organizations will be ready to take the leap with Inflection AI’s new model

Source: DALL-E and HFS Research 2024

Be warned: the Inflection AI model is not for those early in their GenAI journey. It is a scale play for organizations ready to invest heavily, having already proven and learned all they need from proofs of concept (POCs), pilots, and initial enterprise scaling efforts. There will likely be hefty upfront costs that only the most committed will be prepared to fund. Firms will also need access to deep technical expertise, and while Inflection AI promises to deliver much of this as a service, it is not immune to the AI talent battle facing the rest of the industry.

Those who commit to the on-premises model must also factor in hard-to-judge spare capacity. The industry is still taking its first steps on the GenAI journey, and the complexity of AI tasks coming down the line will continue to increase the load on enterprise infrastructure. Of course, many will also be wary of putting so many AI eggs in one Inflection AI basket.

The Bottom Line: New model will suit advanced GenAI enterprises—and represents a disruption likely to repeat across the industry as it matures.

Inflection AI’s model offers significant enterprise benefits such as better security, control, and customization. However, to take advantage of it, you must have achieved relative GenAI maturity and be heavily committed to your GenAI journey. Leaders must weigh the benefits against both setup costs and the risk of vendor lock-in. Where the sums add up, the approach will likely disrupt the comfortable API-led model seen across the rest of the industry and point the way forward as the market matures.
