


How to Monetize Generative AI: Productivity vs. Pricing

Companies can make money from generative AI in several ways. Users of the technology can find ways to improve productivity with AI. Providers of the technology to those users, the "platforms," will profit if they can achieve favorable price points. And "picks and shovels" suppliers sell the underlying hardware needed to run the technology. These paths to monetization are intricately linked. The market has quickly identified "picks and shovels" winners, as seen in this year's performance of NVIDIA, which makes the graphics processing units (GPUs) that are essential for AI. Sorting out the winning strategies among platforms and users is a lot murkier, but we're starting to see these companies take various approaches to turning productivity gains into profits.

Some companies have predicted that AI could unlock productivity improvements of 20%–30%. While a few high-profile cases have focused on worker redundancies, far more have focused on increasing output with the same employee base. For example, AI can potentially perform many menial, time-consuming tasks, freeing up professionals to add more value for their employers.

Delivering productivity gains will depend on the technology's cost. For example, a company seeking to improve the productivity of a $100,000-a-year employee by 25% faces an entirely different value proposition if the AI technology for that worker costs $5,000 than if it costs $20,000. As a result, at this stage of the technology's evolution, many investors are concentrating their attention on how AI vendors will price the technology.

For AI platforms, finding the right price point is partly dictated by the cost of computing infrastructure. AI-enabling technology is very expensive, as the supply of critical infrastructure such as GPUs remains extremely limited. AI vendors must therefore balance their customers' productivity expectations against their own cost of servicing them.

While the commercialization of AI is still in its infancy, we're already seeing three key pricing strategies. Understanding the dynamics of these strategies can help investors analyze whether different types of companies are on track to profit from the technology.

Subscriptions: Companies that can integrate AI features to enhance existing products will have instant access to a potentially lucrative customer base. Microsoft is already doing this by charging $30 per user, per month, for a service called Copilot, which adds AI capabilities to applications within its Microsoft 365 suite. Some investors had anticipated a much lower price point. So why did Microsoft charge more than expected? Were customers willing to pay more because productivity gains already exceed expectations? Or was the technology proving more expensive for Microsoft than it expected? It's too soon to say, but it may be a bit of both. Google is going down a similar path, having recently announced $30 per user, per month, pricing for its Duet AI service for G Suite enterprise applications.

A la carte: As more companies adopt AI technologies, they'll need more computing infrastructure to run their AI queries. We believe that many will choose to leverage the native AI platforms of cloud vendors such as Google and Microsoft. Because their usage may be sporadic, and because AI infrastructure costs so much, the cloud vendors will likely charge them on an a la carte model, in our view. OpenAI has pioneered this consumption model by charging enterprise customers according to the number of "tokens" they use, with 1,000 tokens representing about 750 words. Microsoft, OpenAI's infrastructure partner (and minority investor), has said that 2% of its Azure cloud growth in the third quarter will come from generative AI consumption.

As a feature: Some AI providers might integrate AI capabilities into products without charging for the enhanced services initially. Instead, the strategy would aim to enhance the value of the product, with AI added as a feature. Eventually, the company might impose across-the-board price increases, justified by the value that has been added. Adobe has historically used this approach with its Creative Cloud and Acrobat products. This approach often makes the most sense for products sold to consumers and small businesses, who may balk at paying more for a feature that they may or may not use. Once they incorporate the new AI capabilities into their workflow, it can be easier for them to accept a price increase later.

Investors looking for meaningful profits from consumer-facing chatbots may be disappointed.

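The productivity trade-off described above (a $100,000-a-year employee, a 25% productivity gain, and AI tooling costing either $5,000 or $20,000 per worker) can be sketched as simple arithmetic. This is an illustrative assumption, not a calculation from the article: it values the productivity gain at salary times the gain percentage, and the function name is invented for the sketch.

```python
def ai_net_value(salary: float, productivity_gain: float, ai_cost: float) -> float:
    """Net value per employee: the salary-equivalent value of the
    productivity gain minus the annual cost of the AI tooling."""
    gain_value = salary * productivity_gain  # e.g., 25% of $100,000 = $25,000
    return gain_value - ai_cost

# The article's two scenarios for a $100,000-a-year employee and a 25% gain:
cheap = ai_net_value(100_000, 0.25, 5_000)       # $25,000 - $5,000  = $20,000 net
expensive = ai_net_value(100_000, 0.25, 20_000)  # $25,000 - $20,000 = $5,000 net

print(f"Net value at $5,000 tooling cost:  ${cheap:,.0f}")   # $20,000
print(f"Net value at $20,000 tooling cost: ${expensive:,.0f}")  # $5,000
```

Under these assumptions, the cheaper tooling leaves four times as much net value per employee, which is why the vendor's price point so strongly shapes the customer's value proposition.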