Poe wants to be the App Store of conversational AI, will pay chatbot creators

Poe, the multi-AI chatbot aggregator launched by Quora in late 2022, has built up a following by giving users a single interface for accessing multiple competing large language models (LLMs) including OpenAI’s GPT-3.5/4, Anthropic’s Claude, Google’s PaLM 2, and Meta’s Llama 2. In April, Poe opened the door to third-party creators, be they individuals or enterprises and organizations, to create their own chatbots within its platform using its supported LLMs.

Now Poe has unveiled a new monetization model to enable third-party creators to earn revenue from building and offering their own chatbots through the Poe app. 

In an exclusive phone interview with VentureBeat, Quora CEO Adam D’Angelo explained the reasoning behind this move and his vision for Poe’s future.

“We’re coming into this market, we’re lowering the barrier to entry, we’re gonna enable this huge ecosystem of AI products to flourish,” said D’Angelo. “[Poe] is going to increase diversity and the breadth of applications that are available to everyone.”

In the discussion D’Angelo outlined two key moves by Poe: letting bot makers earn money from their creations, and opening up API access so anyone can add new natural language models to the platform.

The changes are already live: creators are being paid, and new models are being integrated.

D’Angelo expects this will encourage more participants to enter the market with unique offerings, ultimately providing greater choice and capabilities for users.

Offsetting costs to enable creators

Running large language models requires substantial computing resources. “We have relationships with larger API providers already, like OpenAI and Anthropic and Google, and we’re paying them money to use their LLM technology for inference,” D’Angelo explained.

Poe itself uses services from leaders in the space, but negotiating individual agreements doesn’t scale when the goal is thousands or even millions of conversational AI offerings.

That’s why Poe launched a revenue sharing model. “When we think about how we can enable an ecosystem with thousands or millions of different forms of AI on Poe, we can’t, it’s just too much overhead to negotiate a specific contract with each of the providers,” said D’Angelo.

Now a portion of payments from users to Poe is passed on to creators. This provides an economic framework to support the costs of developing specialized bots.

Poe’s new monetization model has two main components. First, Poe shares revenue from its $20-per-month subscription when a bot leads a user to sign up for the app.

Second, bot creators will soon be able to set per-message fees that Poe will pay out for each user interaction; Poe says it is still working out the details of this fee structure.

This flexible approach allows bot makers to earn immediately from new Poe subscribers they help acquire, while also building recurring revenue over time based on usage volume. Poe plans to extend this monetization structure further to provide additional ways for bot developers to monetize their creations.

D’Angelo gave examples like training, evaluating, and marketing expenses. For quality experiences, there is significant work beyond just API access. Revenue sharing enables creators to build sustainable businesses behind their bots.

Opening access to new models

In addition to reducing barriers like infrastructure costs, Poe wants to spur innovation by allowing easy integration of different language models.

“Our API is totally open ended. Anyone can go and hook up a new model,” D’Angelo stated. This flexibility contrasts with needing to rely solely on models offered by Poe itself.

D’Angelo recognizes the field is progressing incredibly quickly. He wants Poe to support diversity, not constrain it. Enabling third parties to connect custom or tuned models benefits the entire ecosystem.

He pointed to existing examples like the Mistral model finetuned by Fireworks.ai. “Lots of other people could follow and add different models to the platform,” he noted.

Lowering barriers and increasing diversity

D’Angelo drew an analogy between Poe and web browsers like Chrome and Safari. In the early days of the internet, building web applications required enormous effort. After browsers provided a common interface, the barriers dropped drastically.

Suddenly anyone could create websites. This explosion didn’t commoditize the internet though. Sites still competed fiercely, just based more on content and services rather than technical factors.

“When you lower the barrier to entry, I mean competition among websites is fierce, but there’s lots of websites that have very good businesses. It’s just that their differentiation is not about the technology anymore. It’s about what experience is being delivered through the technology,” D’Angelo explained.

He sees a similar future for conversational AI as Poe opens access and reduces friction. Unique experiences will distinguish successful bots, not fundamental model advantages.

“We’re lowering the barrier to entry, we’re gonna enable this huge ecosystem of AI products to flourish and I don’t think it’s necessarily going to increase competition as much as it’s going to just increase diversity,” said D’Angelo.

Empowering niche providers

Major players will continue pushing the cutting edge in language model research. But Poe is particularly focused on enabling smaller, niche providers.

“We want to help them bring what they create to mass market to everyone around the world. Especially because of these inference costs that are kind of underneath serving these models,” D’Angelo said.

He’s optimistic about text, image, audio and video generation, and new techniques like model fine-tuning create even more opportunities. “I think we’re especially interested in kind of long tail creators, people who are just particularly talented at training a model or they have unique data or unique talent,” D’Angelo noted.

By supporting these specialists, Poe can deliver more diversity and choice to users.

Moving faster together

Ultimately, Poe wants rapid innovation across the entire conversational AI space, not to constrain progress to what it can handle internally.

By empowering external creators and apps, Poe believes it can accelerate development in areas like specialized models and vertical use cases. Increased diversity and competition will improve access and utility for users around the world.

D’Angelo acknowledged that the industry pace is incredible and even difficult for Poe to keep up with fully. “Our hope with the API and with these monetization tools is that we don’t have to be a bottleneck on it,” he said.
