Startups Built on the ChatGPT API Are Taking a Huge Risk

Rachel Curry
Rachel Curry is a journalist based in Lancaster, Pennsylvania. Her work focuses on finance and technology on a global scale, as well as local issues impacting her community. You can connect with her on Twitter at @writingsofrach.

Generative AI has emerged as a groundbreaking and captivating technology, capable of feats that have dominated press headlines and engaged a growing user base. Startups have raced to capitalize on the fever pitch surrounding ChatGPT and other generative AI models – and the recent release of OpenAI’s ChatGPT API has opened up the potential for countless niche products built on the back of the company’s super-powerful technology. But history shows longevity is uncertain – and risks plentiful – for companies too dependent on a closed-source API.

The ChatGPT API-based startup arena includes companies like Baselit (a so-called copilot for data analytics), Landbot (a no-code customer service chatbot builder), and Berri AI (APIs that allow you to build ChatGPT apps on the fly).

Many API-dependent companies have suffered in the past after having access revoked or usage costs suddenly increased. Thread Creator, a startup that let users manage Twitter threads, stopped functioning when it lost access to the Twitter API in 2021. Then there’s CanvasPop, which can no longer connect with Meta-owned Instagram to print pictures straight from the app. And Elon Musk recently announced a price hike for many users of Twitter’s API.

Users of these startups’ services also face risks, which could bring regulatory or legal requirements that founders are unable to pivot quickly enough to meet. The valuations of many such companies rest on the value of data collected from users rather than on the quality of, or demand for, the product itself.

The problems with ChatGPT startups are multifold. Building a product on top of another company’s proprietary model leaves founders vulnerable. “You’re dependent on somebody else’s platform and infrastructure, and then you have to pass that cost onto your own customers,” says Theo Priestley, a longtime futurist and author of “The Future Starts Now.”

If you use the ChatGPT API, you pay based on a token system that reflects how “costly” it is to generate the text you need. That matters because startups and developers building on the API are hostage to pricing shifts, which can undermine the competitiveness of their own products’ pricing.
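To see why a pricing shift hits margins directly, consider a minimal sketch of per-request cost under token-based pricing. The rates below are purely illustrative, not OpenAI’s actual prices, which vary by model and change over time:

```python
def estimate_request_cost(prompt_tokens: int, completion_tokens: int,
                          prompt_rate: float, completion_rate: float) -> float:
    """Estimate the cost of one API call under token-based pricing.

    Rates are in dollars per 1,000 tokens; prompt and completion
    tokens are often billed at different rates.
    """
    return (prompt_tokens / 1000) * prompt_rate \
         + (completion_tokens / 1000) * completion_rate

# Illustrative rates only; a real product must track the provider's
# current pricing page.
cost = estimate_request_cost(500, 1500, prompt_rate=0.0015, completion_rate=0.002)

# If the provider doubles its rates, every downstream request doubles
# in cost too, squeezing the startup's margin on each sale.
doubled = estimate_request_cost(500, 1500, prompt_rate=0.003, completion_rate=0.004)
```

A startup reselling such calls must either absorb a rate increase or pass it on to customers, which is exactly the dependency Priestley describes.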

That issue melds with another, which is the replicability of ideas. “The ideas are ten a penny,” Priestley says.

What’s to stop another entrepreneur from taking a similar idea, improving it, and using it to launch something else—all while doing it cheaper? Alternatively, what’s to stop OpenAI from seeing demand for an API-dependent tool and simply replicating it as a feature for its own paying ChatGPT subscribers, effectively shuttering startups in that niche?

According to Dan Cunningham, CTO of AI-powered reputation management software Chatmeter, that market saturation is, in fact, possible to get past. “Startups have to ask themselves what value they are bringing, and where they can differentiate themselves,” Cunningham says. The emphasis here is on differentiation, or making ChatGPT API integration a slice of your offering, but not the whole pie. (Chatmeter has a new generative AI offering using the OpenAI Chat Completion API.)

“[Generative AI chatbots] have become data hoovers very quickly and easily, and that’s worrying to me,” says Priestley.

When startups rely too heavily on generated text as a substitute for human interaction, especially in cases where the AI’s influence isn’t readily disclosed, they run the risk of discomforting users and creating ethical concerns. California-based mental health nonprofit Koko rolled out a GPT-powered mental health support feature that it quickly shuttered due to ethical concerns. Koko co-founder Rob Morris wrote on Twitter about the shutdown. “We used a ‘co-pilot’ approach, with humans supervising the AI as needed,” he wrote. “Once people learned the messages were co-created by a machine, it didn’t work.”

Peter Relan, venture capitalist and founder of YouWeb Incubator, has experience investing in the generative AI space (including Got It AI and MathGPT). “Failure is a distinct possibility,” Relan says, referring generally to investment in the space. He clarifies that there is broad-based investing going on at the seed stage, with investors hoping to find the winners in generative AI, much as in the rest of the technology industry. With generative AI, however, the wheels are turning faster due to rapid advancement in the technology.

With all of these risks for founders, users, and investors, what are the alternatives to these kinds of API-based startups?

Aside from proprietary models using great training data sets, Relan says you can lean on open-source models or even fine-tuned LLMs for task-specific applications.
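One way to keep those alternatives open is to avoid hard-wiring a product to any single provider. The sketch below is illustrative only, with hypothetical class names and stubbed backends rather than real SDK calls; the point is that application code depends on an interface, so a hosted closed-source API can later be swapped for a self-hosted open-source or fine-tuned model:

```python
from abc import ABC, abstractmethod

class TextGenerator(ABC):
    """Provider-agnostic interface the application codes against."""
    @abstractmethod
    def generate(self, prompt: str) -> str: ...

class HostedAPIGenerator(TextGenerator):
    """Would wrap a hosted API such as OpenAI's; stubbed for illustration."""
    def generate(self, prompt: str) -> str:
        return f"[hosted model reply to: {prompt}]"

class LocalModelGenerator(TextGenerator):
    """Would wrap a self-hosted open-source model; stubbed for illustration."""
    def generate(self, prompt: str) -> str:
        return f"[local model reply to: {prompt}]"

def answer(question: str, backend: TextGenerator) -> str:
    # Application logic depends only on the interface, so the backend
    # can be swapped if API pricing or access terms change.
    return backend.generate(question)
```

The abstraction does not remove dependency risk, but it turns a provider change from a rewrite into a configuration decision.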

Relan says he sees that with his conversational AI startup Got It AI, which built its own model, ELMAR (Enterprise Language Model Architecture). “It is a suite of on-premise guard-railed language models that can be fine-tuned to enterprise-specific tasks,” he says. He claims that this approach protects all parties involved while retaining the utility of generative AI.

While we have yet to see the full breadth of alternatives, startups entirely dependent on the ChatGPT API may prove short-lived. That doesn’t mean generative AI innovation is futile. Development in the space is essential, but it should be pursued in a way that builds longevity and delivers value to all stakeholders.

Cunningham, of Chatmeter, believes startups that fine-tune models for their own purposes can succeed. “Smart startups will be looking to these technologies for future development, choosing the right model for their particular use case, and in many cases training these models on their own data and needs.”
