2024: The Year of AI Modern Stack Startups — Avoiding the Commoditization Trap

Daniel Porras Reyes
5 min read · Jan 17, 2024


Summary:

  • The complexity of integrating AI into applications, primarily due to the non-deterministic nature of LLMs, has spurred a significant increase in AI infrastructure startups offering tools for A/B testing, monitoring, model routing, user feedback, caching, and privacy, among other features.
  • 2024 will be the year of AI infrastructure startups, as companies move from testing to production, powered by these infrastructure companies.
  • In the competitive AI infrastructure market, startups are adopting diverse strategies to differentiate themselves. Their approaches vary, ranging from providing complete, end-to-end solutions to focusing on specialized, technically sophisticated features with novel approaches.
  • In the short term, there will be significant overlap among players, but as complexities increase, collaboration among them will also grow. For instance, those specializing in observability will likely partner with players with a security and privacy focus.
  • Several players in the space will fail to differentiate, run out of runway, and consolidate with larger players, especially as the traditional players in the IT Ops market recognize they will need to round out product sets and capabilities to remain competitive.
  • As AI is used to tackle more complex, multi-modal, and agent-based use cases, the role of infrastructure startups will become more crucial.

2023 was the year of AI acceleration. According to a survey by Menlo Ventures, the share of enterprises using AI had been relatively flat for several years, but finally ticked up from 48% to 55% in 2023. Despite this encouraging trend, integrating AI into practical applications remains complex and challenging. According to a survey by Retool, 30% of companies are still establishing the basics. One of the primary obstacles is the non-deterministic, probabilistic nature of LLMs, which can produce widely varying user experiences from one request to the next.

Similar to previous waves like the Cloud, these new innovations create the need for new infrastructure platforms that help developers evaluate, monitor, experiment with, and deploy applications. I believe 2024 will be the year of AI infrastructure startups. As adoption continues to increase and companies increasingly look to move mission-critical applications into production, infrastructure startups will emerge as the big winners.

Embedding AI within a product requires a solid, well-established base before moving into more intricate phases such as model fine-tuning, pre-training, or AI agent deployment. This groundwork includes critical components such as:

  • A/B Testing
  • Observability, logging, and explainability
  • User feedback
  • Model Routing and orchestration
  • Caching
  • Privacy and security
  • Compliance
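Several of these groundwork components can be combined in a single thin layer in front of the model provider. The sketch below is purely illustrative (the `LLMGateway` class and provider callables are hypothetical, not any specific vendor's API): it shows exact-match response caching, per-request logging, and fallback routing across providers in a few dozen lines.

```python
import hashlib
import time


class LLMGateway:
    """Toy gateway illustrating caching, logging, and model fallback."""

    def __init__(self, providers):
        # providers: ordered mapping of name -> callable(prompt) -> str
        self.providers = providers
        self.cache = {}   # prompt hash -> cached response
        self.logs = []    # simple in-memory request log

    def _key(self, prompt):
        return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

    def complete(self, prompt):
        key = self._key(prompt)
        if key in self.cache:  # exact-match cache hit: skip the model call
            self.logs.append({"prompt": prompt, "cached": True})
            return self.cache[key]
        for name, call in self.providers.items():  # route with fallback
            try:
                start = time.time()
                response = call(prompt)
                self.logs.append({
                    "prompt": prompt,
                    "model": name,
                    "latency_s": round(time.time() - start, 3),
                    "cached": False,
                })
                self.cache[key] = response
                return response
            except Exception:
                continue  # provider failed; try the next one
        raise RuntimeError("all providers failed")
```

A production system would replace the exact-match cache with semantic caching, ship logs to an observability backend, and route on learned signals rather than simple ordering; the point here is only how the listed components slot together.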

Each of these is expanded on in more detail at the bottom of the post (Appendix A).

In response to these challenges and opportunities, a new wave of infrastructure startups has emerged, bringing innovative solutions to simplify AI deployment. Highlighted by Base10, Insight Partners, Sequoia, and Cowboy Ventures, the AI stack now includes over 30 startups, each contributing to the ease of AI integration.

When analyzing the different players, you can see a lot of overlap in the features they offer. For example, the following image shows 16 relevant startups that offer some form of model and prompt experimentation or A/B testing solutions.

Given this growing landscape, the question that emerges is how these startups can differentiate themselves while also competing with the DIY attitude taken by more technically advanced application development teams. One approach we have seen from some players is the attempt to build an end-to-end solution, as demonstrated by companies like Portkey, a Flybridge portfolio company. According to the founder:

“Enterprises are expecting a well integrated full-stack LLMOps product that allows them to be forward compatible and production ready. That’s the direction Portkey want to take.” — Rohit Agarwal

Another approach we have observed is to concentrate on a subset of features while developing more technically advanced capabilities than an end-to-end solution could offer. For instance, Martian, which recently raised a $9M seed round led by Accel, developed a model routing approach built on model mapping. In this approach, they transform models into a new format while preserving essential characteristics. Model mapping goes beyond distillation, not only simplifying models but also reorganizing and restructuring them in a way that makes their internal workings more comprehensible. This has enabled their model router to outperform individual models 67% of the time and to attract users from companies like Stripe and Gusto.
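To make the model-routing idea concrete: the core trade-off is sending easy prompts to a cheap model and hard ones to a stronger one. The heuristic and model-tier names below are entirely illustrative; Martian's actual model-mapping approach is proprietary and far more sophisticated than this sketch.

```python
def route_model(prompt, code_keywords=("def ", "class ", "SELECT", "{")):
    """Toy router: pick a model tier from crude prompt features.

    A production router would use a learned classifier (or, in
    Martian's case, model mapping); this heuristic is only a sketch.
    """
    long_prompt = len(prompt) > 500
    has_code = any(k in prompt for k in code_keywords)
    if has_code or long_prompt:
        return "large-model"   # hypothetical stronger, pricier tier
    return "small-model"       # hypothetical cheaper, faster tier
```

Even a crude router like this captures the economics that make routing attractive: if most traffic is simple, most requests can be served at the low-cost tier without hurting quality on the hard ones.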

In the next few years, we will likely see a dynamic shift in the AI infrastructure landscape. Initially, there will be an increasing overlap among companies as they strive to offer comprehensive, end-to-end solutions to address a broad range of needs. However, as the intricacies of these use cases intensify, the demand for highly specialized solutions will also rise. This shift might lead some companies to refine their focus, excelling in specific areas of their product line while forming alliances for other capabilities. For instance, while observability-focused companies might currently offer basic security features, the deepening complexity of security needs may encourage them to form partnerships with specialized security providers.

Last year, driven by the rise of AI, many infrastructure startups were born. This year, as these companies mature and begin to grow their customer base, category leaders are likely to emerge and a premium will be placed not just on product strategy, but also on the sophistication of their go-to-market strategies. This will probably lead to several startups that fail to differentiate and become commoditized, either running out of runway or consolidating with a larger player in the coming years.

The relevance of these infrastructure startups will continue to increase as more breakthroughs allow the building of more complex workflows. For example, multimodality will create the need not only to monitor large language models but also vision models. Additionally, as AI agents become more advanced and widespread, having appropriate monitoring processes is essential for identifying and mitigating issues like error propagation.

As we progress into 2024, the role of AI infrastructure startups is becoming increasingly pivotal for the evolution of application-layer companies. These startups are instrumental in simplifying the complexities associated with AI deployment. By providing robust and user-friendly AI infrastructure, they enable application layer startups to divert their focus away from the technical intricacies of AI implementation. This shift allows these companies to concentrate more on tailoring their offerings to meet customer needs effectively. Consequently, we can expect the coming years to unveil some of the most innovative and user-centric solutions, as founders channel their energies into enhancing user experiences rather than grappling with the underlying complexities of AI infrastructure.

If you are a founder building in the AI infrastructure space, do not hesitate to reach out at daniel@flybridge.com. We invest in AI startups from pre-seed to Series A, typically leading rounds. You can learn more about Flybridge at our website.

Appendix A:
