Why Small Teams Need a New Option for Open Source AI
Earlier today Cake announced $13M in funding led by Gradient and Primary to bring open source AI to the mid-market. If you’re reading this, you’ve probably seen overcrowded market maps like this. The question I’ll answer here is why small AI teams need a new infrastructure option. If the thesis resonates, please reach out for a demo or check out open roles at Cake.
The answer is that everyone already knows what the best AI stack is. It’s the “build” option chosen by every hyper-capitalized, scaled organization: hiring tens or hundreds of engineers to build fully custom AI environments. The problem for the 99% of businesses who can’t invest at this level is that their use cases are too complex for off-the-shelf products (the “last-mile” problem), yet they lack the engineering scale to build what they need.
The “last-mile” problem is exacerbated in a world of accelerated change in which best-in-class technologies are often cutting-edge open source projects that aren’t secure and reliable by default. In meeting hundreds of mid-market AI teams over the last year, we have consistently heard that their #1 problem is not any one part of the stack. Rather, by far the most common “biggest pain point” is how to assemble and maintain the overall “full stack”.
Accelerating the Path to Production
One of Cake’s earliest customers is a fast-growing healthcare technology startup that builds complex machine learning systems using petabyte-scale data from multiple hospitals. As we got to know the team, we felt the frustration and anxiety caused by tight customer deadlines, enormous technical challenges, and a talented team without sufficient bandwidth for the task at hand. There wasn’t time to make hires – let alone assemble the platform.
The partnership involved more all-night hacking sessions than planned on our part, but we saw the magic of providing a modular, pre-integrated environment composed of popular open source components. We were in production in under three weeks (just in time for a critical customer deadline) rather than an untenable estimated 6-9 months. Our collaboration produced results that helped unlock new business for our customer and push forward the boundary of clinical and research science. This customer is now expanding quickly, with AI in production serving some of the largest global pharmaceutical companies at high-value decision points in clinical trials.
A Universal Challenge for SMB and Mid-Market AI Teams
There are tens of thousands of similar small and midsized businesses with ambitious AI goals that are bottlenecked by engineering bandwidth on the “full stack” problem. Underlying infrastructure is notoriously difficult to stand up and maintain. Open source research breakthroughs come with security and reliability concerns. Initial configuration and integration is already hard enough, to say nothing of the ongoing effort to keep a system up-to-date in a rapidly evolving landscape.
Ideally, small teams could engage with cutting-edge AI research via a system that meets three criteria:
Managed Infrastructure – a fully managed underlying compute environment that enables focus on AI-enabled business use cases and customer priorities
Rapid Integration – a rapid, secure, and reliable pathway for integrating new breakthrough technologies into the environment
Easy Customization – a straightforward path to solving the inevitable “last mile” challenges via customization of the entire stack
Unfortunately, none of these capabilities has historically been available outside of the largest and best-capitalized organizations. Infrastructure management is hard enough on its own; adding a future-proof system that enables rapid integration of new technologies for production use, along with ongoing customization, is simply out of reach for 99% of businesses.
A New Option
Cake’s mission is to accelerate the widespread distribution of AI, ensuring that frontier innovation solves humanity’s hardest problems.
That mission required providing smaller organizations a radically different option:
The control of a build with the ease of a buy.
Cake offers a new paradigm that finally unlocks the horizon-expanding potential of cutting-edge AI for businesses of any size.
Cake fulfills this vision with a new architecture that separates the environment’s underlying platform capabilities from the AI software running in the environment. Underlying infrastructure includes concerns such as security, compute management, and cost optimization. The AI capabilities comprise curated pre-integrated sets of popular open source AI/ML components.
Cake Architecture
How Cake Fulfills its Promise
Cake delivers on each of the three criteria listed earlier.
Managed Infrastructure – Cake handles underlying infrastructure on Kubernetes, including security, user management, RBAC, cost and compute management, monitoring, and alerting. Cake is SOC 2 compliant and is deployed in a customer-controlled environment, solving many common concerns around data privacy and security.
Rapid Integration – Cake continuously curates and integrates popular open source technologies. These components are immediately available to all customers. Cake handles all of the version updates, security patches, integrations, testing, etc.
Easy Customization – The Cake stack is open source, modular, and available for last-mile customization within a secure environment. Customers can easily swap new technologies in and out of their environment without worrying about hand-rolling new integrations or security features for recent open source projects.
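The swap-without-rework idea above can be pictured with a small sketch. Everything here is hypothetical — the slot names, the default components, and the `swap_component` helper are illustrative only, not Cake’s actual API — but it captures the point: when a stack is declared as pluggable slots, replacing one component leaves the rest of the environment untouched.

```python
# Toy illustration of a modular, declarative AI stack.
# All names below are hypothetical examples, not Cake's real interface.

DEFAULT_STACK = {
    "orchestration": "kubernetes",
    "vector_store": "milvus",
    "serving": "kserve",
    "monitoring": "prometheus",
}

def swap_component(stack: dict, slot: str, component: str) -> dict:
    """Return a new stack with one slot replaced, leaving all others intact."""
    if slot not in stack:
        raise KeyError(f"unknown slot: {slot}")
    return {**stack, slot: component}

# Swap the vector store without touching orchestration, serving, or monitoring.
custom = swap_component(DEFAULT_STACK, "vector_store", "qdrant")
```

In a pre-integrated environment, the integration and security work behind each slot is done once centrally, so a swap like this is a configuration change rather than a new engineering project.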
Early Proof Points
The most exciting part of the Cake story so far is seeing customers achieve their required level of customization and control at a fraction of the cost and time that would otherwise be required. Cake customers are already:
Running RAG on hundreds of millions of PDF pages
Operating distributed (federated learning) medical image analysis model training across hospitals on petabyte-scale data
Operating mission-critical national scale insurtech workloads
Developing and deploying many additional sophisticated and scaled AI/ML use cases
Cake customers usually reach production 6-12 months earlier than expected, with 80% lower total costs than budgeted. Our north star in 2025 is to continue rolling out capabilities that further accelerate our customers on their ambitious AI roadmaps.
It Takes a Village
I am thrilled to thank our investors at Gradient and Primary. Darian Shirazi, Brian Schechter, and the broader teams at both firms have been steadfast in their support of the Cake mission and vision. The $13M in funding fuels continued growth of the team, product, and customer base.
I’d also like to thank every member of the growing Cake team for their incredible work every day. It is amazing to see our customers accelerating their use cases into production on the Cake platform. Of course, thank you to Skyler, my co-founder and the architect of the Cake platform (read his blog post here). Finally, a huge thank you to our early customers who took the leap early in our product development journey. I hope you are as excited about our collaboration so far as we are.
Cake is growing quickly. Check out our open roles here, or reach out for a demo if you’re battling hard AI infrastructure problems.