While I was at Kruze, I compared the costs of traditional SaaS businesses to those of our AI clients. Not surprisingly, their costs were much higher, particularly the compute costs. While AI revenue grows quickly, costs rise even faster. Over the past year, AI compute expenses surged 300%, reaching 50% of revenue, while SaaS costs held steady at 18%.
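To make the margin gap concrete, here's a minimal sketch of what those cost ratios imply for gross margin. The revenue figure is hypothetical; only the 50% and 18% cost-of-revenue ratios come from the comparison above.

```python
def gross_margin(revenue: float, cost_ratio: float) -> float:
    """Gross margin as a fraction of revenue, given cost of revenue as a ratio."""
    return (revenue - revenue * cost_ratio) / revenue

# Hypothetical $10M-revenue company under each cost profile.
ai_margin = gross_margin(10_000_000, 0.50)    # AI: compute at 50% of revenue
saas_margin = gross_margin(10_000_000, 0.18)  # SaaS: costs at 18% of revenue

print(f"AI gross margin:   {ai_margin:.0%}")    # 50%
print(f"SaaS gross margin: {saas_margin:.0%}")  # 82%
```

Put that way, an AI company keeps roughly 50 cents of every revenue dollar before operating expenses, versus about 82 cents for a typical SaaS business.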
Even with cheaper models like DeepSeek, supposedly trained for under $10M, running AI at scale will remain far more expensive than SaaS in the near term, since most SaaS businesses require relatively little backend compute.
I do think that the super-easy money will eventually end – all of these bubbles do – and the best AI companies will start to focus on efficiency. It seems reasonable that more focused, point-solution LLMs can be built for less. For example, I don’t need a math-focused LLM to know how to make sourdough bread…
I haven’t seen many AI founders interested in efficiency yet, but it feels reasonable that sooner or later the best founders will realize that VCs are becoming less tolerant of high burn and will start demanding better unit economics.