written by
Major Tom

Accelerating the AI/ML Revolution: AI’s Adoption Has Arrived


AI’s use cases have accelerated with generative AI. Whether ChatGPT or “generative art,” viable use cases are already crossing over. With the models and infrastructure being made available at the application layer, what occurred recently in SaaS will occur even more quickly with AI Apps.

SaaS Started Big, then Niched Down

SaaS initially attacked the market held by on-premise enterprise giants. These large companies (Oracle, SAP) provided comprehensive suites across departments and functions: HR, Finance, Manufacturing.

With its SaaS model, Salesforce entered the smaller businesses that wouldn’t normally pay hundreds of thousands, if not millions, in license and consulting fees. It captured and led the market in very large functions: sales, marketing, service.

Vertical players, including those building on top of Salesforce, followed next. The poster child of near-perfect execution was Veeva, which built on top of the Salesforce platform but focused on the life sciences industry.

As computing became easier and less costly, “add-ons” to the Salesforce ecosystem niched down and helped the gorilla deliver a “total solution” through deeper functionality. Although many SaaS products developed independently of Salesforce, the Salesforce App Marketplace is a good indicator of SaaS app speciation.

Driving this proliferation of similar, if not outright copy-cat, solutions is the continued reduction in computing cost, the diversification of acquisition channels, and developer tools that enable a single solo developer to build a viable SaaS product.

The winners in this explosion of niched apps are the customers (more choice, better solution fit), compute and infrastructure providers, and the builders who can capture the surplus value on their time.

AI will see a similar curve, just faster

AI will likely see a similar adoption and speciation, just faster.
Some of the limiting factors that had to fall to accelerate what occurred in SaaS are already out of the way: cheap compute, developer tooling and frameworks, and a large TAM of non-consumers (the mid-market and SMBs normally priced out of legacy software).

What will accelerate this pace?

First, any company that runs software will want to extract the productivity gains from AI. The hard “lift” for SaaS adoption was getting business decision-makers to understand the gains from a “digital transformation.” This mental and operating shift was so hard for large companies that consulting firms had to guide them through the process and articulate the benefits.

That education phase is largely done. AI is an extension of the “productivity, efficiency” story of digital transformation. Business demand is there.

Second, now that the popular narrative of “general AI” is returning to more realistic and practical levels with use cases around ChatGPT and other generative AI, companies can more readily think of areas of opportunity. Not only can they think of them, but the AI primitives likely already exist.

Third, the growth of SaaS still required VC investment to bring together a team to build and sell their niche solutions. Although a good number were able to bootstrap themselves, the development cycle still required a team and some time.

We’re seeing that fall dramatically with solo SaaS builders, but differentiation, distribution, and product fit (if not a copy-cat) are still hard and take time.

But AI “entrepreneurs” could be researchers, model builders, and data scientists who work on the underlying primitives. These entrepreneurs differentiate not on UI or features; they stand out and build demand more like a solo algorithmic trader with a good idea, by sheer intellectual insight.

In the same way that solo SaaS developers can outsource to other SaaS providers much of their operational needs (from compute, to CI/CD, to security scanning), so will these ML/DS researchers and algorithmic innovators.

They will be able to test, publish, and prove their models without a team.

When downstream consumers, which can be other ML builders, AI application providers, or enterprises, then use their models, these ML builders can monetize and build a sustainable business.

All without creating a team.


In some ways, they are the ultimate “creators” in the creator economy.
There are already very profitable solo creators selling courses or writing books. The distribution and production infrastructure is readily available and low cost. But the differentiation and value-creation side seems limited.

In other words, how different are the writers and course creators teaching how to market a business, really? Usually they set themselves apart in brand, positioning, and marketing messaging. The actual utility and “features” are largely the same.

Similarly, the utility isn’t usually large enough to extend to enterprises. Many that are successful appeal to other entrepreneurs, small businesses, freelancers, and growth-minded employees. This is a large market with people willing to spend.

But AI capabilities can provide an order of magnitude greater value to a large enterprise.

Fourth, the features, or AI primitives, are composable. This is not a zero-sum game among the AI creators.

In the case of single-purpose apps, it’s hard for the user to justify having more than one similar app.

In the case of a writer or course creator, it’s possible the customer can purchase more than one. For example, people do buy multiple leadership, productivity, and relationship-building books and courses. But those are Superconsumers, and at some point, it’s a diminishing return.

My hunch is that AI primitives and their composability actually create a much larger design space. It’s more akin to Legos, where the end user puts together their application based on different capabilities.
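To make the Lego analogy concrete, here is a minimal Python sketch. Every “primitive” below is a hypothetical stand-in rather than a real model or API: three independently published capabilities get chained into an application the end user assembles themselves.

```python
# Hypothetical sketch: composing AI "primitives" like Lego bricks.
# None of these functions call real models; each is a placeholder for
# a primitive an independent AI creator might publish.

def transcribe(audio: bytes) -> str:
    """Stand-in for a speech-to-text primitive."""
    return "quarterly revenue grew twelve percent"

def summarize(text: str) -> str:
    """Stand-in for a summarization primitive."""
    return text.split(";")[0][:60]

def classify_sentiment(text: str) -> str:
    """Stand-in for a sentiment-analysis primitive."""
    return "positive" if "grew" in text else "neutral"

def compose(*steps):
    """Chain primitives into an application-specific pipeline."""
    def pipeline(x):
        for step in steps:
            x = step(x)
        return x
    return pipeline

# An end user assembles a "meeting notes" app from three primitives
# without building any model themselves.
notes_app = compose(transcribe, summarize, classify_sentiment)
print(notes_app(b"..."))  # prints "positive"
```

The point is not the toy logic but the shape: each primitive is independently built and monetized, and the composition is where the end user’s application lives.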

This is the same premise behind AWS building primitives for developers. With more than 15 database services alone, AWS has embraced the concept of building primitives and letting innovation bloom.

The number of AI primitives will more likely be 15,000, not 15, and maybe even more. Each time a new open-source foundation model is released, thousands more models will follow.

Who benefits from this rapid speciation?

The “AI Creator” will benefit the most. With nominal marketing (likely content marketing and publishing research in the open), AI Creators can work with a laptop, a data notebook like Jupyter, and a dataset, either their own or from the public domain, and build a large, viable business.

The customers who consume these models can benefit, as long as they have easy access, deployment, verification, testing, security, and the like. These customers can range from a small mobile app developer using No-Code to a large enterprise customer. The usage-based pricing model supports this wide range.
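As an illustration of how a single usage-based meter spans that range, here is a toy calculation; the per-call rate is purely an assumption for the sketch, not any provider’s actual pricing:

```python
# Toy usage-based pricing meter. The rate below is an assumed,
# purely illustrative number, not a real provider's price.
PRICE_PER_1K_CALLS = 0.40  # dollars per 1,000 model calls (assumed)

def monthly_bill(calls: int) -> float:
    """Same meter for every customer; only the volume differs."""
    return round(calls / 1000 * PRICE_PER_1K_CALLS, 2)

print(monthly_bill(5_000))       # small no-code app: 2.0
print(monthly_bill(20_000_000))  # large enterprise: 8000.0
```

The same model, exposed through the same meter, serves a hobbyist at a few dollars a month and an enterprise at thousands, which is exactly why a solo builder can address both.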

The infrastructure providers benefit. I think this is the time when reclaiming compute, as well as end-to-end MLOps, from BigTech is both possible and better for the ecosystem.

What is the critical enabler?

The critical enabler is to serve two primary end-users: the consumer of the models and the builder.

On the one hand, builders need an easy way to make their models consumable with little to no infrastructure and operational demands on them. It should be set and forget.

On the other hand, the consumers, whether enterprise or start-up, benefit when they don’t have to worry about standing up production-grade infrastructure and operations. They want to take existing models and either go directly to production or be able to modify, enhance, and iterate on them easily.
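A minimal sketch of that builder/consumer split, using only the Python standard library: the builder publishes a model behind one endpoint and walks away; the consumer only POSTs features and reads back a score. The “model” here is a hypothetical toy scorer, not a real trained artifact.

```python
# Minimal "set and forget" serving sketch using only the stdlib.
# The model is a hypothetical placeholder; a real deployment would
# load trained weights instead of hard-coding them.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    """Hypothetical model primitive: a toy linear scorer."""
    weights = [0.5, -0.25, 1.0]
    return sum(w * x for w, x in zip(weights, features))

class ModelHandler(BaseHTTPRequestHandler):
    """The builder publishes this once; consumers just POST features."""
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        features = json.loads(body)["features"]
        response = json.dumps({"score": predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(response)

if __name__ == "__main__":
    # Consumers hit http://localhost:8000 with {"features": [...]}.
    HTTPServer(("", 8000), ModelHandler).serve_forever()
```

In practice the hosting, scaling, and metering around this endpoint is exactly what the builder wants to outsource; the sketch only shows how thin the surface between builder and consumer can be.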