Metadata
- Author: mattturck
- Full Title:: Full Steam Ahead: The 2024 MAD (Machine Learning, AI & Data) Landscape
- Category:: 🗞️Articles
- URL:: https://news.dataelixir.com/t/t-l-etyikid-yuuitdetk-y/
- Finished date:: 2024-04-04
Highlights
The intensely (insanely?) crowded nature of the landscape primarily results from two back-to-back massive waves of company creation and funding. (View Highlight)
The first wave was the 10-ish year long data infrastructure cycle, which started with Big Data and ended with the Modern Data Stack. The long-awaited consolidation in that space has not quite happened yet, and the vast majority of the companies are still around. (View Highlight)
The second wave is the ML/AI cycle, which started in earnest with Generative AI (View Highlight)
Note: those two waves are intimately related. A core idea of the MAD Landscape every year has been to show the symbiotic relationship between data infrastructure (on the left side); analytics/BI and ML/AI (in the middle); and applications (on the right side). (View Highlight)
New highlights added 2024-04-04
Two big waves + limited consolidation = lots of companies on the landscape. (View Highlight)
New highlights added 2024-04-06
Unstructured data (ML/AI) is hot; structured data (Modern Data Stack, etc) is not. (View Highlight)
Many startups in and around the Modern Data Stack will aggressively reposition as “AI infra startups” and try to find a spot in the Modern AI Stack (see below) (View Highlight)
Many people in the BI industry are skeptical, however. The precision of SQL and the nuances of understanding the business context behind a query are considered big obstacles to automation. (View Highlight)
New highlights added 2024-04-07
In consumer, AI apps show high churn. How much of it was mere curiosity? (View Highlight)
Billions of venture capital and corporate money are being invested in foundational model companies. Hence everyone’s favorite question in the last 18 months: are we witnessing a phenomenal incineration of capital into ultimately commoditized products? Or are those LLM providers the new AWS, Azure and GCP? (View Highlight)
Perhaps the analogy with cloud vendors is indeed pretty apt. AWS, Azure and GCP attract and retain customers through an application/tooling layer and monetize through a compute/storage layer that is largely undifferentiated. (View Highlight)
Most importantly, they will often combine them – LLMs may not be great at providing a precise prediction, like a churn forecast, but you could use an LLM that calls on the output of another model which is focused on providing that prediction, and vice versa. (View Highlight)
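A minimal sketch of that "combine them" pattern, assuming a toy scikit-learn churn model and a hypothetical `call_llm` stand-in for whichever LLM API is in use: the specialized model supplies the precise number, and the LLM handles the language layer around it.

```python
# Sketch: pair a specialized predictive model with an LLM.
# The churn probability comes from a classical model; the LLM is only
# asked to reason/act on that output. `call_llm` is a hypothetical
# placeholder, not a specific vendor API.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy churn model on made-up features: (tenure_months, support_tickets).
X = np.array([[2, 5], [24, 0], [6, 3], [36, 1]])
y = np.array([1, 0, 1, 0])  # 1 = churned
churn_model = LogisticRegression().fit(X, y)

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for an LLM API call."""
    raise NotImplementedError

def churn_briefing(customer_features):
    # Precise prediction from the model focused on that task...
    p_churn = churn_model.predict_proba([customer_features])[0, 1]
    # ...then the LLM turns it into a plan for the account team.
    prompt = (
        f"A customer has an estimated churn probability of {p_churn:.0%}. "
        "Draft a short retention plan for the account team."
    )
    return call_llm(prompt)
```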
On the plus side, maybe all of us as users of Generative technologies should just enjoy the explosion of VC-subsidized free services:
VCs brought you cheap Ubers
VCs brought you cheap Airbnbs
VCs are bringing you cheap AI inference
YOU’RE WELCOME (View Highlight)
Perhaps the biggest winners of Generative AI in 2023 were the Accentures of the world, which reportedly generated $2B in fees for AI consulting. (View Highlight)
But we’re early in answering some of the key questions Global 2000-type companies face: What are the use cases? The low-hanging-fruit use cases so far have been mostly a) code generation co-pilots for developer teams, b) enterprise knowledge management (search, text summarization, translation, etc.), and c) AI chatbots for customer service (a use case that pre-dates Generative AI). (View Highlight)
One version of the question: AI makes it 10x faster to code, so with just a few average developers, you’ll be able to create a custom-made version of a SaaS product, tailored to your needs. Why pay a lot of money to a SaaS provider when you can build your own? (View Highlight)
The current financing environment is one of the “tale of two markets” situations, where there’s AI, and everything else. (View Highlight)
AI talent has always been rare, and today is no different. (View Highlight)