AI: What is it good for?
- tm5633
- Mar 20
- 4 min read
(Q2 2024 Letter)
The theory behind artificial intelligence was first articulated by Alan Turing in 1950 when he posed the question of whether machines could think. Explosive growth in computing technology over the ensuing seven decades brought that theory closer to reality, but it was the November 2022 release of ChatGPT that moved artificial intelligence from the purview of data scientists and technologists into the consciousness of consumers. Today, despite all the excitement, artificial intelligence at its core is simply fast computing power capable of processing large amounts of data and recognizing and extrapolating patterns.
For all the promise of artificial intelligence, the historical context of technology platform shifts and capital cycles matters. Six main platform shifts have created new technology and distribution opportunities: mainframe computers, personal computers, networking, desktop internet (Web 1.0), mobile internet (Web 2.0), and cloud SaaS, with Generative AI possibly the seventh. Each of these platform shifts succeeded in part by developing an ecosystem of complementary businesses, content creators, and developers that helped drive reinforcing consumer demand. By pushing forward innovation that was discontinuous with the standards and infrastructure already in place, these platform shifts systematically disabled the moats of the incumbents as users adopted the new technology and stranded the now-obsolete incumbent infrastructure.
Because discontinuous platform shifts have historically laid waste to incumbent competitive advantages, the incentive structure facing the mega-cap technology incumbents favors overinvesting in any new thing when the alternative is underinvesting and missing a potential platform shift. While this dynamic was certainly true in past potential platform shifts that have so far produced limited real-world applications (virtual reality, the metaverse, and blockchain, to name a few recent ones), the financial backers of this cycle (big tech, sovereign nations) are more intertwined and driven more by supremacy than by adherence to return on invested capital. For businesses whose historical performance rests on their capital-light, scalable, high-return-on-invested-capital attributes, a ramp in capital intensity of upwards of one trillion dollars, with all the depreciation and amortization yet to hit income statements, is concerning. To a skeptical eye, artificial intelligence is beginning to look like a capital cycle on steroids.
In a typical capital cycle, capital is attracted to a high-return business as managers seek growth by expanding production. Competitors, not wanting to miss any growth opportunity, embark upon similar expansions sized to current demand, even though the new supply will not arrive until well into the future. Analysts and investors often mistake this cyclical tailwind for a secular one and extrapolate it, putting elevated multiples on what prove to be temporary revenues and earnings. As the industry-wide expansion comes online, new supply floods the market; businesses cut prices, returns fall below the cost of capital, valuations collapse, and businesses shut down. Even with the benefit of hindsight, human nature does not change, and this cycle repeats itself time and again.
As it relates to artificial intelligence, there is little evidence yet that this capital cycle is any different. A massive inflow of investment from big tech into the foundational infrastructure layer is funding AI startups that then spend that capital back with big tech to secure access to GPUs, data centers, and computational power, accelerating big tech's own demand. On top of this effective vendor financing, venture capitalists are adding fuel to the fire by putting more than half of their investments into artificial intelligence businesses and, in some cases, stockpiling GPUs to use as enticements to win lead positions in funding rounds. Is artificial intelligence simply a capital cycle on steroids, in effect putting high multiples on a temporary stimulus, or is this a familiar case of the limits of one's vision being mistaken for the limits of the world? Using a simple definition of AI as the manufacturing of intelligence from large amounts of data, there is a tremendous number of possible use cases, but there is a very real difference between what is possible, probable, and profitable.
What is possible is that, as artificial intelligence becomes accessible enough for real-world applications to be developed, more parts of the economy become white space for the technology sector. As businesses look to use the technology to remove production bottlenecks and unlock productivity gains, they will need to increase capital expenditures and operating expenses. The question remains whether businesses will be compensated for this increased spending or whether the rules of competition dictate that any gains are competed away before sustainable pricing and margin gains accrue. When one thinks of an iPhone as a remote control for the physical world, it is easy to imagine all the ways artificial intelligence could overlay a digital world onto the physical one, sparking a revolution in applications and business models from traditional manufacturing to materials science and personalized medicine. However, the lack of any meaningful real-world innovation in the application layer of artificial intelligence puts this squarely in the possible category today. The challenge lies in separating the possible from the probable, and the probable from the profitable.
Like any model, the quality of an artificial intelligence model is determined by the quality and relevance of its data inputs. While the model itself is a commodity, those with proven competence in data collection and management stand to accrue outsized competitive advantages should advances in artificial intelligence permeate the application layer. Our focus on repeatability in a business model means that the consumable nature of our businesses' products or services reinforces a well-developed data stream to inform and drive strategy. This data stream is a closed loop: better data improves the product, the improved product gets used more, and greater usage generates more data to inform the model and further improve the product. Our businesses' incumbent oligopoly or monopoly positions in their niches mean that they not only already possess treasure troves of data but also have the scale to refine processes for data collection and interpretation and the distribution to push products. In short, should artificial intelligence develop an application layer that drives a meaningful paradigm shift, our businesses are in the enviable position of benefiting from it without being dependent upon it for their respective paths to value creation.