Untangling the AI Economy

Amid the breathless pronouncements of how AI will upend existing socio-economic paradigms, rendering whole fields of work redundant and potentially creating the need for a "post-labour" economics, have come several similarly eye-watering investment announcements. In the United States, the Stargate Artificial Intelligence (AI)1 infrastructure project announced by the current US President Donald Trump schedules 500 billion dollars of investment in large-scale datacentres over four years, with the first 100 billion dollars, leveraged from a group of equity funders including SoftBank, OpenAI, Oracle and MGX, allocated immediately to construct two datacentres in Abilene, Texas, currently expected to be completed before the end of 2025. In the European Union (EU), the InvestAI initiative2 aims to mobilize around 200 billion euros, including a new European fund of around 20 billion euros, to create AI gigafactories. These gigafactories3 are envisioned as high-capacity AI infrastructure hubs built on extensive computing infrastructure: where existing AI factories typically contain around 25,000 AI chips, each gigafactory would use around 100,000 to develop specialized models in areas such as medicine and science.

Together these are intended to operate much like fixed capital, "enabling AI companies, in particular SMEs and startups, as well as researchers in different scientific disciplines" and "supporting the development of AI solutions tailored to the needs of different industrial sectors, public authorities and scientific disciplines". Such technology does not come cheaply, however, with each factory predicted to cost around 3-5 billion euros, hence requiring what the European High Performance Computing Joint Undertaking (EuroHPC JU) refers to as "a more industrial and market-driven approach", potentially involving public-private partnerships across EU member states and other states participating in the initiative. Even so, the EuroHPC JU still sees a large amount of funding being provided by member states on a case-by-case basis, and a brief section in the public AI gigafactories consultation mentions state funding for AI gigafactories reaching around 35% of total investment.

As in other public-private partnerships, one suggested use of this funding has been "to help de-risk private investments", helpfully explained in the footnotes as "mechanisms where public funds reduce the financial exposure or uncertainty faced by private partners, thereby making the investment proposition more attractive and viable." Given much of the current coverage of AI development, it is quite understandable why governments and other supranational institutions might want to take a risk on funding it. In 2024 the management consultancy Bain & Company predicted the AI market could reach between $780 and $990 billion,4 referencing the 2024 Q3 earnings call on which NVIDIA CEO Jensen Huang stated, "Generative AI is the largest TAM [total addressable market] expansion of software and hardware that we've seen in several decades."

Bain predicted 40-55% growth in AI hardware and software over the next three years, with fluctuations in supply and demand creating volatility along the way but an overall long-term, durable upward trend. When looking at where this growth is predicted to come from, however, a more complex picture emerges. A great part is expected to come from new models requiring greater computational power, infrastructure and energy usage, with the expectation that they will also deliver gains in intelligence and performance. The expansion of these models is expected to drive further demand for a range of inputs including graphics processing units (GPUs), substrates, silicon photonics and power generation. This, as detailed in the report, is somewhat complicated by certain algorithmic innovations, including Retrieval-Augmented Generation (RAG), which rather than having LLMs rely solely on pre-trained data allows models to retrieve data from relevant external sources, such as documents, databases or other information repositories.
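In outline, the RAG pattern described above works by fetching relevant documents at query time and prepending them to the model's prompt. The following is a minimal sketch of that flow; the tiny corpus, the naive keyword-overlap retriever and the prompt format are illustrative assumptions, not any vendor's actual API, and a production system would use learned embeddings and a real vector store.

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG): retrieve
# external documents relevant to a query, then build an augmented prompt
# so the model answers from retrieved data rather than pre-trained memory.

def retrieve(query: str, corpus: dict[str, str], k: int = 1) -> list[str]:
    """Rank documents by naive keyword overlap with the query (toy retriever)."""
    q_terms = set(query.lower().split())
    scored = sorted(
        corpus.items(),
        key=lambda item: len(q_terms & set(item[1].lower().split())),
        reverse=True,
    )
    return [doc_id for doc_id, _ in scored[:k]]

def build_prompt(query: str, corpus: dict[str, str]) -> str:
    """Prepend the retrieved context, so the LLM can ground its answer in it."""
    context = "\n".join(corpus[d] for d in retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical internal documents standing in for an external data source.
corpus = {
    "hr": "holiday requests must be filed ten days in advance",
    "it": "gpu cluster access requires a ticket and manager approval",
}
print(build_prompt("how do I get gpu cluster access?", corpus))
```

The point of the pattern, and why the report treats it as complicating demand forecasts, is that the heavy lifting shifts from ever-larger pre-training runs to comparatively cheap retrieval at inference time.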

The other innovation is vector embeddings, which transform complex data such as images or videos into numerical representations. Together these techniques regularly handle much of the computing, storage and networking closer to where the data is stored, reducing latency and making data more secure and private. In this context, smaller language models developed for specific tasks, built on open-source frameworks such as Meta's Llama, Mistral and Falcon or alongside proprietary offerings such as Anthropic's Claude and Google's Gemini, are expected to become more important, as they promise lower overall energy costs and greater efficiency than general-purpose language models. Another area for growth is the increasing adoption of LLM-enabled software by existing independent software companies such as Adobe, Microsoft and various others, which is expected to allow such companies to easily add generative AI capabilities to their existing software suites rather than developing custom solutions.
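The embedding idea above can be illustrated in a few lines: data is mapped to vectors so that "nearby" vectors mean similar content, and search becomes a nearest-neighbour lookup. Real systems use learned neural embeddings over text, images or video; the bag-of-words vectors, fixed vocabulary and two-document corpus below are stand-ins chosen only to keep the sketch self-contained.

```python
# Toy vector embeddings: map text to a numerical vector, then find the
# most similar stored document by cosine similarity.
import math

VOCAB = ["invoice", "payment", "overdue", "gpu", "cluster", "training"]

def embed(text: str) -> list[float]:
    """A crude 'embedding': per-term counts over a fixed vocabulary."""
    words = text.lower().split()
    return [float(words.count(term)) for term in VOCAB]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

docs = {
    "billing": "overdue invoice payment reminder",
    "ml-ops": "gpu cluster training job failed",
}
query = embed("second payment invoice still overdue")
best = max(docs, key=lambda d: cosine(query, embed(docs[d])))
print(best)  # the billing note is the nearest neighbour
```

Because similarity search like this can run on commodity hardware close to where the data lives, it is part of why such workloads reduce latency and keep data local, as noted above.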

Further innovations, and supposedly further potential sources of revenue, can be seen in the increasing verticalization of the technology stack, that is to say an increased specialization of tech companies towards AI development, including custom silicon for AI computation such as Google's Tensor Processing Units (TPUs), Amazon's Graviton and Meta's Meta Training and Inference Accelerator (MTIA). NVIDIA, which this year became one of the first companies to reach a $4 trillion market value, has alongside increased demand for its AI products also looked to expand its hardware offerings, unveiling new compute fabrics: NVIDIA's attempt to unify different computing elements such as GPUs, CPUs and networking under a single architecture, allowing workloads and data responsibilities to be shared.

Real Demand

While this significant investment is matched by considerable existing and ongoing innovation in the computing space, one area which appears to be lagging is current application, and correspondingly the returns derived from this investment. A report by the Massachusetts Institute of Technology (MIT)5 revealed that of firms widely adopting AI as part of their workflow, only around 5% were extracting any real value, with the rest remaining "stuck with no measurable P&L impact". A review of the work performed by AI agents by Carnegie Mellon University6 revealed that one of the best-performing AI agents, Google's Gemini Pro, was still failing to complete real-world office tasks around 70 percent of the time, with other agents performing significantly worse.

With the accounting firm PwC suggesting AI would contribute around six trillion dollars to the global economy by 2030, the productivity gains required from AI would have to be almost exponential. An article in MoneyWeek7 made similar observations, starting from the law of conservation of value, which states that prices cannot stray too far or for too long from value, and that value depends, among other things, on output. Investors in this scenario should be able to look at future streams of income and work out from them how to recoup their initial investment. With significant amounts being invested in seven technology firms, namely Nvidia, Amazon, Meta, Apple, Tesla, Microsoft and Google, one should expect to see revenue in line with the earlier quoted sums. So far, however, the revenue has been around $35 billion; for this to be commensurate with the level of investment it should be much higher (a typical financial analysis would put the amount around $600 billion).
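The scale of that gap is worth making explicit. The figures below are the ones quoted in the text; the rest is simple division, not a full discounted-cash-flow model.

```python
# Back-of-envelope check of the revenue gap described above, using the
# $600bn required / $35bn actual figures quoted in the text.
required_revenue_bn = 600   # revenue a typical financial analysis would expect
actual_revenue_bn = 35      # AI-attributable revenue seen so far

shortfall_bn = required_revenue_bn - actual_revenue_bn
multiple_needed = required_revenue_bn / actual_revenue_bn
print(f"shortfall ~${shortfall_bn}bn; revenue would need to grow "
      f"~{multiple_needed:.0f}x to justify the investment")
```

On those numbers, AI revenue would need to grow roughly seventeen-fold simply to meet the level a conventional analysis would demand of the capital already deployed.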

The distorting effect of this speculation is becoming increasingly clear, with 26% of the growth seen in the S&P 500 appearing to be attributable to investment in the big seven, while profits in markets like consumer goods, financial services and raw materials lag significantly behind the seemingly inexorably climbing stock prices. Returning to MoneyWeek's law of conservation of value, something would appear to be amiss, and indeed it can be found in measurements like corporate profits barely moving from the second quarter of the year before.

Similarly, how much of the market for AI operates in practice reveals in many ways the paradox at the heart of the AI economy. In the Atlantic's coverage of CoreWeave, one of the companies cited as being at the heart of the AI revolution, they noted that a key part of how the company operated was buying up high-end hardware, developing datacentres and then leasing those datacentres to companies unwilling to take on the cost of that infrastructure themselves, including NVIDIA itself, meaning that streams of finance stemming from NVIDIA are essentially being used to create at least part of the demand for NVIDIA's own products.

Indeed, many of CoreWeave's customers are companies already operating in this space, with 70% of its revenue coming from Microsoft and smaller amounts from NVIDIA and OpenAI. The financing for this, outside some of the initial IPO proceeds, has largely been debt, with the company so far spending upwards of 19 billion dollars whilst bringing in 5 billion dollars in revenue, and creating several newly formed business entities to take on debt on CoreWeave's behalf. This mode of operation is not unique to CoreWeave, with companies like Amazon recently issuing around 15 billion dollars worth of debt via a six-part deal, following similar moves by Google, Meta and Oracle. Such behaviour can typically, though not only, be explained by a shift towards capital-intensive investment, repaying debt maturities and broader business investment; however, it is somewhat premised on companies being able to recapture this value within a reasonable time, something which, as the financial reporting outlet Morningstar suggested, investors might be looking at more keenly in 2026.

Similarly, the partnership CoreWeave has with NVIDIA is not entirely dissimilar to those of multiple other companies specializing in AI products. Anthropic and OpenAI both have partnerships with NVIDIA allowing them to exchange equity for computing infrastructure, due in part to their lack of immediate cash flow compared to NVIDIA (one of the few companies mentioned so far that has not relied largely on new debt issuance for investment).

In similar fashion to CoreWeave, OpenAI is expected to make only around 10 billion dollars in revenue while losing around 15 billion this year, with the expectation that it will only become profitable in 2029. AI spending, according to some estimates, is set to surpass around 400 billion dollars against 60 billion dollars in revenue. Advocates of AI nevertheless regularly caution against looking too critically at the nature of this spending, with VCs such as Magnus Grimeland citing the fast pace of adoption. However, in the article covering his comments, the figures cited, such as the 10 billion in revenue made by OpenAI, do not quite match up to the capital expenditure previously mentioned; nor do they mention the various instruments involved, including the further financialization of debt used to purchase datacentres, or GPUs themselves reportedly being used as collateral for chip purchases, this of course being at least partly premised on the underlying assets continuing to hold their value.

AI & Accumulation

Over the course of this year, spending on AI capex, which can in part be seen as spending on information-processing hardware and software, has for many economies already been seen to outstrip consumer purchasing power, resolving, at least for now, the surplus absorption problem, but in actuality papering over weaknesses in the real economy, most notably the lack of profitable investment opportunities and, in many countries, increasingly poor jobs data and higher costs. The spectre of over-accumulation looms large over the AI economy, with investment in fixed capital so far running up against underutilization. This continues to be handwaved away in recent statements by figures such as Huang, who frequently speaks of an exponential demand for both AI products and computing power, claims which at this point in time seem more akin to a temporal fix, in other words a claim on future profitability, than an accurate description of current trends.

The continuing struggle to valorise capital, due in part to faltering purchasing power, also places a cap on future profitability and further weakens the strength of labour. While venture capitalist and White House 'AI and Crypto Czar' David Sacks has already stated there will be no AI bailout, his comments mask continued state intervention in the current AI economy, not only in the public-private partnerships driving investment in datacentres but also in the continuous adoption of AI technologies by state entities.

More recently, US President Donald Trump signed an executive order instructing federal institutions to allow 401(k) holders to invest directly in alternative assets like the private credit often used to support investments in AI products. Similarly the UK government, represented by Science Minister Lord Vallance, has urged pension fund investment in the sci-tech "unicorns of the 2030s", joining corporate spending as part of a push towards increasingly speculative investment, as opposed to an expansion of productive capacity and a subsequent reduction in the rate of profit.

The beginning of the end?

The events of last week, which saw around $1 trillion wiped off the stock market, signal, if not the end, then a substantial change in some of the flows of speculative investment, with news outlets such as CNBC remarking that even huge e-commerce platforms such as Amazon were generating concerns with their current levels of spending. While articles praising the latest AI models continue to trickle into the media ecosystem following the release of Claude 4.6, investors looking at the actual returns on LLM build-outs appear far more cautious for now, with some analysts questioning Amazon in particular on its prioritization of AI over its already tried-and-tested web services model.

Apple, by contrast, with its slightly more cautious approach to AI compared to many other firms, saw its stock price increase by 7% in the same period. Behind the hype, real questions are being asked not only about the productivity increases these technologies can be expected to deliver but, perhaps more crucially, about the time frame.

Notes

  1. https://www.techtarget.com/whatis/feature/Stargate-AI-explained-Whats-in-the-project
  2. https://ec.europa.eu/commission/presscorner/detail/en/ip_25_467
  3. https://eurohpc-ju.europa.eu/public-consultation-ai-gigafactories-2025-04-09_en
  4. https://www.bain.com/insights/ais-trillion-dollar-opportunity-tech-report-2024/
  5. https://mlq.ai/media/quarterly_decks/v0.1_State_of_AI_in_Business_2025_Report.pdf
  6. https://futurism.com/ai-agents-failing-industry
  7. https://moneyweek.com/investments/tech-stocks/the-most-likely-outcome-of-the-ai-boom-is-a-big-fall
  8. https://futurism.com/ai-bubble-economy-bleak
  9. https://www.theatlantic.com/economy/2025/12/nvidia-ai-financing-deals/685197/
  10. https://fortune.com/2025/08/11/data-centers-are-eating-the-economy-and-were-not-even-using-them/
  11. https://www.cnbc.com/2026/02/06/ai-sell-off-stocks-amazon-oracle.html