The AI Supercycle
October 2024
Supercycles
Supercycles are long-term economic or technological waves lasting decades, distinguishable from shorter business cycles. Most of us will experience only one or two supercycles in our professional lives, which makes witnessing one a rare vantage point on the forces that shape the economic landscape. These waves are driven chiefly by capital dynamics, technological advancement, and demographic shifts, and they have repeatedly redirected economic history. Notable periods include the post-Gold Standard era and the current fiat currency-driven system, each marked by supercycles that reshaped the global supply chain, from raw materials to finished goods.
Over the past 125 years, supercycles have defined economic patterns, with the mid-20th century as an exception due to restricted global capital flows under Bretton Woods. That system, established in 1944, combined fixed exchange rates with strict controls on international capital movements, which significantly dampened supercycle dynamics. These cycles typically begin with dramatic price movements in commodities, progress through manufacturing, and culminate in household consumption, alternating between inflationary and deflationary phases.
Historical examples include the Classical Supercycle (1870s–1930s), tied to the Gold Standard, which ended with the Great Depression of the 1930s. The Modern Supercycle was initiated by the Volcker Federal Reserve in the US in 1979. Theoretical frameworks like Kondratieff Waves and Carlota Perez's model offer valuable supercycle insights. Kondratieff Waves describe roughly 50-year cycles driven by innovation clusters and socio-economic shifts. Perez's framework outlines four phases of technological revolutions: irruption, where innovation creates new industries; frenzy, marked by speculative investments often detached from actual value; synergy, where financial and production capital align after speculative bubbles burst; and maturity, where saturation leads to stabilization and capital seeks new areas of growth.
Technological shifts have always been central to supercycles, with historical innovations like the steam engine and electricity transforming economies. In the modern context, technologies such as the internet, mobile, cloud, blockchain, and AI are defining the current supercycle. These advancements influence demand for rare earth resources, interconnected global supply chains, and capital markets, causing economic booms and busts across sectors.
Lessons from the smartphone/mobile revolution highlight that transformative technologies take time to mature. The iPhone launched in 2007, yet defining applications like WhatsApp and Uber emerged years later. Cloud computing followed the same pattern: the platforms appeared around 2007, but large SaaS businesses only began emerging around 2014. Similarly, the AI revolution, currently in its early irruption phase, will take time for its "killer apps" to appear. AI represents a fundamental shift from logical computation to human-like reasoning, creativity, and language understanding, comparable to the transition initiated by the internet revolution in 1993.
The AI Supercycle
The 'AI Supercycle' refers to a period of extraordinary growth and innovation in artificial intelligence, fueled by the explosion of big data, advanced machine learning algorithms, and widespread adoption of AI across industries. Leaders like Andrew Ng, Dario Amodei, and Kai-Fu Lee emphasize AI's potential to impact every industry and job, likening its significance to electricity. As AI advances, it promises new opportunities in fields like data science while automating many tasks and jobs.
Artificial General Intelligence, as envisioned, would surpass human intelligence in various domains, operate autonomously with access to a wide range of digital tools, and perform tasks faster than humans. This "country of geniuses in a data centre" could tackle complex problems collaboratively or independently, potentially driving unprecedented progress. However, such progress is tempered by practical and physical constraints, including the speed of real-world processes, data availability, intrinsic complexity, and societal or legal barriers.
These constraints, while initially bottlenecks, could diminish over time as AI innovates around them; absolute limits, like physical laws, will remain immutable. The rate at which AI overcomes these challenges will define its ultimate impact, necessitating new frameworks to evaluate its contributions relative to other production factors like labour and capital. Understanding the interplay between intelligence and these factors will be crucial for navigating this transformative era effectively.
AI is already fundamentally altering sectors like construction, healthcare, and finance through applications ranging from generative tools such as ChatGPT and Stable Diffusion to breakthroughs in biotechnology, including generative biology and DNA-modifying therapies. The AI revolution is in its irruption phase, akin to the early days of the mobile revolution, signalling transformative changes but requiring time to reach its full potential. Unlike past technological shifts, the current supercycle involves converging AI, connected ecosystems, and biotechnology, generating synergistic "flywheel" effects that redefine industries and economies.
The competitive advantage in this supercycle lies in proprietary data, enabling businesses to develop AI solutions tailored to specific needs. Additionally, decentralized protocols, such as blockchain, remain resilient despite market volatility, complementing AI-driven innovations. However, the rapid pace of technological change challenges leaders to adapt quickly. McKinsey estimates that AI could contribute $2.6–$4.4 trillion annually to the global economy, underscoring the significant economic opportunities AI presents. Leveraging foundational AI platforms is critical for incumbents, while startups must capitalize on scalable innovations.
Energy Demands and the AI Supercycle
AI's rapid adoption is reshaping global energy demands. Data centres, which consumed 4% of US electricity in 2022, are projected to see a 50% increase by 2026, driven by AI workloads and GPU requirements. Goldman Sachs estimates that AI will drive a 160% increase in data centre power demand by 2030, raising data centres' share of global power consumption from 1–2% to 3–4%. In the US alone, data centres will consume 8% of total power by 2030, necessitating $50 billion in new generation capacity and increased natural gas demand. Europe will need over $1 trillion to upgrade its grid, with data centres potentially matching the current combined consumption of Portugal, Greece, and the Netherlands. Significant investments in energy generation, storage, and transmission are necessary to meet these demands, with innovative solutions such as small modular nuclear reactors gaining traction.
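As a rough sanity check on the figures above, the arithmetic can be sketched in a few lines; the helper function and its inputs are illustrative, not a forecasting model, and use only the estimates quoted in this section.

```python
# Back-of-envelope check on the data-centre power projections cited above.
# Inputs are the article's published estimates; `project_demand` is an
# illustrative helper, not part of any cited methodology.

def project_demand(current_share_pct: float, growth_pct: float) -> float:
    """Scale a current share of power consumption by a projected growth rate."""
    return current_share_pct * (1 + growth_pct / 100)

# Goldman Sachs: ~160% growth in data-centre power demand by 2030,
# starting from a 1-2% share of global power consumption.
low = project_demand(1.0, 160)   # 2.6
high = project_demand(2.0, 160)  # 5.2

# Naive scaling yields a 2.6-5.2% share; the cited 3-4% is lower because
# total global power consumption also grows over the same period.
print(f"Projected share by 2030 (fixed total): {low:.1f}%-{high:.1f}%")
```

The gap between the naive 2.6–5.2% and the cited 3–4% is a useful reminder that share-of-consumption figures depend on the denominator growing too.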
Environmental concerns also emerge as training AI models like GPT-3 generate substantial carbon emissions. However, AI provides solutions to these challenges, optimizing renewable energy integration, enhancing grid efficiency, and enabling intelligent grid technologies. Innovations like pruning and quantization reduce AI's computational demands, while specialized hardware improves operational efficiency. Innovations around green data centres and AI-driven cooling systems are emerging, further minimizing the environmental footprint.
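To make the quantization idea concrete, here is a minimal sketch of post-training int8 quantization: mapping float32 weights to 8-bit integers cuts memory per parameter by 4x at a small accuracy cost. This is a simplified symmetric scheme for illustration, not a production quantizer.

```python
import numpy as np

# Minimal sketch of symmetric post-training quantization: float32 -> int8.
# Real systems use per-channel scales, calibration data, and outlier handling.

def quantize_int8(weights: np.ndarray):
    """Map float32 weights into [-127, 127] int8 values plus a scale factor."""
    scale = np.abs(weights).max() / 127.0  # largest weight maps to +/-127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from int8 values."""
    return q.astype(np.float32) * scale

w = np.random.randn(1024).astype(np.float32)  # toy weight vector
q, s = quantize_int8(w)
w_hat = dequantize(q, s)

print("memory:", w.nbytes, "->", q.nbytes, "bytes")  # 4096 -> 1024
print("max abs error:", float(np.abs(w - w_hat).max()))
```

The reconstruction error is bounded by half the scale factor, which is why quantization works well for the many near-average weights in large models while pruning handles the rest.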
Governments are taking steps to meet AI data centre demands. Notable US measures include the reopening of the Three Mile Island nuclear plant to generate carbon-free power, the launch of a White House Task Force on AI Data Center Infrastructure, and bipartisan legislative efforts like the Department of Energy Artificial Intelligence Act, which aim to secure energy resources, streamline permitting, and invest in clean energy solutions. The Biden Administration emphasizes sustainability through incentives and technical assistance to foster AI infrastructure growth. In Europe, key initiatives include utilizing renewable energy sources such as hydroelectric, wind, and geothermal power in countries like Norway, Sweden, and Iceland, alongside government policies like the EU's Green Deal and tax incentives that support sustainable data centre practices.
Semiconductors and AI Supercycle
Semiconductors are at the heart of the AI supercycle. AI's strategic importance lies in its role in real-time decision-making, driving global competition in data, defence, and economic development, with nations like China heavily investing to lead in this space. Semiconductors, the foundational layer of the AI technology stack, support data storage, processing, and training, transitioning the industry from cyclical growth to a more stable, long-term trajectory.
The market for AI-specific semiconductors is projected to grow by 18% annually through 2030, when they are expected to comprise 20% of total semiconductor revenue. This shift allows semiconductor companies to capture a much larger share of the AI technology stack, with revenue potential far exceeding previous technological shifts like cloud computing or smartphones. Specialized chips, such as Google's Tensor Processing Units (TPUs) and Application-Specific Integrated Circuits (ASICs), are increasingly vital for managing AI workloads efficiently. The rise of AI has created a "positive demand shock" in semiconductors, spurring advancements in high-bandwidth memory (HBM), on-chip memory (OCM), and high-speed networking to meet the demands of applications like autonomous driving and real-time decision-making.
Four key chip types are at the core of AI's demands: ASICs, CPUs, GPUs, and FPGAs. ASICs stand out for their ability to accelerate specific AI algorithms efficiently; companies like Google have optimized them for machine learning and deep learning tasks through innovations like TPUs. This specialization is driving semiconductor companies to move beyond commoditized hardware into differentiated, AI-specific products.
AI is transforming semiconductor hardware demands across computing, memory, storage, and networking. Data centres and edge applications are witnessing a surge in training and inference workloads, with ASICs poised to dominate due to their efficiency. Memory preferences are shifting toward high-bandwidth memory (HBM) and on-chip memory (OCM) for faster processing and compact designs, with industry leaders adopting HBM despite its higher costs. Storage needs are escalating, with annual growth of 25-30% driven by AI's data-intensive processes, making non-volatile memory (NVM) critical. Networking requirements emphasize high-speed, reliable solutions for AI tasks like real-time edge computing, with innovations such as programmable switches addressing bottlenecks.
AI's demand shocks—driven by the need for brute-force computation and vast data processing—are reshaping the semiconductor landscape. Companies that innovate in AI-specific hardware and adapt to shifting demands will play a pivotal role in the next technological supercycle, capitalizing on unprecedented growth and market opportunities.
To remain competitive, businesses must invest in AI-specific hardware and integrate it into their infrastructure. Success requires aligning technology with business goals, a lesson underscored by parallels between Generative AI (GenAI) and the Big Data era. Like Big Data, GenAI follows a hype cycle, with risks of disillusionment if poorly implemented. Strategic adoption, focusing on technically feasible and economically viable tasks, ensures sustainable growth while avoiding pitfalls like misaligned goals and inadequate infrastructure.
GenAI applications go beyond content generation, including advanced tasks like entity extraction, sentiment analysis, and operational automation. Even small and medium-sized enterprises can capitalize on its potential by leveraging APIs and SaaS platforms. In conclusion, integrating AI and semiconductors drives transformative opportunities in industries and economies, marking a pivotal phase in the technological supercycle.
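A minimal, hypothetical sketch of one such task, entity extraction through an LLM API: `call_llm` is a stand-in for whatever provider endpoint a team uses, stubbed here with a canned JSON response purely so the example is runnable; the prompt pattern, not the client, is the point.

```python
import json

def call_llm(prompt: str) -> str:
    """Placeholder for an HTTP call to a hosted LLM provider's API.

    Stubbed with a canned response so this sketch runs without credentials.
    """
    return '{"company": "Acme Corp", "amount": "$1.2M", "date": "2024-03-01"}'

def extract_entities(document: str) -> dict:
    """Ask the model for structured fields and parse its JSON reply."""
    prompt = (
        "Extract the company, amount, and date from the text below. "
        "Reply with JSON only.\n\n" + document
    )
    return json.loads(call_llm(prompt))

entities = extract_entities("Acme Corp closed a $1.2M round on 2024-03-01.")
print(entities["company"])  # Acme Corp
```

The same prompt-and-parse pattern covers sentiment analysis and many operational automations, which is why API and SaaS access puts these capabilities within reach of smaller enterprises.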
Data Centers and AI Supercycle
The evolution of data centres in the AI era is marked by a shift from reliance on the "Big 3" cloud providers—Microsoft Azure, Amazon AWS, and Google GCP—to the emergence of specialized GPU cloud providers like Coreweave, Lambda, and Crusoe. These newcomers focus on AI training and inference, leveraging advanced GPU technology for greater efficiency. Further, AI workloads require immense computational power, driving innovation in data centre efficiency and sustainability. Companies like Crusoe are utilizing renewable energy sources, while disruptive technologies like Fractal aim to address input/output bottlenecks, reducing costs and power usage. AI-related investments are projected to contribute over 2% to US GDP by 2032, with firms like NVIDIA playing pivotal roles in this growth. Private equity firms are also investing billions in hyper-scale data centres to meet rising AI infrastructure demands.
The new generation of data centres is rapidly growing in size, density, speed, and energy consumption. As Moore's Law, which predicts continuous improvements in semiconductor performance through shrinking transistors, slows down, integrating systems at both the chip and data center levels is the solution. This includes designing data centers as complete systems rather than just individual servers, which are also being integrated to improve efficiency. Companies like Nvidia and AMD are leading the way by offering integrated servers and data center systems to address this shift.
A key challenge in this evolution is managing the massive energy requirements. Companies like AWS are investing in renewable energy sources like solar and wind farms, while nuclear power and long-duration batteries are seen as potential long-term solutions. Data centers are moving toward liquid cooling and even immersive cooling methods to cope with the increased heat from dense servers.
In parallel, innovations in permitting and construction automation are addressing bottlenecks in scaling data centers. System-level design, exemplified by Nvidia's focus on accelerated computing, is revolutionizing how servers, networking, and memory interact to maximize performance while minimizing power use. Nvidia's "System Technology Co-Optimization" (STCO) strategy goes beyond improving individual chips, instead optimizing entire systems, including the design of GPUs and interconnects like NVLink. This approach allows for greater scaling and performance at the rack level, utilizing passive copper and liquid cooling to reduce costs and energy consumption.
The landscape is transforming into an "AI infrastructure supercycle," blending economic, technological, and environmental advancements. GPU cloud providers and innovative solutions are reshaping IT and infrastructure industries while challenging traditional dominance, ensuring efficiency and sustainability in this new era.
AI Supercycle and Global Macro
The rapid rise of GPTs and the AI supercycle has significant implications for industries, economies, and individuals. While it's difficult to predict the exact speed and scale of AI adoption, business leaders, politicians, bureaucrats, and professionals need to understand its potential impacts. AI's widespread use could significantly boost productivity, as seen in studies where AI assistance has led to notable gains in worker efficiency. This could result in increased economic growth, higher incomes, and changes in inflation dynamics. However, this transformation will play out over two to three decades, and the full benefits of AI may take time to realize, similar to the delayed productivity surge seen during the 1990s IT revolution.
Integrating AI into businesses is expected to revolutionize efficiency, profitability, and market stability while reshaping traditional roles and dynamics in finance and beyond. Companies face challenges in balancing innovation with potential disruptions, as rapid adoption of AI could create unintended consequences such as workforce dissatisfaction and operational misfires. Despite these hurdles, AI is poised to enhance processes across industries, offering cost savings, optimized supply chains, and margin expansions. Firms embracing AI will likely experience lower equity risk premiums, higher valuations, and improved credit quality due to greater predictability in earnings.
As AI democratizes, mega-cap tech and data-rich companies will dominate its revenue generation, but productivity gains will benefit various sectors, including industrials, retail, and financial services. This structural shift could lead to tighter credit spreads, reduced market volatility, and fewer opportunities for traditional traders and active managers, as efficiency-driven markets limit alpha generation. For credit investors, AI-driven operational stability aligns with their focus on risk minimization, promising a solid environment for credit investment through the decade. Enhanced corporate predictability and government intervention to mitigate systemic risks reinforce this trend, fostering an investment landscape characterized by stable growth, lower volatility, and higher valuations for efficient, AI-driven enterprises.
If AI lives up to its potential, it could profoundly affect economic variables like interest rates, inflation, and income distribution. AI's ability to enhance productivity could lead to higher growth and incomes, but it could also result in deflationary pressures as goods and especially services become cheaper, with substantial deflation likely in high-skilled professions like law, medicine, consulting, and finance. As AI takes over complex tasks, human expertise will become less valuable, causing value to shift to software and AI companies and wages in these fields to drop. AI systems, which work faster, more accurately, and cheaper than humans, will disrupt industries by making expert labour less scarce, driving down service prices and causing wages to stagnate or shrink. Central banks may need to rethink their monetary policies, and investors must adjust their strategies based on how AI influences interest rates and economic conditions. Furthermore, AI's impact on labour markets could lead to changes in income distribution, potentially exacerbating income inequality.
The deflationary shock scenario will change the business and political landscape. Companies that don't adopt AI may face extinction, as they have in every previous cycle, while AI-powered firms thrive with lower costs and faster innovation. For investors, technology, especially AI, blockchain, and renewable energy, presents opportunities, but they must be ready for extreme volatility. To prepare, professionals should upskill to work alongside AI, while investors should diversify their portfolios, focusing on sectors that embrace this shift. Entrepreneurs who harness AI to build efficient businesses will find new opportunities as AI accelerates product and service development. The next six years are crucial for adapting to this economic seismic change.
The future of work can be better understood by looking at past technological disruptions, which have significantly reshaped job structures over time. In 1880, agriculture accounted for over 40% of US jobs, but now it's less than 2%. Similarly, blue-collar jobs peaked at 40% in the mid-20th century and have since dropped to about 20%, while professional occupations have grown to represent 45% of jobs today. This evolution was driven by technologies like steam power, electricity, and computers. Looking forward, generative AI is expected to automate up to 30% of US work hours by 2030. Still, technological change is slow and often unpredictable, as seen with electricity's delayed adoption despite early inventions. Critical lessons from past disruptions suggest that adoption is as important as invention, technology complements human skills, and rather than destroying jobs, technological progress often leads to more complex roles and greater prosperity, as seen in sectors like healthcare. Understanding these patterns helps frame the impact of AI on the workplace, emphasizing the need for skills-based approaches and continuous learning in navigating future job shifts.
Integrating AI into the workforce requires varying levels of understanding, from deep expertise in developing AI solutions to essential fluency for those in technical-adjacent and general roles. Rather than competing with human workers, AI should complement our tasks by automating repetitive work, allowing us to focus on more strategic and creative responsibilities. However, successfully implementing AI involves a comprehensive organizational shift, including reskilling employees and fostering a culture of continuous learning and experimentation. While many companies are still refining their AI strategies, starting with low-risk, high-visibility projects can generate momentum and demonstrate AI's potential. The future of AI will significantly transform industries, particularly the information sector, where AI will replace many roles but also create new opportunities that require human emotional intelligence and creativity. As AI rapidly advances, it's essential to recognize its potential to surpass human performance in tasks involving creativity and intellect, making it crucial for the workforce to adapt, upskill, and embrace the upcoming AI revolution.
The challenges of the AI Supercycle are profound for policymakers and politicians. Addressing structural changes in the economy, managing economic inequality, and restructuring value creation within an economy all require a clear understanding of the stages of supercycles. Effective intervention is essential to mitigate risks and ensure stability.