<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Threading the Needle]]></title><description><![CDATA[A weekly publication about the political economy of AI.]]></description><link>https://writing.antonleicht.me</link><image><url>https://substackcdn.com/image/fetch/$s_!4SKU!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png</url><title>Threading the Needle</title><link>https://writing.antonleicht.me</link></image><generator>Substack</generator><lastBuildDate>Sun, 12 Apr 2026 17:17:38 GMT</lastBuildDate><atom:link href="https://writing.antonleicht.me/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Anton Leicht]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[antonleicht@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[antonleicht@substack.com]]></itunes:email><itunes:name><![CDATA[Anton Leicht]]></itunes:name></itunes:owner><itunes:author><![CDATA[Anton Leicht]]></itunes:author><googleplay:owner><![CDATA[antonleicht@substack.com]]></googleplay:owner><googleplay:email><![CDATA[antonleicht@substack.com]]></googleplay:email><googleplay:author><![CDATA[Anton Leicht]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Press Play To Continue]]></title><description><![CDATA[&#8216;Pausing AI&#8217; is bad policy and worse politics]]></description><link>https://writing.antonleicht.me/p/press-play-to-continue</link><guid isPermaLink="false">https://writing.antonleicht.me/p/press-play-to-continue</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Tue, 31 Mar 
2026 12:25:19 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/4b21c8b0-2c4c-4f70-802d-c4478dfa3e6e_500x373.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>The zealous wing of AI safety advocacy is riding high on a string of recent PR successes</strong></em>: demonstrations in favour of a &#8216;pause&#8217; of AI development have expanded in recent months, and U.S. lawmakers have begun engaging with the idea. Last week, this culminated in Senator Bernie Sanders and Representative Alexandria Ocasio-Cortez introducing a <a href="https://www.sanders.senate.gov/press-releases/news-sanders-ocasio-cortez-announce-ai-data-center-moratorium-act/">bill</a> formalising one version of a pause: a moratorium on the construction of data centers. These data centers are the prerequisite of frontier AI development, so a ban on their construction means a crash of the AI market and a pause of the progress it is driving. Advocates for such a move are Luddites, of course&#8212;but on the eve of profound and often scary technological transformation, <strong>many feel that now </strong><em><strong>is</strong></em><strong> the time for some measured Luddism.</strong></p><p>I believe these advocates are mistaken about the politics even if we grant their view of the risks: <strong>pauses and moratoria likely sabotage our progress on a narrow path toward beneficial and safe advanced artificial intelligence. 
</strong>And in the likely event of their political failure, they&#8217;ll leave behind a much worse environment for AI politics.</p><p>It&#8217;s worth spelling this out because clearly, a pause of some kind <em>is</em> something that some policymakers are asking for. Worse, it&#8217;s something a government <em>could</em> enact. We find ourselves in an AI paradigm that depends on the most complex value chain in history, culminating in huge data centers&#8212;shutting that chain down is costly, but not strictly speaking impossible. And so despite all caveats, pause proposals are now on the streets of San Francisco, in the U.S. Senate, and on the minds of many AI policy advocates. That makes them worth engaging with: <strong>the &#8216;pause&#8217; is on its way to becoming the canonical bad idea in AI policy.</strong></p><p>Many others have made the all-things-considered point that a pause introduces prohibitive strategic and economic costs. But to an ardent safetyist, these costs are often bearable. They see us on the path to existential catastrophe, and will pay a high price to avoid it. But I believe that, <strong>even if you are principally and perhaps exclusively concerned with reducing catastrophic risks, you should oppose the notion of a pause. 
</strong>The idea&#8217;s current uptake is not indicative of lasting political traction; its most likely implementations would be a huge safety setback; and it is lastingly making AI politics worse.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X0lb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" width="444" height="111" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:444,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h1>The Golden Path</h1><p>It&#8217;s first worth clarifying the backdrop against which this debate takes place. That backdrop is one of <strong>an AI revolution that is going better than we had any right to hope for</strong>. 
The type of AI progress we have right now is&#8212;both by the standards of AI paradigms people had predicted years ago and by the standards of past waves of technological revolution&#8212;highly democratic and in the hands of forces for good. These features are not in themselves guarantees that things are going well; they do not justify laissez-faire, and they don&#8217;t mean the work will be easy. But they should call into question the idea of dramatic, pivotal action. In particular, three features of the current environment strike me as fortunate:</p><ul><li><p>We&#8217;ve <strong>discovered the necessary technical breakthroughs about as early as we could have</strong>; they&#8217;re always just at the edge of what&#8217;s feasible on today&#8217;s compute. That means we face little to no &#8216;compute overhang&#8217;: no innovation has suddenly broken away from infrastructure constraint, and we keep facing hardware and infrastructure constraints to further capability jumps. That has so far meant a gated speed of progress, which in principle allows for iteration on risks. It also reduces the purview of AI policy to trackable, infrastructure-rich major entities: you cannot build AGI in a cave with a box of scraps.</p></li><li><p><strong>The frontier is led by multiple private companies</strong>&#8212;neither by governments nor by a single monopolist. That&#8217;s a double-edged sword, but I believe it ultimately cuts in our favour: competition for talent provides realistic checks and balances, and the need to justify market share creates incentives to deliver consumer value. And despite all shortcomings, we have seen races to avoid harm play out, incentivising responses to child safety and energy-use issues.</p></li><li><p><strong>Liberal democracies control most of the frontier AI supply chain. </strong>This, too, is extraordinary for an emerging technology in the 21st century. 
Autocracies are leading innovation or controlling the supply chain on drones, digital opium, robotics, modern missiles, and neon-bright city skylines&#8212;but most of the critical supply chain nodes for advanced AI are under the control of liberal democracies. That provides <em>people</em> with legislative leverage over the technology.</p></li></ul><p><strong>This path, however, is highly volatile.</strong> If it is to continue, the investments have to keep working out; if the providers of private or political capital get burnt once, others might pick up the torch. The path also grew out of an awareness gap: you could, in fact, see the future first in San Francisco, but now that everyone has made the trip, they won&#8217;t forget about it again. If we stop the trajectory now and some version 2.0 of the AI industry regroups in a new technical and political reality, the same favourable trends might not hold.</p><p>All of these are reasons I don&#8217;t <em>want</em> this to stop.<strong> I don&#8217;t want to reshuffle the deck&#8212;I like our hand. </strong>I say that at this point because it clarifies the stakes of dramatic intervention: if we pass policy that destabilises the current trajectory too much and resets the race, I&#8217;m not sure how the next iteration will play out. What happens when industry no longer needs governments in awareness, when the free world no longer leads autocracies in the relevant infrastructure? When the barriers to development and deployment aren&#8217;t necessarily linked to consumer preferences and actual usefulness anymore? I&#8217;d rather not find out, and so I&#8217;d much prefer to get this right by working on the margins of our current trajectory. 
If you&#8217;re in favour of rocking the boat instead, you&#8217;re putting a lot at stake.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X0lb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" width="442" height="110.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:442,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Everyone But The Center </h1><p>These days, many feel that the current paradigm is headed for disaster soon, and so they advocate decisive intervention now. And at first glance, <strong>their idea of a pause is having a moment</strong>. 
Images of one of the bigger protests to date are spreading on social media, and Sanders in particular is starting to sound like &#8216;the good timeline&#8217; in arcane AI safety posts on LessWrong from 2013: he&#8217;s invoking hardcore safetyist talking points, publicly engaging with some of the most prolific safety advocates, and indulging in what used to be called doom-posting on the floor of the U.S. Senate. The view through the mists that separate Berkeley from the world outside is: there&#8217;s something real happening, the start of a trend toward the fabled moment of policymakers &#8216;waking up&#8217;.</p><p>One specific feature of the current moment that has made pause advocates hopeful has been its bipartisan nature. This is not just coming from one AI-pilled lawmaker (though Sanders is doing his best), but is driven by a fairly <a href="https://www.theverge.com/ai-artificial-intelligence/888841/pro-human-ai-declaration-fli">heterogeneous crowd of advocates</a> worried about jobs, environmental effects, power concentration, big tech&#8217;s alignment with Democrats in the 2010s or tech CEOs&#8217; alignment with the Trump administration in 2025, and so on. While this entire coalition isn&#8217;t exactly aligned with all versions of a &#8216;pause&#8217;, they do share a common motivation to drastically intervene in the pace of AI progress in service of preventing its risks. This alignment has been visible in recent public discussion, and it has even survived the introduction of the bill by left-wing outliers of the Democratic Party, with Republican Senator Josh Hawley expressing sympathy with their concerns shortly after.
</p><p>And while many safety advocates aren&#8217;t naive about the coherence of this coalition, they do feel they&#8217;ve found a tiger they can ride to legislative success.</p><p>Yet <strong>there&#8217;s a difference between the anti-AI horseshoe bipartisanship employed by the pause movement and the moderate bipartisanship </strong>that usually leads to successful policy. Not many ideas that started on the radical flanks of both parties have seen their time come. All past areas of overlap between characters like Sanders and Hawley&#8212;from interest rate caps on credit cards to helicopter money in an inflationary environment&#8212;have seen little electoral or legislative success. Clown cars rarely fit 60 U.S. Senators.</p><p>By now, most of Congress knows this, too. A policy platform that brings together the populist left and the populist right is very difficult to sell to the moderate lawmakers who can stall and block federal legislation. Even those sympathetic to the idea that their party&#8217;s respective flank sometimes comes up with interesting new ideas will be particularly apprehensive of the horseshoe alignment. &#8216;Yes, it&#8217;s a Bernie idea, but it&#8217;s one of the good ones&#8212;see, <em>Steve Bannon&#8217;s <a href="https://time.com/7377579/ai-data-centers-people-movement-cover/">go-to guy</a> for fire and brimstone</em> endorses it too!&#8217; can&#8217;t be a great pitch to the majority of Congress. The congressional origin of this policy idea is doubly counterproductive: it&#8217;s bad PR that leaves supporters at risk of being branded as radicals, and it&#8217;s read as evidence of a populist lack of sophistication.</p><p>In fact, that coalitional structure may well have the opposite effect: lawmakers looking to carve out their own position are searching for ways to make tangible the old adage of &#8216;maximising the benefits while minimising the risks&#8217;. 
There will be no cheaper signal than to disagree with the datacenter moratorium idea specifically to clarify why your own proposal is not anti-growth or anti-tech. &#8216;Condemning the pause&#8217; could easily become the legible way to put distance between yourself and the maximally luddite position to balance powerful donor interests and popular appeal. To a pause advocate, that can&#8217;t possibly have been the point.</p><p>Of course ideas that start on the fringes usually have a different purpose: they&#8217;re a way to put a concern into policy language, start a conversation, expand the window of discussion for moderate voices to propose policy that&#8217;s better fit to similar ends. We&#8217;ll return to the pause movement as a political Overton window-spreader shortly; <strong>but if we judge it as a policy movement, it seems unlikely to succeed.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X0lb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" width="442" height="110.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:442,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Against &#8216;Directionally Good&#8217; Pauses</h1><p>Though things aren&#8217;t quite black and white. It&#8217;s now of course somewhat more likely than before that a very specific version of pause policy garners meaningful support: one advanced by a momentary anti-AI coalition. It is that version&#8212;not merely the whitepaper version&#8212;that we must scrutinise when discussing this new movement.</p><p>Their version of a pause would be<strong> the version that satisfies the horseshoe I described above</strong>. The success story, if it ever could be written, would be that of a political syzygy: groups all over the political spectrum aligning into one broad anti-AI omnicause and settling on something like a pause as the lowest common denominator of their policy interests. We know this coalition from past fights around AI preemption already&#8212;but with a crucial difference: on preemption the coalition worked because <em>stopping</em> legislation is easy to rally around. Here, it&#8217;s supposed to be leveraged in <em>favour</em> of something substantive, which invites much more complexity. Such alignment would in my view be the only way that a pause proposal gets close to the votes it needs in the current political environment&#8212;it&#8217;s the only way around the political forces I&#8217;ve described in the previous section. It&#8217;s unlikely to assemble fully, but stranger things have happened in the aftermath of the kinds of upheaval that AI might soon cause.</p><p>The version of a pause that would result from this coalition seems particularly bad, even by the standards of a pause. 
This is for both structural and political reasons.</p><p>Structurally, when the details get ironed out,<strong> safety-motivated pause advocates will not be the most powerful in the room</strong>. This is coming together at a rare moment of alignment among many interests&#8212;anxieties about jobs, wealth concentration, humanity, the environment, and existential risks&#8212;and it will need to tap into all of them to get through. Basically all of these interests have bigger lobbies and bigger constituencies than catastrophic risks, so when there are trade-offs, they&#8217;ll cut against the ability of safety advocates to implement their version of the details. You&#8217;ll get a pause, but perhaps not the export ban; you&#8217;ll get your deployment frictions, but perhaps not the restrictions on internal development; and so on.</p><p>Politically, this is still a domestic conversation. The <strong>incentives of many policymakers driving this are to make national policy </strong><em><strong>at best</strong></em>. Cynically, you might think many mainstream political actors are only engaging to introduce bills and brand themselves as thought leaders&#8212;but let&#8217;s assume they&#8217;re in it to pass some legislation at least. For any political operator, this would have to happen on a short timeframe&#8212;ideally ahead of the presidential primaries. If you get to a point where the domestic moratorium language is done and has a majority ready to go, are you <em>really</em> going to stop because some of the arcane details are not in place? Or are you going to take the win, campaign on the achievement, and write into the bill a pinky promise to take care of the hard questions later? 
If you look at the legislative record of election-year policymaking, especially by the characters involved, I think you&#8217;ll know the answer.</p><div><hr></div><h4><em><strong>Second Best Is Worst</strong></em></h4><p>Now, of course, this dynamic is not exclusive to AI policy; and oftentimes a near-win is already good progress, so riding the tiger is worth it anyway. Perhaps we shouldn&#8217;t let the perfect be the enemy of the good? Not so in this case. <strong>The pause proposal hinges on its most complicated and least politically feasible element: an enforceable international treaty. </strong>Any suboptimal version therefore likely backfires; but it&#8217;s the suboptimal version, not the whitepaper, that&#8217;s gaining support.</p><p>The logic is simple, and acknowledged by pause advocates: if only the U.S. introduces pausing policy, the compute, capital, and talent will eventually regroup elsewhere and restart the same progress&#8212;just under different stewardship, outside of democratic jurisdiction, and having learned the lesson of being paused by the backlash. This strips democratic activism of most feasible levers and upends the favourable paradigm I&#8217;ve described before. To get around that problem, pause advocates <a href="https://www.astralcodexten.com/p/every-debate-on-pausing-ai?hide_intro_popup=true">usually</a> take an international view: domestic pauses would have to be aligned and agreed upon through an international treaty.</p><p><strong>However, that is an immensely complicated task. </strong>The safety literature has brought forward some <em>technically</em> <a href="https://techgov.intelligence.org/research/an-international-agreement-to-prevent-the-premature-creation-of-artificial-superintelligence">sound ideas</a> for how to approach it, but there&#8217;s very little by way of political strategy to achieve it. 
As someone who spends much of my time on international AI policy, I&#8217;ll say that whatever political progress you make in America is not enough to get to such a treaty quickly. The Sanders-AOC proposal handles this unilaterally through restrictive export controls on chips&#8212;a solution that provides no answer to the obvious issues of supply chains migrating over time, of extant compute being consolidated elsewhere, and so on. As <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Dean W. Ball&quot;,&quot;id&quot;:5925551,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!mLaj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F49371abf-2579-47be-8114-3e0ca580af8b_1024x1024.png&quot;,&quot;uuid&quot;:&quot;3e44b38e-a2a9-48af-a5cb-c21981f61ffc&quot;}" data-component-name="MentionToDOM">Dean W. Ball</span> has pointed out in his many attempts to <a href="https://x.com/deanwball/status/2037238417730228341">engage</a> with pause advocates on the substance, a unilateral pause has unconscionable consequences for the rights and liberties of American citizens.</p><p>And so <strong>a treaty or even an international organisation of some kind would have to be the vehicle of choice.</strong> I believe this, too, could not be brought about unilaterally: in the current setting, the U.S. lacks the soft power over allied democracies and the hard leverage over China to force through the fast creation of any international institution. This is doubly true because a single miss&#8212;one nation you cannot induct&#8212;suffices to derail the policy: one sovereign country decides that it doesn&#8217;t like the pause idea, and it becomes the lead candidate for a future compute haven and AGI development hub. 
And there&#8217;s plenty of economic and political incentive not to participate: economic, to attract all the AI investment; and political, to position against an America that has grown wildly unpopular with many electorates around the world. </p><p>America, with its diplomatic reach into increasingly confident middle powers growing thin, seems unlikely to stamp out each and every defection&#8212;not by trusted advice and not by threatened aggression. The way to get this done, then, would be <strong>slow progress in aligning an enormously heterogeneous set of international players</strong> with very different views on AI, America, and what to do about either. The world as it is remains far from aligned on this, and I&#8217;ve seen no serious suggestion for furthering this alignment. </p><p>It&#8217;s not outright impossible to get around these problems; it has been done before, though under circumstances of a much more united world. But recall that for this to fall apart, the treaty doesn&#8217;t have to be outright impossible. It just has to be hard enough for the pro-pause coalition to take the easy win and delay the hard part under political and electoral urgency. Given the enormous complexity and, more importantly, the timing gap between passing a domestic moratorium and negotiating an international treaty, <strong>I believe it&#8217;s very likely we get the second-best version first</strong>&#8212;inverting the purported gains and jeopardising safety and progress alike.</p><p>I think by far the most likely version to come out of the political moment that pause advocates are seeking to exploit is a bill that speedruns the elements satisfying the lowest common denominators&#8212;spiteful anti-tech measures that make for good rhetoric and allow the chosen champions to enter the primaries as the ones who took down big tech. 
Expecting the political forces involved to delay past critical political timings to reflect catastrophic-risk-motivated treaty nuances strikes me as outright naive.</p><p>That should, in my view, be the biggest reason even for hardcore safety advocates to be skeptical of summoning these spirits and pointing them at the pause:<strong> you don&#8217;t control what comes of any of it, and even the best pause ideas are too close to bad pause ideas.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X0lb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Jumping Out Of The Overton Window</h1><p><strong>Point these things out to an ardent pause advocate, and the response 
usually retreats to gesturing at broader political dynamics</strong>: perhaps the current, ill-fated versions of the pause idea lay the groundwork for better policy and politics in the future? This might be your response to the previous section: you could think I&#8217;m misguided in assuming a familiar political environment, and actually you&#8217;d want to see these pause proposals enacted after a big &#8216;warning shot&#8217; or near miss on AI risk of some kind. Or you&#8217;re in fact counting on the pause discourse moving the Overton window&#8212;though while that&#8217;s a reasonable defense for coalition-building, it does not justify the selection of what seems to be an actively harmful policy vehicle to do so.</p><p>I&#8217;ve written about this <a href="https://writing.antonleicht.me/p/dont-build-an-ai-safety-movement">before</a>, in a piece that touched upon the merits and flaws of building a popular AI safety movement more broadly. In this case specifically, two arguments apply.</p><p>First, I believe the logic of the &#8216;radical flank&#8217; boosting associated-but-not-allied moderates does not apply here: <strong>there </strong><em><strong>are</strong></em><strong> no friendly moderates that correspond to this particular radical flank. </strong>The radical flank effect requires a moderate wing that shares the movement&#8217;s broad goals but advocates softer means. The radical makes the moderate look reasonable by contrast. 
But this coalition&#8217;s horseshoe structure means there is no natural moderate counterpart waiting to benefit: Democratic moderates like Warner and Fetterman are not proposing a gentler version of the pause but actively repudiating the entire premise, branding it &#8220;<a href="https://www.axios.com/2026/03/25/warner-ai-data-center-moratorium-aoc-idiocy">idiocy</a>&#8221; and &#8220;<a href="https://www.foxnews.com/politics/fetterman-slams-ai-data-center-moratorium-proposal-china-first">China First</a>&#8221; within hours of introduction. And moderates on similar beats are not touching the Sanders language at all&#8212;other than using the term AI, Senator Slotkin&#8217;s recent <a href="https://www.slotkin.senate.gov/2026/03/17/slotkin-legislation-puts-common-sense-guardrails-on-dod-ai-use-around-lethal-force-spying-on-americans-and-nuclear-weapons/">proposal</a> does not even give the impression of being about the same set of issues and concerns. No one trades on the moderate version of the flank&#8217;s premise, and no one&#8217;s swooping in through the window it opens.</p><p>Second, the radical flank effect works out by introducing a pared-down version into moderate awareness&#8212;pared down either by introducing softer means, or by only agreeing on some of the ends. In climate policy, that&#8217;s a fairly robust play: as a climate activist, you&#8217;d appreciate a moderate passing whatever instrument to reduce carbon emissions, and you&#8217;d appreciate moderates accepting most versions of your problem statement. Not so in AI: if much gets lost in translation between the flank and the moderates, you end up with a lot of bad ideas. 
</p><p>As discussed above, the <strong>second-best means are likely to radically backfire</strong> for safety advocates; and a random sampling of the motivations expressed by the anti-AI omnicause would likely include some jobs doomerism, some pedestrian anti-tech sentiment, and none of the concerns pause advocates consider exceptionally important. In fact, that latter trend seems fairly likely: if I&#8217;m a moderate borrowing from my radical flank, I&#8217;d much rather adopt the far more salient jobs rhetoric and leave the fringe-y catastrophic risk concerns aside than vice versa.</p><p>To be fair, something like this was always going to happen. No good policy happens without adjacent bad ideas to moderate between, and you always need overreaching solutions to contrast effective interventions against. This is especially true because past congressional debate has mostly pitted fairly moderate safety positions against straightforwardly nihilistic applications of anti-regulatory sentiment. But the play for a discursive shift would be a lot more convincing if it made the radical position a little bit more sound, a little bit less salient, and further removed from the association with the political fringes. 
If we are to judge this as a play for political communications rather than policy strategy, I think it&#8217;s likely to backfire.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X0lb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" width="442" height="110.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:442,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>The Hard Answer</h1><p><strong>I suspect a true believer in the maximalist safety position might still reject all these arguments</strong>, suggesting that any movement is better than heading for doom by default. I don&#8217;t think that&#8217;s our trajectory, but I do concede that we&#8217;ll need to come up with some good policy to handle the risks. 
That said, I don&#8217;t think that means I need a fleshed-out agenda in response. In fact, one of my main points of disagreement with the pause advocates is that I don&#8217;t think the seriousness of the challenge means that we need action to be pivotal; and that, a bit more broadly, I do not believe we know enough about the future contours of this technology to make a determination. </p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;13f288cd-79f4-40aa-a8ba-69887e92555d&quot;,&quot;caption&quot;:&quot;British Far-East command had the Japanese threat under lock: Fortresses with heavy artillery were ready to dispel any naval assault from the south. When the Japanese went with bicycles instead of ships and coordinated a major attack from the north, Singapore fell, in what Churchill has described as Britain&#8217;s &#8216;worst disaster&#8217;.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;A Moving Target&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:113003310,&quot;name&quot;:&quot;Anton Leicht&quot;,&quot;bio&quot;:&quot;I write about the political economy of advanced 
AI.&quot;,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!FPyB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F75422da7-aafa-42ab-8fa6-cf4f0df85cf0_3166x3166.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-07-01T13:34:29.005Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!WFlu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1319b9ff-8659-4e87-b3c4-e65ea848cec9_1296x641.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/a-moving-target&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:167261665,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:14,&quot;comment_count&quot;:6,&quot;publication_id&quot;:3834218,&quot;publication_name&quot;:&quot;Threading the Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p>From where I stand today, though, my favoured bet is this alternative plan, taken in three steps:</p><ul><li><p>First, we pluck the low-hanging procedural fruit. On the state, federal and international level, we introduce overlapping <a href="https://www.gov.ca.gov/2025/09/29/governor-newsom-signs-sb-53-advancing-californias-world-leading-artificial-intelligence-industry/">provisions</a> to ensure transparency, safety reporting, industry coordination, whistleblower protections. 
Simultaneously, we ramp up state capacity to engage with this information (this is going pretty well).</p></li><li><p>Second, building on what we learn from that, we start holding the ecosystem to its promises by incubating a functional and tightly-overseen market for independent third party <a href="https://www.averi.org/ourwork/frontier-ai-auditing">assessment</a> (we&#8217;re getting started on this, and I feel much better about this than a year ago).</p></li><li><p>Third, building on what this ecosystem identifies as shortcomings of an effectively-audited frontier development space, we determine the kinds of surgical policy interventions that fix the safety-relevant market failures.</p></li></ul><p>I believe this plan could work, that we can deploy it with decent political robustness within months to years, and that we&#8217;re just a little bit behind the curve on realising it at the appropriate level of technocratic rigour and political salience. I also believe that <strong>the greatest threats to this plan are hasty disruptions to the politics of AI</strong> that drag good policy work into the crossfire and force it to justify itself not on technical merits, but on the twisted standards of an American presidential primary. That said, I&#8217;m under no illusions here: getting this right will still be hard, and it&#8217;s harder still in the face of unproductive political spending on AI policy matters. But in general, I have faith that there is progress to be made within the current political and technical constraints. Enough, to my mind, not to upset the gameboard and start anew far away from the lucky trajectory we find ourselves on.</p><p>Until then, the point of today&#8217;s piece is simple: movement toward a pause puts too much at stake, all while likely achieving less than nothing. 
Whether for safety or for progress, you should resist it.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Can You Poach A Frontier Lab?]]></title><description><![CDATA[A realistic roadmap for U.S. allies after the Anthropic crisis]]></description><link>https://writing.antonleicht.me/p/can-you-poach-a-frontier-lab</link><guid isPermaLink="false">https://writing.antonleicht.me/p/can-you-poach-a-frontier-lab</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Tue, 03 Mar 2026 13:13:05 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/4a8cda8e-f974-4539-9985-f072fdae5ef6_1260x825.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>Last week, the U.S. government threw into chaos the delicate public-private balance that has until now governed the rise of advanced artificial intelligence.</strong></em> Following an escalating conflict between the Pentagon and AI developer Anthropic, Secretary Hegseth announced that he&#8217;d declare Anthropic a supply chain risk, barring them from contracts with the Pentagon or, critically, other defense contractors. The implications are dire: the designation hits Anthropic&#8217;s business at a time when they&#8217;re gearing up for an IPO built around a story about business deals with governments and their contractors. And as they&#8217;re pursuing volatile infrastructure contracts and racing their chief rival OpenAI for the IPO timing, even the fact that the courts might kill the move provides only faint relief. 
That means the decision goes far beyond the sensible off-ramp of just cancelling federal contracts that had seemingly been endorsed by President Trump shortly before the announcement: the U.S. government is, in effect, trying to derail if not destroy one of its foremost technology companies.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>This episode has made observers doubt <strong>whether the U.S. government in its current shape is a suitable hegemon over the next years of rapid AI progress. </strong>And while I maintain that the American project is the best realistic bet we have, recent days again underline that we&#8217;re currently not dealing with nearly the best version of that project. That invites an international angle: is there some way that the rest of the world can simultaneously pick up the slack and gain some leverage along the way? Can liberal democracies elsewhere offer more favourable conditions to concerned frontier developers, creating a hedge for the labs and a counterweight to the U.S. at the same time? 
I think yes&#8212;and though it&#8217;s not quite easy, the current window would allow taking concrete policy action in the coming weeks.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X0lb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h3><strong>Short-Term Follies</strong></h3><p>It brings me no pleasure to begin this essay by throwing some cold water on the worst version of a decent idea. 
If you&#8217;ve looked at somewhat clued-in Europeans&#8217; social media recently&#8212;and increasingly, higher-level chatter&#8212;you&#8217;ll have noticed a story: poor Anthropic, discarded by USG and temperamentally aligned with the more safety-minded Canadian-European axis, might just up and leave. In other words, people talk about actually moving Anthropic all the way to Europe as an immediate capitalisation on the current window. <strong>No, you cannot outright poach a frontier lab tomorrow. </strong>I think the idea itself betrays both a misunderstanding of what a frontier developer is and a counterproductive communications instinct.</p><p>It&#8217;s worth understanding what makes a frontier developer competitive in the next year or two. The relocation discussion mostly treats a frontier developer as a software company: a bunch of smart engineers in a lab, and if you can make them prefer spring in Paris over amorphous fog season in San Francisco and the Tuileries over a park on top of a bus station, you can move Anthropic to Europe. This vibes-based model of poaching Anthropic runs up against two structural constraints: compute and capital markets.</p><p><strong>First, frontier developers are fundamentally fairly thin wrappers around available compute: </strong>compute already online, and in particular the gigawatt-scale clusters coming online this year and next. And since Anthropic wants to remain competitive with its U.S. rivals, any poaching bid would have to guarantee a similar timeline to deploying comparable levels of computational resources. There is no consortium of Western democracies that can offer remotely comparable compute. Anthropic alone has around a million Trainium2 chips coming online through AWS&#8217;s Rainier megaproject&#8212;a 2.2 gigawatt campus&#8212;with further capacity through Google Cloud and a separate $50 billion datacenter deal. 
The entire European public AI compute estate, across all of EuroHPC, amounts to roughly 57,000 accelerators. Future plans are late: FluidStack&#8217;s 1 GW French supercomputer, the most ambitious European project, remains at the MoU stage; the EU&#8217;s AI Gigafactories are a policy concept with a call for proposals scheduled this year and operations no earlier than 2028. Even if you redeployed all of it, you&#8217;d arrive years too late and orders of magnitude short. And even if you rallied all political support to start construction on proprietary clusters to attract Anthropic tomorrow, any realistic timeline would still see them online far later than American projects already under construction.</p><p>Could Anthropic retain access to its U.S. compute base even if it decided to pursue the kind of visible departure that Europeans are entertaining? I strongly doubt it. The Trump administration definitely has the toolkit to control a now-foreign firm&#8217;s access to U.S.-based compute if it decided to do so: both remote access and export of the chips themselves could quite easily be export controlled; the attempts of a U.S. subsidiary of a now-European Anthropic to access AWS compute in America could quickly be curtailed, and so on. Before Anthropic even considered any such defection, the Pentagon already did not shrink from punitive treatment; I think it&#8217;s likely they would escalate at the attempt. 
There&#8217;d be a sort of bitter irony in the export control hawks at Anthropic getting the short end of export enforcement, while Chinese remote access remains live&#8212;but the same sort of irony applies to the supply chain risk designation, and that didn&#8217;t stop Secretary Hegseth, either.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!AunU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AunU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 424w, https://substackcdn.com/image/fetch/$s_!AunU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 848w, https://substackcdn.com/image/fetch/$s_!AunU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 1272w, https://substackcdn.com/image/fetch/$s_!AunU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!AunU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png" width="1456" height="722" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:722,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!AunU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 424w, https://substackcdn.com/image/fetch/$s_!AunU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 848w, https://substackcdn.com/image/fetch/$s_!AunU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 1272w, https://substackcdn.com/image/fetch/$s_!AunU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27859ba8-10eb-4b47-ae53-87e33177d769_1600x793.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"> Gigawatt-scale datacenters, some of which earmarked for Anthropic, are already well under construction in America. It&#8217;s too late. (<a href="https://epoch.ai/data/data-centers?view=table">Epoch AI</a>)</figcaption></figure></div><p><strong>Second, frontier developers&#8217; ambitious spending requires access to extraordinary capital markets. </strong>Next to the availability of deeper U.S. financial markets, the particularly important piece of the puzzle here is IPO timing: it would be quite valuable for Anthropic to be the earliest IPO among the big AI companies, to attract big institutional investors and the first wave of zealots in search of direct exposure of their portfolios to AI growth. Anthropic wants this IPO story, and it ideally wants it first. Relocating means starting the IPO story from scratch &#8212; reassuring underwriters and investors that the move doesn&#8217;t dent projections &#8212; while also moving to a stock exchange with far less favourable conditions and available capital. 
OpenAI would be first, and the Anthropic IPO would dramatically underperform expectations. In short, if you have a shot at being the first AI stock to IPO at the NYSE, you&#8217;re not going to swap that for a lukewarm listing in late 2027 at the LSE.</p><p>On these structural grounds, Anthropic is deeply committed to&#8212;dependent on, in fact&#8212;processes unfolding in America right now. Anthropic is also dispositionally unlikely to bear large temporary costs to their competitiveness at the present moment: they believe now is crunch time, that the race is heating up and they might even have a leg up heading into recursive self-improvement. For a company this certain that now is the time, defecting from the most attractive capital market and established compute pathways will simply not be an option. In other words: if what they say is right and this is the beginning of the endgame for AI policy, then middle powers are down a rook and should cease attempts to simplify. </p><h4><em><strong>An Offer They Will Refuse</strong></em></h4><p>You might understand this, and still believe it&#8217;s worthwhile for other democracies to reach out and attempt to poach anyway: to put an offer on the table, to increase Anthropic&#8217;s leverage, to provide a forcing function for intra-democratic coordination. I believe that would be mistaken: making this question politically salient would be bad for Anthropic and ultimately bad for any progress toward diversification short of a wholesale transplant.</p><p>Advocates of this persuasion are misreading Anthropic&#8217;s position. Since leaving is not a plausible path &#8212; and the administration knows it &#8212; the &#8216;offer&#8217; provides Anthropic with no additional leverage at all. In its current PR posture, Anthropic will have to turn down the offer visibly and credibly if this becomes a big story, lest they invite more vicious retaliation from the administration for very little upside. 
This bears repeating: if you are one of my friends on the middle powers beat and are more enamoured with this idea than I am, <em>please</em> don&#8217;t go the way of big public statements, open letters, or policymaker communications. Giving this idea public salience sharpens the political conflict in America, helps the administration paint Anthropic as disloyal and anti-American, and ultimately compels Anthropic to clearly reject the idea before there is time to approach it stepwise and with subtlety.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X0lb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X0lb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X0lb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F634738bc-decb-4b87-8e17-0080c2137b76_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h3><strong>Long-Term Prospects</strong></h3><p>Stepwise and subtle, however, is a possible way to do this: understand the 
project of &#8216;poaching&#8217; a frontier lab not as an attempt to extract value from the U.S., but as an effort to diversify the Western stack and make it more resilient to transient political trends and disruptions. My broader claim here is simple: <strong>it would be good for the world if a sizeable minority of American developers&#8217; compute, business activity, and government cooperation were located in allied democracies. </strong>That could be about Anthropic, but I&#8217;d be just as happy with OpenAI or Google DeepMind. In a pinch, I might even take Meta. That outcome is eminently reachable and obviously beneficial in the aftermath of the Anthropic/Pentagon saga&#8212;and it&#8217;s never been clearer to the frontier developers that some hedging might be in their very best interest.</p><p>Why is it obviously beneficial? The general endgame theory here is that the worst version of the U.S. is not an optimal steward of AGI. And while I still think the American project is our best bet to get all this right, I do think there&#8217;s a big difference between a good and a bad version of America winning. Beyond U.S. domestic engagement, I think sound reasoning on how to affect that delta starts with the question of how U.S. allies can elicit that best version. That AI will not lead to European supremacy does not mean the liberal democracies of the world have no contribution to make. Western democracies have institutional stability, regulatory predictability, and supply chain security to contribute to an American political system that is increasingly fraying at the edges. And at the very least, the Western allies can build up to be a viable alternative&#8212;a backup place for western-aligned organisations to go; a credible threat if the U.S. overreaches. 
Like my European friends, I want Europe to succeed&#8212;and with an increasing number of American friends, I share the view that the most successful American project takes place in a world that succeeds, too.</p><p>I&#8217;ve mostly argued that we can reach this place through an expansion of the upstream and downstream leverage held by middle powers, thereby cutting them in on AI-driven growth and providing them some independent strategic foothold while still fundamentally allowing for deep integration into the American stack. But a marginal increase in U.S. developers&#8217; footprints in middle powers is similarly helpful to national prospects and international balance, along three lines.</p><h4><em><strong>Middle Power Upsides</strong></em></h4><p><strong>First, if your country hosts frontier developers&#8217; infrastructure, you hold global leverage over the flow of artificial intelligence. </strong>That gives you <em>some</em> ability to make credible threats, and conversely makes you somewhat less susceptible to that kind of threat against you: no one can directly cut you off from the compute you need to run at least the models you have access to, and sometimes you can even cut off others&#8217; access. The important thing to note for that latter part is that a large coordinated share of global inference compute gives you global leverage even if you&#8217;ve mostly been using that inference yourself: if you have the ability to take 10% of the world&#8217;s inference compute offline, the shock to global supply can spike prices and inflict real pain on others as well. Compare this to OPEC countries: their single-digit shares of global oil supply have frequently rendered them less susceptible to superpowers&#8217; plays and more capable of negotiating their own fate.</p><p>What does this compute-focused notion of leverage have to do with frontier labs? 
First, cooperation with frontier developers is among the easiest drivers of major compute build-outs in most countries: developers are exceedingly hungry for compute, willing to enter all kinds of deals to increase future supply, and generally able, in partnerships, to mobilise massive amounts of capital. Compared to the idea of having local European firms or even governments build out the compute, simply <strong>making it attractive for the bullish and rich Americans to build in your country seems far easier. </strong>What&#8217;s more, your compute is much more valuable if it&#8217;s being used for frontier AI than if you&#8217;re using it to build the sixth-best open source model east of Lisbon. No global market really cares if the French government turns off its own datacenters, but if we&#8217;re talking about Anthropic&#8217;s inference stock, things change. You might wonder why a developer would accept this if the turning-off scenario is in the cards, but again: it&#8217;s in the cards no matter where your data center is, and the question is whether you want all your eggs in the basket that just threatened to designate you a supply chain risk.</p><p><strong>Second, countries with a substantial frontier developer presence gain the ability to enforce contracts for frontier AI access.</strong> I&#8217;ve argued in some more detail in the past that importing frontier AI is imperative for middle powers due to the objective superiority of U.S.-built AI, but that access is unreliable due to securitisation prospects, economic pressures and competitive margins. The question then is how to forge contracts for access to American AI that last. The answer is that you need leverage, concretely over the company that might otherwise be compelled to breach its contract for economic or political reasons. 
If part of the lab is in your country, it&#8217;s much easier to ensure that access, because you can at least make sure the resources physically located in your country aren&#8217;t used for anything else: fine if you break your contract and don&#8217;t service the latest version of your latest model, but in that case your datacenter isn&#8217;t running anymore. Very similar logic applies to talent, corporate listings and intellectual property. Middle powers want stable relationships, and the more assets they physically hold within their borders, the more stable these relationships are, because it&#8217;s costlier for the labs to renege. That is valuable, not just because of the access itself, but because it allows countries not to run costly resilience strategies&#8211;like building their own models&#8211;to hedge against being cut off.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!YEET!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!YEET!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 424w, https://substackcdn.com/image/fetch/$s_!YEET!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 848w, https://substackcdn.com/image/fetch/$s_!YEET!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 1272w, 
https://substackcdn.com/image/fetch/$s_!YEET!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!YEET!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png" width="1456" height="513" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:513,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:200434,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/189747966?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!YEET!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 424w, https://substackcdn.com/image/fetch/$s_!YEET!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 848w, 
https://substackcdn.com/image/fetch/$s_!YEET!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 1272w, https://substackcdn.com/image/fetch/$s_!YEET!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8c5c9c8-f6e0-457f-b31d-a5722b830726_1634x576.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">A dire fiscal situation. 
(<a href="http://fai.org/posts/the-race-worth-winning-middle-powers-in-the-age-of-machine-intelligence">FAI</a> / OECD)</figcaption></figure></div><p><strong>Third, countries with frontier developer presence have a pathway to fiscal stability. </strong>Most non-US countries aren&#8217;t fiscally equipped for AI-driven labour market disruptions, in part because these disruptions would simultaneously present a policy challenge and also lead to a cratering of income tax revenue. Whereas the U.S. will recoup these losses through corporate taxes levied on the AI agents that displace the human workers, many other countries won&#8217;t be so lucky. Locating even some amount of frontier AI activity within the confines of your tax authority is therefore a very attractive fiscal pathway for a middle power vulnerable to this sort of disruption. That is admittedly not trivial&#8212;U.S. corporate tax is low, so you&#8217;d have to offer the frontier developers sizable tax breaks for a move in tax authority to ever be viable, and so the revenue wouldn&#8217;t be that impressive. But still: if a major share of labor income moves to corporate income in the near future, even securing a small slice of the corporate tax for a company like Anthropic is a <em>huge</em> deal for fiscal stability of any middle power bloc, or in fact a broader consortium of such powers.</p><p>Solving these three problems doesn&#8217;t solve many deeper questions of middle power strategy, of course. They&#8217;d still remain susceptible to U.S. leverage, and still not be able to unilaterally exert their will on the trajectory of frontier AI. 
But leverage is not an all-or-nothing question: every bit of friction and pushback helps, and frontier developer presence helps in the particularly important areas outlined above.</p><h3><strong>Ways &amp; Means</strong></h3><p>Even though Western allies are currently not equipped to transplant Anthropic, or any developer, to their borders entirely, the situation in America still enables stepwise progress toward that goal. How? 
Quite fundamentally, the pathway to getting frontier developers to expand their footprint is to ask them what they want. In that sense, the first step is to assemble a coalition of aligned democracies that share the analysis I just outlined, take stock of the strengths they have and could pool, and then approach the AI developers and simply ask them under what conditions they&#8217;d consider dramatically increasing their presence and footprint.</p><p>That said, I think there are measures that make sense to flag early, and that would have to be part of an actually compelling expansion offer. They generally fall into three categories: specific project-focused state-owned action, general changes to the regulatory backdrop, and efforts at international coordination.</p><h4><em><strong>State-owned Action</strong></em></h4><ul><li><p><strong>Keystone developer-government contracts</strong> for the use of frontier systems in public administration and national security. Government contracts are (usually) sticky and long-term, an important business anchor as well as a token of deepening relationships between developers and host countries. Their funding and modalities are in the direct purview of governments, and they can decide to hand them out tomorrow. In the aftermath of the current episode, allied governments could approach Anthropic to use their systems in security-relevant applications &#8212; offering exactly the kind of partnership that the Pentagon just reneged on.</p></li><li><p><strong>Preferential access to bottlenecks.</strong> I&#8217;ve written before about the positions many middle powers hold on bottlenecks for advanced intelligence: the manufacturing required to turn tokens into real-world value, and the semiconductor supply chain that provides the substrate for AI. 
Short of aggressively leveraging these bottlenecks, middle powers could use them to steer capacity toward frontier developers they cooperate with: preferential and custom equipment access for ASICs&#8212;application-specific chips&#8212;for their best friends, preferential integration of developers&#8217; models into AI-empowered manufacturing, and so on. They can offer an accelerated timeline toward vertical integration that allows quicker iteration and perhaps even recursive self-improvement across the stack.</p></li><li><p><strong>Capital</strong> <strong>mobilisation</strong> to offset the difference in private capital markets between the U.S. and other countries. In particular, two pathways to this capital seem promising: closely involving large legacy firms in middle powers for investments and integrated deals; and leveraging sovereign wealth and pension funds as anchor investments. Funds and legacy corporations are huge untapped sources of capital that are currently routinely underperforming capital markets precisely because they are not exposed to technology growth&#8212;rallying them around this plan is risky, but can solve two problems at once.</p></li></ul><h4><em><strong>Regulatory Conditions</strong></em></h4><ul><li><p><strong>Copyright &amp; data protection carveouts</strong>: frontier developers that increase their footprint should be allowed to train at least as liberally as they would in America. That would ensure that developers at least have the choice to create some share of their suite of leading AI systems outside of American jurisdiction. Otherwise, even substantial nominal presence would still concentrate the most critical intangibles in the U.S., and little would be gained. The main failure mode to avoid is that developers have a large footprint in other powers, but their important business activities still exclusively take place in America. 
I&#8217;m sure frontier developers have a clear sense of the carveouts necessary to prevent that scenario, which should at least be a strong starting point for scrutiny.</p></li><li><p><strong>Preconditions for enough compute</strong>: downstream of all these questions lies the actual compute question. If you get the capital and integration questions right, developer presence is likely to translate into compute footprint, provided the regulatory and energy requirements for a buildout are met. That requires enabling the construction of proprietary energy infrastructure, including both behind-the-meter gas turbines and nuclear SMRs, as well as permissive licensing that allows for quick construction of datacenters. That, to my mind, is the way to think about compute: draw developers&#8217; interest and then get out of their way, not straight-up build it yourself. The alternative is speculative investments with unclear buyers, and so compute is not a first-order policy priority in itself.</p></li><li><p><strong>Attractive corporate tax conditions</strong>&#8212;competing with U.S. rates, on the understanding that even a discounted share of what could become the dominant revenue base is transformative for middle power fiscal health.</p></li></ul><h4><em><strong>International Coordination</strong></em></h4><ul><li><p><strong>Consortium-building and burdensharing</strong>. It&#8217;s not quite clear who exactly would make this offer: the EU alone lacks the capital, talent concentration, and infrastructure, but some of the Five Eyes or Pacific allies might be a little too close to America to risk what might be read as a somewhat adversarial play. I think Canada and broader Europe might be a strong starting point, and they should endeavour to include the Five Eyes as well&#8212;especially the UK, which is well positioned to play a leading role in this coalition. 
Setting this up is not easy: there are agglomeration effects that cut against distributed gains from coordination. Everyone has to pitch in, but not everyone gets the tax revenue and the immediate leverage from datacenter location. Any alliance would have to complement coordination with robust risk- and benefit-sharing mechanisms that distribute the outcomes of this play across the alliance: shared fiscal responsibility for the investments, ad hoc harmonisation of regulatory landscapes for the copyright issues, special distribution channels for levied taxes, and so on.</p></li><li><p><strong>U.S. reassurance. </strong>This can&#8217;t be framed or come across as &#8216;we&#8217;re taking away your AI developers&#8217;. If it is, the administration will react restrictively and punitively in a heartbeat. I think there&#8217;s a good substantive argument to be made in favour of this diversification project, ironically especially to the current administration. The Trump administration is fundamentally pro-AI, and soft on restricting the outflow of AI capabilities in general terms. This&#8212;and decidedly not questions of rule of law, military use, leverage, or the Pentagon&#8217;s choices&#8212;has to be the frame. In many ways, pursuing this strategy constitutes buying American: investing in U.S. firms, providing them infrastructure, integrating them deeply into local stacks. Progress along the diversification trend needs to be pursued in that spirit, and therefore needs to happen in close communication with the U.S. government, which should in turn retain the ability to set some red lines for tech transfer and decentralisation. 
Generally, this will not work if it&#8217;s read as something middle powers are doing to the U.S.</p></li></ul><h4><em><strong>The Politics of Poaching</strong></em></h4><p>Now I&#8217;ve worked in politics long enough to know that if you&#8217;re a staffer and take this list into your principal&#8217;s office, you&#8217;ll have to hand over your access badge on the way out. The &#8216;local champions&#8217; will revolt, the incumbent ecosystems will curse you out, the nay-sayers will say you&#8217;re gambling on American pipe dreams, and well-meaning &#8216;patriots&#8217; will say you&#8217;re throwing in the towel. But the gap between where Western powers are and where they&#8217;d need to be to invite a major developer footprint really is that large, and measures like these really would be necessary.</p><p>That is to say: we have a decent idea of what the policies should be, and if we didn&#8217;t, the developers would help us find it. <strong>The bigger issue is that you&#8217;d need a lot of focused political will for this. </strong>That will would have to be based on a clear understanding of the three dynamics I&#8217;ve outlined: the leverage, the contractual stability, and the fiscal pathway that frontier developer presence provides. That&#8217;s why much of this post is about motivating reasoning. The mechanistic pathways for stepwise expansion of developers&#8217; footprints are much less complicated than explaining to incumbent allied governments why advancing this expansion is in their interest. To my mind, the biggest bottleneck remains making my three arguments above clear to the allied policymakers who would need to push for this play.</p><h4><em><strong>Outlook</strong></em></h4><p>I still think there&#8217;s a window here. Western middle powers have been looking for a promising and concrete play to rally around&#8212;and this might be it. 
As always on middle power strategy, this requires a delicate balance: you have to take frontier AI seriously enough to realise how important doing this is, but not get drawn into delusions of full independence from America. </p><p>But if policymakers come to understand what hosting frontier developers is actually worth, they can take first steps today. <em><strong>That would be</strong></em> <em><strong>progress toward a more stable world in which the best version of America could win.</strong></em></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[The Delhi Gap]]></title><description><![CDATA[Billions of people or trillions of dollars are catastrophically wrong.]]></description><link>https://writing.antonleicht.me/p/the-delhi-gap</link><guid isPermaLink="false">https://writing.antonleicht.me/p/the-delhi-gap</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 25 Feb 2026 13:13:35 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/fa1c7018-687c-4bec-a0b2-25e7a24492b7_1200x746.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>My experience of the AI Impact Summit in Delhi has been characterised by a bewildering gap</strong></em>: on one side, there&#8217;s the summit as a trade show, an admittedly energetic tech conference with a lack of depth that would have you think it might just as well be about anything else: the internet or solar panels<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> or robot vacuum cleaners. 
People sell new versions of the technology, strike narrow business deals, parade their local champions and go home with a few more deals. On the other side of this gap, there&#8217;s the summit as a meeting of leaders on frontier AI&#8212;serious and in touch with Silicon Valley&#8217;s rising realisation that progress toward advanced AI is fast. But while hundreds of thousands attended the former conference, almost no one but the AI companies and the U.S. delegation made it to the latter. This<em><strong> </strong></em>gap<em><strong> throws the world into danger of incurring all the risks of AI while missing out on most of its benefits.</strong></em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>It&#8217;s easy to simply bemoan this disconnect and write a sneering post about normies not getting what&#8217;s going on. In fact, I suspect that doing so will become a genre of its own in the aftermath of the summit, just as much as it has become a genre to despair at the last AI summit in Paris cutting safety-related issues from the agenda. But I think we should do better than that. My Delhi reflections come a little bit later because I believe we should explain <strong>why this gap </strong><em><strong>matters</strong></em>: what it means for AI strategy, and where we should go next. I want to make two claims today: At this technological juncture, the gap is wider than most of us would have expected. And so it has grown wide enough to seriously endanger robust solutions.
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TgTv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" width="442" height="110.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:442,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h4><em><strong>Geography of a Gap</strong></em></h4><p><em><strong>In one corner, the world. </strong></em>By numbers alone, the optimists have it. 
Their beliefs are difficult to characterise in detail, but I think some common features are the beliefs that capabilities at the very frontier have plateaued or soon will; that the great majority of economic value will accrue to those that diffuse models through the economy; and that domestic &#8216;good enough&#8217; AI beats internationally leading &#8216;frontier AI&#8217; for almost all use cases. That set of views reflects what I believe to be two broader underlying notions: that the American frontier developers are in some important sense mistaken about the expected value of pushing the frontier, and that appropriate national strategy fundamentally deals with software-forward questions of prosaically implementing AI systems not unlike today&#8217;s.</p><p><em><strong>In the other corner: the technocapital machine. </strong></em>A minority disagrees&#8212;but what a minority that is! The camp of true believers consists of those closest to the AI revolution. It&#8217;s made up of the representatives of the three actual frontier AI companies, some members of the U.S. delegation, and a ragtag group of policy types in middle powers. They, and I with them, believe that the rest of the world is fundamentally wrong: that the great powers are the only live players in this race, that the vast majority of AI&#8217;s impact on the world will come from systems and form factors yet to come, and that any diffuse implementation strategy with no recourse to technical and geopolitical leverage and resilience is doomed to fail. I&#8217;ve written <a href="https://www.foreignaffairs.com/united-states/ai-divide">repeatedly</a> and in some <a href="https://writing.antonleicht.me/p/import-imperatives">detail</a> about what middle power policy this would imply. 
</p><p>The rest of this essay takes elements of this view for granted: that frontier capability will be strategically mandatory, that AI will be transformative, and that middle powers can&#8217;t build frontier AI themselves. If you disagree, I&#8217;d still ask you to bear with me for the following&#8212;if not from conviction, then as an intellectual hedge. What I posit to you is the consensus view of the frontier AI developers, on which investment flows in the trillions hinge. Join me in suspecting it&#8217;s not entirely mistaken.</p><p>If you&#8217;re in search of a piece that elucidates the exact nature of the broader Delhi Gap, I recommend <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Dean W. Ball&quot;,&quot;id&quot;:5925551,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!mLaj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F49371abf-2579-47be-8114-3e0ca580af8b_1024x1024.png&quot;,&quot;uuid&quot;:&quot;775ac25e-2552-477d-8ccf-22d64c844d20&quot;}" data-component-name="MentionToDOM"></span>&#8217;s <a href="https://www.hyperdimensional.co/p/the-moving-and-the-still">latest</a>: next to a more thorough account of the positions held, it provides a causal model, connecting the rejection of superintelligence to the world&#8217;s wish to emancipate itself from American hegemony. 
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TgTv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" width="442" height="110.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:442,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>Buyer&#8217;s Markets?</strong></em></h4><p>What I&#8217;ll add is that, if you asked the middle powers, they&#8217;d give you a different reason than the ones Dean names. 
They&#8217;d say their instinctive rejection of the American message of dawning superintelligence has to do with <strong>the obvious financial incentive motivating those who believe in outsized AI progress</strong>. The frontier companies are seeking customers: they are looking to sell to governments and local businesses, and they charge a hefty premium over weaker capabilities; and the American government seeks to get the world hooked on its stack for purposes of market share and leverage. Middle powers feel comfortable rejecting these advances because they believe the underlying dynamic is still that of a buyer&#8217;s market: AI companies are competing for government contracts, and not vice versa.</p><p>This incentive overlap is concerning and unfortunate, but it doesn&#8217;t make the true believers wrong in any important way. And the <strong>buyer&#8217;s market dynamic is prone to flip quickly and decisively</strong>: currently, the world is more constrained in buyers of advanced AI than in providers of compute, and therefore of tokens of artificial intelligence. As more and more gainful implementations of AI systems are found, inference compute will become scarce&#8212;especially given the raw compute requirements of advanced agentic systems&#8212;and so will chips to export and plug into datacenters. Countries that reject deals today could well find themselves strapped for exclusive AI access in a few years, scrambling for second-best GPUs and insecure access to inference from unreliable partners abroad. </p><p>The market seems so open because right now, developers need to rush to get the next years&#8217; tranches of compute online&#8212;but once they are, the infrastructural substrate of the economic transformation is locked in, especially since globally available compute looks likely to run into <a href="https://epoch.ai/blog/can-ai-scaling-continue-through-2030">hard constraints</a> in a few years. 
<strong>Misreading a temporary boom as a guarantee of future availability would be a big mistake.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TgTv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>Why To Mind the Gap</strong></em></h4><p>Many on my side of this conversation have not been all that concerned by middle powers&#8217; lack of strategic awareness for some time: <strong>they&#8212;we&#8212;had long been convinced that directionally, things would get better.</strong> Increased capabilities would break through to public awareness, and thereby to 
policy action in middle powers. If there&#8217;s one thing I think the juxtaposition of the summit and the Silicon Valley conversation on coding agents and recursive self-improvement implies, it&#8217;s that this hope was misplaced. </p><p>Capabilities themselves, reported by select groups of AI insiders, will always have the makings of a Rorschach test: true believers will treat them as confirmation of their bullishness, and skeptics will see empty hype. In my view, it will be real effects that change minds: safety-related warning shots, economic disruptions, and so on. But I think both <a href="https://writing.antonleicht.me/p/do-you-need-a-wake-up-call">warning shots</a> and <a href="https://writing.antonleicht.me/p/homeostatic-ai-progress">economic effects</a> are likely to lag the capability frontier so substantially that waiting for them prompts strategic action far, far too late.</p><p>Some agree with that view, but suggest the awareness gap itself is not that big of a problem&#8212;that directional progress toward &#8216;low-hanging fruit&#8217; is possible even on the minimal consensus that AI is vaguely important. I&#8217;m not convinced. I&#8217;ve <a href="https://www.thefai.org/posts/the-race-worth-winning-middle-powers-in-the-age-of-machine-intelligence">argued</a> in the past that middle power strategies should ultimately aim not at the core of AI development itself, but at a broader supply chain: to gain leverage up- or downstream of advanced AI and use this leverage to secure access to frontier capabilities and a financial share in AI-driven economic growth.</p><p><strong>But you need a minimum level of awareness to make effective supply chain plays</strong>, or you are headed for at least one of four failure modes:</p><ol><li><p><strong>Underpower your economy</strong>. 
If you don&#8217;t share the specific belief that <em>frontier</em> capabilities are what you need, the temptation is strong to favour &#8216;sovereign&#8217; or open-source solutions, perhaps even by regulatory fiat&#8212;forcing your government and economy to use AI that doesn&#8217;t make you directly reliant on American imports, but also leaves you exposed and uncompetitive against adversaries and economic rivals.</p></li><li><p><strong>Waste time on fake sovereignty</strong>. Middle power initiatives to reach the frontier might fail painfully slowly. They&#8217;ll spend money and time chasing alternative approaches or attempts to scale, and by the time governments realise they are unlikely to succeed, they will have lost valuable time to secure imports and bottlenecks instead&#8212;or might even be bound by political path dependencies to see the attempt through to the bitter end.</p></li><li><p><strong>Sell early</strong>. If you run the bottleneck strategy, but underestimate the future importance of AI, you run the risk of underpricing your assets. In a world where AI stays ordinary, selling off your semiconductor manufacturing equipment company, or your datasets, or your manufacturing plants, to a great power for a nice lump sum sounds like a great play for a quick windfall. But if they represent future bottlenecks for a truly transformative technology, your appraisal of their value might change: you should treat them as assets fundamentally capable of powering the entirety of your economy in a few years, and only ever consider a sale under very specific circumstances. 
</p></li><li><p><strong>Build for a normal world</strong>. Doubling down on bottlenecks while expecting a normal world risks just building up capacity that will be swept up and flipped in Delaware or purchased in Guangzhou.</p></li></ol><p>Next to the core of the supply chain strategy, <strong>the gap also endangers middle powers&#8217; ability to do well on political economy.</strong> A recent essay by advisory firm Citrini has made the rounds this week, predicting a derailing of the political economy as the result of a harsh reallocation of value and revenue and a collapse of aggregate demand. Much as this scenario might seem highly contingent in America, it could well unfold locally: most countries in the world are sleepwalking into the <a href="https://writing.antonleicht.me/p/ai-jobs-and-the-rest-of-the-world">fiscal and labour effects</a> of advanced AI, and unlike the US, they have no easy levers to mitigate them. How are you going to get the fiscal and social capacity in place to deal with these prospects&#8212;edge cases as they might be&#8212;if you&#8217;re operating under fundamentally mistaken assumptions about the future of the technology?</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TgTv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>Endure, Export, or Engage?</strong></em></h4><p>Realising that the gap persists, and treating it as a substantial strategic risk for middle powers, allows action along three pathways.</p><p><em><strong>Endure</strong></em>, i.e. just accept that things will be bad in middle powers, and that the window for action only arises once actually transformative effects manifest. Focus on the ways you might mitigate the harms, and begin thinking about the ways to rebuild&#8212;a view perhaps endorsed by an interestingly &#8216;AGI-pilled&#8217; recent post by <a href="https://marginalrevolution.com/marginalrevolution/2026/02/rebuilding-our-world-with-reference-to-strong-ai.html">Tyler Cowen</a>. You might be more inclined toward this view if you thought the &#8216;wake-up&#8217; moment would reliably happen fairly early, or that the effects would be somewhat kind to middle powers. I share neither of these views, and so I&#8217;d seek to avoid a world where we mostly play for &#8216;endure&#8217;.</p><p><em><strong>Export</strong></em>, i.e. try to build a mutually beneficial solution with America: through lab export programs and government export promotion frameworks, advanced AI capabilities along strategically valuable lines can be exported to middle powers. There is even a U.S. incentive to elicit the buildup of strategic capabilities in allied governments in an attempt to match Chinese scale. 
I&#8217;m excited about these programs, and have written much about them in the past. But they&#8217;ll only ever go so far with ignorant buyers: as long as it&#8217;s Americans who make the pitch, middle powers will distrust them and perceive them as extractive salesmen. The breakthrough moment for these programs, I believe, will come when at least some trustworthy middle powers embrace them, experience them as highly valuable, and the word spreads among allies. But that requires us to break through in at least some important and widely trusted allied markets.</p><p>And so that leaves us with <em><strong>engage</strong></em>, i.e. to renew pushes to raise awareness. I&#8217;d like to do that, if you&#8217;ll join me, but essays will no longer do. The work has to be more specific, more trustworthy, and in much greater depth. I think that most who have the view from San Francisco and want to think about the international dimension have been too content to take the broad view. </p><p>I&#8217;m certainly guilty of this, having written much more about the admittedly vague category of middle powers than the idiosyncratic specifics that might be required to make progress on any one of the important powers. In some ways, I was a bit too optimistic&#8212;I thought I might plant a flag of general strategic considerations, and the interest would come as AI progress continued. Instead, and to enable more effective versions of &#8216;export&#8217;, I think future work should be surgical. The broad strategic toolkit exists, and I think it now ought to be translated into the policy processes of the most high-leverage middle powers. South Korea and Japan as East Asian allies with comparatively little concern about U.S. 
dependency, as well as Canada as a likely leader of joint middle power efforts, immediately come to mind.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TgTv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!TgTv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!TgTv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc7973ddf-df35-4a69-a21f-19500dbdc26a_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>Outlook</strong></em></h4><p>My thoughts on the comparative attractiveness of these strategies have somewhat changed over the last few weeks. Enduring seems to be even more of a gamble than I assumed, because if the wake-up comes from effects, not capabilities, the initial shock will be even greater. 
The export response is even more politically fraught than I thought (but still worth trying); and I suspect &#8216;engage&#8217; needs much more surgical country-level focus. And so the right next step might be to zoom in: on helping promising countries willing to take the first step to get their strategy right, and then to export the success case to other liberal democracies. I&#8217;ll think more on where we should start, and I hope I&#8217;ll have more on it soon. But in the meantime, I think <strong>Delhi should send us all back to the drawing board.</strong></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>All the credit for that one goes to an observant colleague. 
</p></div></div>]]></content:encoded></item><item><title><![CDATA[What AI Summits Are For]]></title><description><![CDATA[The best international AI policy is to fix national strategies]]></description><link>https://writing.antonleicht.me/p/what-ai-summits-are-for</link><guid isPermaLink="false">https://writing.antonleicht.me/p/what-ai-summits-are-for</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Mon, 16 Feb 2026 02:49:36 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/2142801a-f028-42f2-ae25-467fea94e6a6_1366x966.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Ahead of the AI Impact Summit in Delhi this week, I&#8217;ve published two longer-form pieces on the fate, prospects, and strategies of AI middle powers. I hope you give them a read:</em></p><ul><li><p><em>with </em><span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Dean W. Ball&quot;,&quot;id&quot;:5925551,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!mLaj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F49371abf-2579-47be-8114-3e0ca580af8b_1024x1024.png&quot;,&quot;uuid&quot;:&quot;0c781d18-f8c5-4d30-8acb-e2a7c067bd85&quot;}" data-component-name="MentionToDOM"></span> <em>for the Foundation for American Innovation: &#8216;<a href="https://www.thefai.org/posts/the-race-worth-winning-middle-powers-in-the-age-of-machine-intelligence">The Race Worth Winning</a>&#8217;, a long-form report on the future of middle powers &amp; their institutions;</em></p></li><li><p><em>with Sam Winter-Levy in Foreign Affairs: &#8216;<a href="https://www.foreignaffairs.com/united-states/ai-divide">The A.I. Divide</a>&#8217;, an essay on strategic challenges and pathways for these same powers. 
</em></p></li></ul><p><em>Today&#8217;s post places them in the context of what to expect from the Delhi summit.</em></p><div><hr></div><p><em><strong>En route to the AI Impact Summit in Delhi, there&#8217;s pessimism in the air. </strong></em>It&#8217;s shared by many of my fellow travelers to this fourth installment of the AI summit series: what had begun as a slightly premature safety forum in Bletchley and continued as a competition in naive boosterism in Paris now faces some risk of <a href="https://www.transformernews.ai/p/india-ai-impact-summit-new-delhi-trying-to-do">overextending</a> into an attempt to cover the impossibly large spectrum of AI-related questions. </p><p>But I remain hopeful: instead of chasing international governance far outside the Overton window, we might be able to use the summit as a forum for the direly needed international conversation about national AI strategies.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtOb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 
1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" width="430" height="107.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:430,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h4><em><strong>We&#8217;re Past Peak Summit</strong></em></h4><p>The governance advocates&#8217; pessimism is warranted: it should be clear to everyone that this summit&#8212;this summit series, this cast of governments, this arrangement of technological reality&#8212;<strong>will not culminate in &#8216;international governance&#8217;.</strong> </p><p>That&#8217;s first because <strong>AI is outgrowing any issue-specific international forum</strong> as it bleeds into core areas of domestic and international politics: in many ways making the Munich Security Conference and the Republican National Convention just as much of an AI summit, and perhaps more. The AI Impact Summit does not have an obvious mandate to grapple with governance questions. This ties into a second reason: the <strong>governance of the underlying technology itself will happen in America and China</strong> for the foreseeable future, unaffected by the rest of the world. Now that the great powers have awoken to the national importance of AI, they&#8217;ll fight off any attempts to dictate its pace and shape from the outside.</p><p>Some of the summit&#8217;s attendees already know this, some will express their dismay at the revelation, and some will carry on anyway. By the standards of what many had hoped for when the summit series was inaugurated&#8212;binding treaties, global safety standards, convergence on safety principles&#8212;not much will come of it in the current environment. The summit instead serves as a platform for conversations, if things go well; announcements, if they go as expected; empty words, if they go badly.  
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtOb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" width="430" height="107.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:430,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>A Multipolar Moment?</strong></em></h4><p><strong>The confused status of the AI summits links back to a deeper geopolitical uncertainty.</strong> Especially in the last few months, middle powers have started to argue that we might be headed for a more multipolar order, or at least departing from the clear unipolar moment of the post-Cold War era. 
I think the underlying aspiration for greater geopolitical self-determination on the part of the middle powers, perhaps most visibly articulated by Canadian Prime Minister Carney in Davos, is appropriate in the broader strategic context. Yet it&#8217;s clearly in tension with what&#8217;s happening in AI specifically: the concentration of frontier AI in one, maybe two, great powers could herald a technological gap indicative of a strongly bi- or unipolar order. </p><p>These findings pull in opposite directions: the multipolar thesis invites joint sovereignty efforts, perhaps even the rejection of deeper engagement with great powers&#8212;which would leave you cut off from either great power&#8217;s AI ecosystem if you were too slow to realise that the world was in fact turning bipolar again and pivot accordingly. It&#8217;s hard to justify a self-assured middle power project, an attempt to secure a more multipolar world order, in the face of that fact.</p><p><strong>But all this doesn&#8217;t need to mean that the opportunity for international engagement will be wasted altogether. </strong>In fact, I believe quite strongly that now <em>is</em> the moment for worldwide AI policy&#8212;not &#8216;inter&#8217;-national, but national: nations need to start actually grappling with the transformation they face. Events like the summit still provide an opportunity to advance this conversation: for those in the know to share what they know and those in control to share what will happen; for those set to bear the brunt of disruption to share what they require, and for all of them to get to the same table. These are not conversations of <em>governance</em> per se, but of information sharing and narrow transactions around national strategy, economic and strategic leverage. Their outputs won&#8217;t be treaties and papers, but export-import deals, ambitious industrial policy, and aggressive adoption roadmaps. 
And the very first thing to get right for this kind of international AI policy is to close a gap in awareness and strategic capability around AI. For that, a summit like this can serve as a venue.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtOb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" width="430" height="107.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:430,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em>Middle Powers Between Worlds</em></h4><p><strong>To lay out what I think this conversation should look like, I&#8217;ll share with you two recent pieces</strong> <strong>I&#8217;ve written on AI middle powers</strong>&#8212;countries that lack frontier AI development, but still have sufficient economic, institutional or strategic capacity to be live 
players far beyond their borders in the years to come. I feel strongly that their prospects should be a central theme of international AI policy, because they are where the greatest variance is: middle powers are united in facing the greatest gap between where they&#8217;re headed by default and where they could be with the right strategy today.</p><ul><li><p>The first piece, co-authored with <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Dean W. Ball&quot;,&quot;id&quot;:5925551,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!mLaj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F49371abf-2579-47be-8114-3e0ca580af8b_1024x1024.png&quot;,&quot;uuid&quot;:&quot;04a21c4e-1d73-401b-8619-83fee4f801ad&quot;}" data-component-name="MentionToDOM"></span>, is a report titled &#8216;<a href="https://www.thefai.org/posts/the-race-worth-winning-middle-powers-in-the-age-of-machine-intelligence">The Race Worth Winning</a>&#8217;. In it, we try to paint a <strong>picture of what&#8217;s at stake for middle powers and their future as nation states. </strong>We argue that clear-eyed middle powers will realise that the race worth winning for them is to be the first to use AI to reform the concepts of nation states and national economies. If they embrace that change and back it up with hard leverage and clever import deals, they can get ahead of the curve. If they do not, they&#8217;ll bleed capacity and legitimacy until they fade into irrelevance.</p></li><li><p>The second piece is &#8216;<a href="https://www.foreignaffairs.com/united-states/ai-divide">The A.I. Divide</a>&#8217; in Foreign Affairs, co-authored with Sam Winter-Levy. 
It argues <strong>middle powers are at risk of &#8216;capturing the risks while minimising the benefits&#8217;</strong>: while it will be hard for them to access frontier AI themselves, it will be much easier for their adversaries and competitors to wield it against them. We develop strategic pathways to address that prospect on two levels: first to secure access to frontier AI either through &#8216;bandwagoning&#8217;, &#8216;playing both sides&#8217;, or &#8216;sovereignty moonshots&#8217;, and second to entrench a strategic position along the AI supply chain to retain an economic stake in AI progress.</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zqPK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zqPK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 424w, https://substackcdn.com/image/fetch/$s_!zqPK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 848w, https://substackcdn.com/image/fetch/$s_!zqPK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 1272w, https://substackcdn.com/image/fetch/$s_!zqPK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!zqPK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png" width="725" height="308.7225274725275" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:620,&quot;width&quot;:1456,&quot;resizeWidth&quot;:725,&quot;bytes&quot;:236709,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/188031686?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zqPK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 424w, https://substackcdn.com/image/fetch/$s_!zqPK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 848w, https://substackcdn.com/image/fetch/$s_!zqPK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 1272w, https://substackcdn.com/image/fetch/$s_!zqPK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5091556-06f8-4593-9d0a-b27d9e6db09c_1610x686.png 
1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">The G20 and a selection of further AI middle powers.</figcaption></figure></div><p><strong>The pieces have in common a sense of urgency: </strong>whenever I contrast the conversations I have in the U.S. with the conversations I have in middle powers, it looks like the rest of the world is far behind the curve. What&#8217;s more, the questions these middle powers face seem almost intractably difficult and interwoven. Time and time again, a question that starts as one of narrow AI strategy turns out to collapse to more fundamental issues of statecraft. 
It turns out the answer to a lot of middle power problems would be &#8216;better economic policy&#8217; and &#8216;a better geostrategic position&#8217;. It&#8217;s sometimes hard to know where to begin now that the problem is so clear.</p><p>But I think we have some promising ideas, and they&#8217;re gradually being backed up by some political capital: as tired as the clich&#233; of the wake-up call may be, many of the middle powers really are slowly getting there. Beyond the many inevitable discussions about communiqu&#233;s and their absence, <em><strong>I think we could make some progress on this when we meet in Delhi. </strong></em></p><p><em>Do please reach out if you think we should chat at the summit!</em></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Import Imperatives]]></title><description><![CDATA[How and why middle powers should import frontier AI systems]]></description><link>https://writing.antonleicht.me/p/import-imperatives</link><guid isPermaLink="false">https://writing.antonleicht.me/p/import-imperatives</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Fri, 06 Feb 2026 12:24:44 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/ac0696da-e60f-4c46-b1c8-568e2f3bcc02_580x380.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Together with Sam Winter-Levy, Liam Patell and Sarosh Nagar, I recently published a report for the Foundation for American Innovation: &#8216;<a href="https://www.thefai.org/posts/an-allied-world-on-the-american-ai-stack-a-strategy-for-export-leadership">An Allied World on the American AI 
Stack</a>&#8217;, describing how the U.S. might mount a program to promote the export of AI to its allies. </em></p><p><em>The following post outlines my motivation for that work and looks at the other side of the same coin, describing why and how middle powers should approach U.S. AI exports.</em></p><div><hr></div><p><em><strong>The issue of technological sovereignty</strong></em> is back on the agenda for many of the world&#8217;s major economies. A moment of geopolitical realignment is coinciding with an unfortunate moment in technological history&#8212;one dominated by the rise of advanced AI, a technology that looks superficially similar to past waves of digital products, but comes with drastically different infrastructural premises and therefore strategic conclusions. In the context of this argument, I and many others have <a href="https://writing.antonleicht.me/p/a-roadmap-for-ai-middle-powers">argued</a> that middle powers should attempt leverage-based plays, strengthen their negotiating positions, and so on. In this piece, I want to go into some more detail on one of the more technical links of this argument: how a framework for importing AI might be justified and executed.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>The path toward this sovereignty in the age of AI is narrow, with chasms on both sides. To the left, there&#8217;s <strong>the risk of not taking frontier AI seriously</strong>&#8212;to think that surely, you&#8217;ll find the AI someplace, and that it will be ubiquitous as the result of usual innovation dynamics. That will leave you scrambling for AI access, however coercive, at the eleventh hour. 
To the right, there&#8217;s <strong>the risk of misunderstanding what it means to take frontier AI seriously</strong>&#8212;and to think that it means you should build your own models at all costs. That leaves you wasting precious time and resources on a low-margin, winner-takes-all element of the AI stack. In between, there&#8217;s realising frontier AI is seriously important, and drawing the correct conclusion: guaranteeing redundant supply to both your government and your economy through a mix of sovereign sub-frontier capabilities and layered, diverse import deals for top-shelf inference.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtOb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h1><strong>Middle Powers Need Frontier AI</strong></h1><p>This all starts with briefly motivating the need for AI imports 
altogether. There are broadly two classes of arguments on frontier AI access: why in a vacuum, middle powers should seek to maximise the capability of the AI they deploy; and why in comparison, middle powers should be wary of falling behind in the capability of AI they deploy. Both these arguments are shrouded in uncertainty&#8212;and I don&#8217;t mean to suggest that all future trajectories definitely point toward needing the very best AI for every use case. But I think taken together, the odds are decent that this requirement will arise at least along some of the following dimensions&#8212;good enough that it should stop a responsible middle power from foregoing access to frontier AI.</p><h4><em>Why You Want Good AI</em></h4><p>In the interest of keeping this post somewhat focused, I won&#8217;t rehash the general case for advanced AI at length. Suffice it to say that <strong>AI capabilities are on track to become meaningful drivers of economic growth, strategic capability, and scientific progress</strong>. They serve as a force multiplier for existing capacity: accelerating research, compounding productivity gains, and making previously scarce forms of expertise and analysis abundantly available. Governments that deploy advanced AI stand to dramatically improve the quality and reach of public services; firms stand to unlock substantial efficiency gains and new revenue streams; and the scientific enterprise stands to benefit from AI&#8217;s ability to accelerate discovery across domains. In a world where these capabilities are rapidly improving and widely deployed, any country that opts out of serious AI adoption is accepting a structural disadvantage across virtually every dimension of national capacity. 
The baseline intuition, of course, is then that better AI probably makes for better national performance across the board.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Bmeq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F208989f1-4e98-4b7e-a1f0-d6925635c3dc_1610x906.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!Bmeq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F208989f1-4e98-4b7e-a1f0-d6925635c3dc_1610x906.png" width="1456" height="819" class="sizing-normal" alt="" loading="lazy"></picture></div></a><figcaption class="image-caption">Just as much as <a href="https://ourworldindata.org/grapher/energy-use-per-person-vs-gdp-per-capita">there are no low-energy, high-income countries</a>, I doubt that there will be rich countries without abundant access to AI capability.</figcaption></figure></div><h4><em>Why You Want The Best AI</em></h4><p>But if this were only a matter of relentlessly pushing the frontier of what AI can help societies achieve, you&#8217;d still have reason to object to the frontier-focused logic. Just as the best talent outperforms mediocre talent and yet it is not a strategic imperative to pay everyone hundreds of thousands in wages, you might say that your resources aren&#8217;t best spent on maximising AI capability. 
This underestimates the incredibly thin margins of strategic and economic competition that might come with an AI age, and what they mean for what constitutes a minimum viable stack.</p><p>A middle power&#8217;s AI capabilities need to keep up on three levels of competition:</p><ul><li><p><strong>Economically, firms and workers equipped with inferior AI risk being outcompeted by rivals wielding stronger systems.</strong> AI will intensify the pace and widen the international reach of market competition, shrinking institutional moats and compressing the margins on which competitive advantage rests. We can already see glimpses of this in environments where competition is most intense: in high-frequency financial trading, microseconds of latency advantage justify enormous infrastructure investment; at the frontier of AI development itself, companies treat even marginal capability differences as competitively decisive. In an AI-saturated economy, these dynamics will extend far beyond niche markets&#8212;the capability gap between frontier and near-frontier systems may translate directly into competitive disadvantage for entire national industries.</p></li><li><p><strong>In terms of government capacity, states need AI capabilities at least comparable to those available elsewhere</strong> if they are to provide for their citizens at a competitive standard. But the sharper version of this point is that a government not adequately equipped with advanced AI runs the risk of being overwhelmed by the sheer volume and sophistication of AI-generated activity it must process and regulate&#8212;AI-assisted regulatory filings, legal challenges, synthetic media, and automated public communications. Without matching capabilities, the administrative state risks being structurally outpaced by the actors it is meant to govern.</p></li><li><p><strong>In terms of security, having AI as good as your adversaries&#8217; is a precondition for effective national defence</strong>. 
Attackers&#8212;foreign states, non-state actors, criminal networks&#8212;will be equipped with near-leading AI capabilities through stolen models, jailbreaks, or near-frontier open-source systems. They will use these to devise and execute cyberattacks, to develop chemical and biological weapons, and as a force multiplier for hostile operations. The offence-defence balance in AI remains uncertain, but research consistently finds that more advanced models are more effective on both sides of the equation. A country wielding weaker AI than its adversaries is accepting a security deficit it cannot afford. To many researchers in the field, it&#8217;s plausible that this balance will remain on a knife&#8217;s edge&#8212;whether a middle power wins its game of whack-a-mole against a quickly accelerating onslaught of AI-driven attacks might have a lot to do with the capabilities it can deploy.</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtOb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" width="440" height="110" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><h1><strong>Middle Powers Can&#8217;t Build Frontier AI</strong></h1><p>The core mistake in middle power strategy today is to recognise the importance of frontier AI, but to conclude from it that building one&#8217;s own AI systems is the answer. Once the mistaken assumptions behind this impulse are clarified, the precise requirements for import deals become clear.</p><h4><em>You Don&#8217;t Have The Cards</em></h4><p>Once again: middle powers are exceedingly unlikely to develop their own frontier AI systems. I don&#8217;t want to derail this piece too far by repeating this well-developed point, so to summarise briefly: the key inputs to frontier development are capital, energy, chips, and talent&#8212;and, indirectly, the ability to eventually turn software innovation into the revenue that fuels a reinvestment flywheel. Any place in the world other than the US and China is already outmatched along these lines. But more to the point, the contours of tomorrow&#8217;s infrastructure race are shaped not by today&#8217;s scramble to catch up, but by yesterday&#8217;s buildout decisions. The infrastructure behind current American leadership in AI is negligible compared to the clusters that will come online this year&#8212;gigawatts of computational capacity, orders of magnitude greater than what we had in 2024. The most ambitious middle power projects promise a catch-up to America&#8217;s capacity as it was last year. 
But the moving target they are chasing keeps accelerating, and even the articulated ambitions from elsewhere have not kept pace.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0fQM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdc7a80f-511a-4dda-9d83-748a413f1970_1468x802.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!0fQM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdc7a80f-511a-4dda-9d83-748a413f1970_1468x802.png" width="1456" height="795" class="sizing-normal" alt="" loading="lazy"></picture></div></a><figcaption class="image-caption"><a href="https://epoch.ai/data-insights/ai-supercomputers-performance-share-by-country">Compute infrastructure</a> is the core input to competitive development at the frontier.</figcaption></figure></div><p>The most persistent hope of reaching the frontier runs through the notion of a shortcut past the infrastructure competition&#8212;perhaps a new paradigm that sidesteps the need for massive compute. 
I and others have <a href="https://x.com/anton_d_leicht/status/2011095569175847149">argued</a> at length why this is unlikely to work: paradigm search itself benefits from scale; whatever new approach emerges will almost certainly require substantial infrastructure to become strategically meaningful; and the major labs, far from being locked into a single bet, have introduced novel paradigms repeatedly while maintaining their lead. The scaling challenge is the one middle powers have consistently failed to solve, and there is little reason to think a government-backed initiative will break that pattern.</p><h4><em>The Peril of Sub-Frontier Champions</em></h4><p><strong>What middle powers </strong><em><strong>can</strong></em><strong> do, of course, is develop AI that&#8217;s just fine. </strong>This is, in effect, the business model of national champions like Mistral, Cohere, and so on. They can&#8212;and often do&#8212;pursue a strategy of &#8216;fast following&#8217;: finding efficient ways to replicate the most successful approaches taken at the frontier, and staying cost-efficient at doing so through a mixture of genuine innovation, quick replication, and shameless distillation.</p><p>It&#8217;s not quite clear to me whether this will remain viable forever: if you assume the costs of fast-following stay stable at some percentage of frontier development costs, but think that revenue accrues mostly to developers at the very frontier, it follows that fast followers will burn through increasing amounts of cash without the solace of the massive revenue prospects that major developers enjoy. But even if you find some way to make a fast-following &#8216;frontier&#8217; ambition cash-neutral, misunderstanding these champions&#8217; value can jeopardise any broader national AI strategy.</p><p>Today, some middle powers have realised the first part of my argument, but haven&#8217;t quite understood that their local frontier ambitions are likely to fail utterly. 
This is perhaps the most dangerous position to be in, because it <strong>fails extraordinarily slowly and binds political capital and financial resources along the way.</strong> If you simply don&#8217;t get that AI is happening, and reality blows up in your face, you can still at least try to pivot dramatically. But if you think that yes, AI <em>is</em> big, but you just invested a couple hundred million into a frontier AI initiative, and they&#8217;re telling you things are going great, you might be tempted to wait another few months, another year or two. It would be politically costly to admit the initial funding had been wasted, be read as a lack of support for your local innovators, and so on. Whenever an external shock prompts you to think &#8216;maybe you should do something&#8217;, you&#8217;ll respond with &#8216;ah, don&#8217;t worry, our frontier AI initiative is working on it&#8217;&#8212;and by the time it has burned through all your money, it&#8217;ll be too late.</p><p>Banking too much on the illusion of solving the problem through fast-following also carries economic and strategic risks, because it <strong>creates political incentives for middle powers to force their economies and institutions to use sub-par solutions. </strong>You can see inklings of this in past conversations on digital sovereignty (and in France): to protect their nascent or subpar alternatives, middle powers create regulatory constraints that forcibly generate a steady demand signal. They&#8217;ll apply burdensome regulation to foreign alternatives, grant tax privileges for using domestic solutions, or focus government procurement on national champions. Currently, this mostly means that French civil servants have to start every meeting by spending five minutes logging into a bastardisation of Zoom that doesn&#8217;t work before jumping on a phone call instead. 
It&#8217;s a little less funny if it means that businesses and institutions suffer the marked competitive disadvantages I describe above because the government wants to incubate national champions that&#8217;ll never go anywhere.</p><p>Insofar as you think sub-frontier champions are viable and you want to pursue them as a middle power, you need to do your absolute best to <strong>resist the political reflex to protect them at all costs. </strong>They are precisely and only valuable if they do not distort your decision-making and do not hijack your broader AI strategy. Give them limited, non-discriminatory support that doesn&#8217;t trade off against imports, see whether fast-following actually stays viable without becoming a cash sink, and only keep going if it does. Above all, you cannot give in to the tempting illusion that these sub-frontier champions are on a path toward the frontier, or they&#8217;ll cloud your judgement when navigating important import deals. <strong>National champions can only ever be a small part of middle powers&#8217; AI strategies; and frontier AI they do not provide.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtOb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" width="440" height="110" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><h1><strong>Middle Powers Can Get Frontier AI</strong></h1><p><strong>So, middle powers should import frontier AI. </strong>But that&#8217;s easier said than done. The current import mix, by volume of tokens provided, is massively dominated by three categories: imports through the use of foreign-developed open-source models; through API calls to foreign developers; and through simple consumer-level subscriptions. None of these is a reliable channel for importing a strategically important input.</p><h4><em><strong>Against Naive Import Structures</strong></em></h4><p>Scaffolding around open-source models fundamentally assumes the continued release of highly capable open-source models. Even today, this is not really a given: open-source models are fine, but usually a generation behind the actual frontier. Attractors pull both ways: the U.S. administration would like domestic developers to release good open-source models to eat into Chinese market share in that area, but it&#8217;s hard to hit revenue targets when you open-source genuinely impressive capabilities, and serious misuse concerns remain unanswered. Perhaps China does not care, and aggressively open-sources highly capable models to undercut U.S. market share&#8212;but who&#8217;s to say they&#8217;ll keep doing so once middle powers have gotten hooked on the prospect of Chinese open source, and who knows about the much-discussed security implications? 
Open source is far from a reliable pathway to frontier capabilities.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!5_xl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc023fdb9-59a9-4a1c-98c1-323000058248_1536x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!5_xl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc023fdb9-59a9-4a1c-98c1-323000058248_1536x832.png" width="1456" height="789" class="sizing-normal" alt="" loading="lazy"></picture></div></a><figcaption class="image-caption">You don&#8217;t get frontier open-source capability, and the <a href="https://epoch.ai/data-insights/open-weights-vs-closed-weights-models">comparative quality</a> of open source varies wildly&#8212;too wildly to structure a strategy around, I think.</figcaption></figure></div><p><strong>API and consumer-level access alone are obviously not a serious sovereign channel </strong>for securing access to a strategically relevant resource: the precise conditions of access are usually not enforceable, there is little leeway for negotiating and actually enforcing the treatment of privileged data, little guarantee of being able to migrate workflows and requirements from one developer to another, and so on. 
In the AI developers&#8217; standard business offerings, the terms are set by volatile young companies still searching for their exact product portfolios, preferred markets, and so on&#8212;companies that routinely experiment with their pricing and the suite of available models.</p><p>If access to frontier AI as provided by foreign developers is to be reliable enough to assuage middle powers&#8217; sovereignty concerns, it can&#8217;t be principally provided through a channel this volatile. That&#8217;s not to say firms and consumers wouldn&#8217;t ultimately access AI capabilities through APIs and subscriptions&#8212;it&#8217;s just to say that the provision of these capabilities should flow through a higher layer of longer-term, national-level frameworks instead of being strictly ad-hoc B2B relationships between middle power firms and US developers.</p><h4><em><strong>Imports As Statecraft </strong></em></h4><p>What shape should AI import deals take instead? For the vast majority of middle powers, I think the most realistic solution is <strong>high-level government-backed framework agreements with major developers</strong> that provide for (a) directly negotiated capabilities for governments and (b) framing, guarantees and conditions for specific agreements between the private sector in middle powers and the developers abroad. These already exist today in the shape of &#8216;for countries&#8217; deals between AI developers and countries, such as OpenAI&#8217;s Stargate initiative; but most Stargates are limited in scope, and not designed to provide AI access at scale for entire national economies.
You could combine dedicated Stargate-style projects with broader hyperscaler expansions as well as deals for remote access to foreign-hosted inference to develop a layered stack of AI access options for your public and private sector&#8212;perhaps the closest to this has been the <a href="https://www.whitehouse.gov/presidential-actions/2025/09/memorandum-of-understanding-between-the-government-of-the-united-states-of-america-and-the-government-of-the-united-kingdom-of-great-britain-and-northern-ireland-regarding-the-technology-prosperity-de/">UK-US Tech Prosperity Deal</a>, much of which points in exactly the right direction (and has regrettably now stalled).</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;9bc8d48a-7f8b-49f2-af00-f71cccabfaf5&quot;,&quot;caption&quot;:&quot;These days, when you glance at the top technology news, the odds are decent that you&#8217;ll find announcements of increasingly contrived partnerships between AI developers and atrophied legacy institutions. 
In the ensuing whirlwind of news updates, diagrams of investment flows, and hasty takes on what this means for &#8220;the bubble,&#8221; I think we&#8217;ve missed the mo&#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;The Future of AI Export Deals&quot;,&quot;publishedBylines&quot;:[],&quot;post_date&quot;:&quot;2025-10-23T12:49:58.868Z&quot;,&quot;cover_image&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/56038b8b-755f-46c8-95c6-edce5ee1ce4d_770x508.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/diffusion-deals&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:176896683,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:0,&quot;comment_count&quot;:0,&quot;publication_id&quot;:3834218,&quot;publication_name&quot;:&quot;Threading the Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p><strong>Some misgivings about this kind of deal have been brought forward </strong>by those who have noticed the U.S. government&#8217;s increased interest in leveraging technology deals to gain concessions in other areas of foreign policy. Avoiding this will require a substantial buildup of domestic leverage on the side of middle powers.
Nevertheless, I strongly believe that, despite all recent geopolitical turmoil, America, not China, should be and remain the world&#8217;s partner of choice here, even if the Chinese did manage to mount a competitive export program at some point soon. But I recognise that some middle powers will be served well by &#8216;playing both sides&#8217; and pitting interested importers against each other, so I&#8217;ll try to provide a general account of how to structure reliable import deals.</p><h4><em><strong>Three Levels of Mechanism</strong></em></h4><p>The modalities of contracted AI access a middle power should negotiate for are best understood as corresponding to three levels of importers&#8217; risk. The differences between these risks matter: sovereignty solutions that only dodge the first level are ineffective against more serious dependencies; expensive independent stacks are not required to address minor pricing concerns. When middle powers aim policy at reducing dependencies, they should clarify what level they are addressing.</p><ul><li><p><em><strong>The first level of risk is getting iteratively gouged&#8212;AI exporters as predatory software companies. </strong></em>Rates increase, prices get hiked, you don&#8217;t get the full-powered version of the AI system next year anymore&#8212;just like dealing with a slightly inconvenient software vendor. The way around this level of risk is fairly simple&#8212;it requires provisions that prevent lock-in to any one vendor. Competition is the antidote to unfair pricing practices, and so middle powers need to ensure a modicum of optionality in their decisions between different imported AI services.
You could imagine this happening on a few different levels: building some optionality into datacenter deals so that your infrastructure doesn&#8217;t lock you into one provider of models<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a>; and building and procuring independent scaffolding that allows for the swapping of models used in important applications.</p></li></ul><ul><li><p><em><strong>The second level of risk is getting your supply pinched&#8212;AI exporters as OPEC. </strong></em>In conversations with middle powers, I sometimes hear uncertainty about why any AI exporter would really leverage access. I think this is where the analogy with software licenses obviously stops: to continue to service AI capabilities, a large amount of highly fungible computing power is required. That makes it feasible and sometimes attractive to throttle middle powers&#8217; access: because infrastructure providers want to exert OPEC-style pressure on pricing to ensure profitability, or because exporters might want to redeploy limited computing capacity to use cases they consider more important than exporting AI to middle powers&#8212;large training runs, domestic inference, and so on. </p><p><br>It&#8217;s very difficult to avoid this: you can&#8217;t stockpile AI capability, so you&#8217;re always vulnerable to spot changes, and as we&#8217;ve seen with oil, exporters can always feign substantive constraints outside the scope of any contract. A heterogeneous landscape of inference providers for your economy helps here: different countries and developers are pursuing their respective approaches to servicing inference&#8212;out of the Gulf, Norway, local hyperscaler datacenters, joint projects and so on. A wide spread helps avoid pinches from any one constraint.
It&#8217;s also much easier to avoid this risk if you build out enough domestic compute to ensure inference sovereignty, but there are <a href="https://writing.antonleicht.me/p/datacenter-delusions">costly</a> trade-offs involved in this.</p></li></ul><ul><li><p><em><strong>The third level of risk is getting cut off&#8212;AI exporters as Russia. </strong></em>If push came to shove, AI exporters might simply be able to turn off importers&#8217; access to AI systems altogether. This is, of course, a nuclear option: it destroys both the international credibility of the exporting country and the economic viability of the exporting companies for quite a long time. But in an unstable world order, some middle powers might feel uneasy about an outright cutoff even being a distant option. You cannot dodge this risk entirely: if you run your entire economy on wartime infrastructure that isn&#8217;t market-efficient in peacetime&#8212;building your own models on your own compute&#8212;you&#8217;re going to be broke by the time wartime comes around. <br><br>But you can mitigate the worst risks. One way is deploying some genuinely sovereign high-security infrastructure to run national-security-relevant applications, for which you need high-security datacenters as well as appropriate contracts with developers. Another way is keeping a sub-frontier fast follower at least somewhere in your orbit of close and dependable allies that you could fall back to if you actually get cut off&#8212;if a developer like Mistral were maintained as a joint fallback for European allies, that might be a path to keeping it viable.</p></li></ul><p>Taken together, the measures middle powers need to take to avoid these risks provide a decent outline of the contracts they should seek out&#8212;and conversely, the import deals the U.S. government should endorse and the American developers should offer.
Ideally, they occur in a financialised market for infrastructure, in a network of independent and complementary contracts with hyperscalers, developers, and nation-states for inference provision, and they all feed into a layer of national scaffolding that is somewhat robust to changes in imported model capability. This export/import framework is mutually beneficial: <strong>it would provide America with allies that do not hesitate to buy its technology, and allies with the assurance they can trust their primary source of artificial intelligence. </strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtOb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JtOb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!JtOb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0435e36d-bb93-4777-8620-205540e18fbb_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Outlook</strong></h1><p>This is quite the wishlist, and it will require middle powers to step up their game in 
related areas to leverage their way to at least some of it: as I&#8217;ve discussed in the past, they need to aggressively use their positions in upstream and downstream bottlenecks and pursue joint negotiations. But as much as it&#8217;s a wishlist, it&#8217;s also advice to exporters: if they can&#8217;t make middle powers an export offer that feels tenable, middle powers might still choose to try their hand at ill-fated sovereignty ambitions. That would hurt middle powers the most, but exporters lose out on their sales, too&#8212;so it&#8217;s in everyone&#8217;s interest to scope out how a viable import/export paradigm would look. I&#8217;ve <a href="https://writing.antonleicht.me/p/forging-a-pax-silica">written</a> in some <a href="https://writing.antonleicht.me/p/making-ai-export-promotion-work">detail</a> about the exporter-side considerations on this topic, and I still believe there is substantial overlapping interest here.</p><p>But getting the question of frontier AI access right does not solve the question of middle power strategy, and in fact places a greater burden on the strategic questions that follow: especially if you import your frontier AI, you will need to put it to good economic use to make sure you retain the leverage to back up your dealmaking and guarantee the revenue required to finance the inflow of AI. Much more on these questions will follow on this publication in short order. For now, I&#8217;ll close by reiterating: as painful as the role of AI importer might be, middle powers cannot avoid it.
Their focus must now be on building the best version of an import paradigm.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>This raises some interesting questions about the future of ASICs in country-level exports: developers might prefer to export their in-house compute to ensure long-term lock-in, but importers will prefer multi-purpose compute</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[How AI Safety Is Getting Middle Powers Wrong]]></title><description><![CDATA[The case for pivoting from global governance to national interests]]></description><link>https://writing.antonleicht.me/p/how-ai-safety-is-getting-middle-powers</link><guid isPermaLink="false">https://writing.antonleicht.me/p/how-ai-safety-is-getting-middle-powers</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Thu, 22 Jan 2026 12:28:48 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/031cc300-09c0-44ec-9778-c8131c3a166e_1424x970.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>Geopolitical realities are shifting, and leave most of the world in a tough spot. </strong>That much will become clearer still once the current fault lines are exacerbated by the advent of advanced AI, developed by two great powers alone. Few who take this prospect seriously care about the fate of middle powers; technologists, accelerationists, national security types have moved to US and China policy. 
Yet one group remains well-positioned to contribute to the middle power question: AI safety advocates.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><p>We face a <a href="https://www.cbc.ca/news/politics/mark-carney-speech-davos-rules-based-order-9.7053350">Gaullist moment</a> of reawakening national ambition in many middle powers. It provides the political preconditions to advocate for ambitious AI strategies the world over. The international branches of the safety movement are well-positioned to seize that moment: from their rare position of expertise on AI, and with a strong national foothold in many middle powers, they could pivot to improving the response of middle powers to the deployment and diffusion of AI technology: building misuse resilience, guarding against economic disempowerment, and making sure the world order in which advanced AI emerges is a stable one.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>Instead, <strong>much of the safety community is stuck answering the wrong questions. </strong>They think middle powers can and should exert substantial influence on the <em>development </em>of frontier AI systems in the US and China &#8211; a proposition that was always contentious, but is untenable in the world as it is today. By sticking to it, they&#8217;re hurting their own credibility in middle powers that are now more attuned to their own national interest; and undermining the perception of AI safety work in the US. 
If safety advocates in middle powers abandoned the unrealistic pursuit of leverage and global governance, they could <strong>pivot to making AI deployment go well in middle powers</strong> &#8211; where this important work is tractable and neglected.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KJpY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" width="424" height="106" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2288f183-9076-47b5-b248-884f54abe762_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:424,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h1><strong>A Platform In Review</strong></h1><p><strong>Safety advocates have largely engaged with the middle power conversation from a US- and China-centric point of view:</strong> they assume that critical risks emerge from the development of AI systems, and since that development happens in the US and China, their engagement in middle powers must aim 
at affecting these two great powers. Some readers may respond that domestic regulation does not fundamentally target development, just market entry conditions. That&#8217;s a fair argument about the legal principle, but the compelling force of the market and the infeasibility of developing wholly separate AI products for it make it practically moot. When meeting market entry conditions for a vital market requires changing how you develop, it is in effect development-facing regulation &#8211; and in fact, this is arguably by design: the intended consequence of a &#8216;<a href="https://www.governance.ai/research-paper/brussels-effect-ai">Brussels Effect</a>&#8217;.</p><p>The variations on this theme are many: AI safety organisations have tried to pass safety-focused regulation in the EU, UK, and elsewhere; they have lobbied middle power governments to influence the US and even China to take AI risks more seriously; they are developing tools of leverage for middle powers to strongarm the US into safety concessions; and they are seeking signatories to a multilateral treaty. The core hypothesis is always the same: maybe one can use middle powers as a lever to affect US development. I believe this thesis was always somewhat flawed, but it is now growing untenable. There are two problems with the approach.</p><h4><em><strong>It Doesn&#8217;t Really Work</strong></em></h4><p>First, past attempts to influence development from outside are faltering.
Using domestic law to <a href="https://scholarship.law.columbia.edu/faculty_scholarship/271/">constrain foreign developers</a> has found its landmark example in the <a href="https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai">EU AI Act</a>, which is not doing too well: the broader law is a famously unpopular piece of legislation, its enforcement has been <a href="https://www.euronews.com/my-europe/2025/11/19/european-commission-delays-full-implementation-of-ai-act-to-2027">delayed numerous times</a>, and its application to the cutting edge of AI technology is ever more uncertain. The technology moves fast, the Commission moves slowly, and the regulation is necessarily a snapshot of 2023&#8217;s best and worst assumptions about AI. The weaknesses of the overall act apply least readily to its most safety-coded aspects concerning general-purpose AI, and the safety and security chapter of the <a href="https://digital-strategy.ec.europa.eu/en/policies/contents-code-gpai">Code of Practice</a> is a very effective implementation tool for them. My reservations have to do with the broader economic and geopolitical setting, where substantial future enforcement seems unrealistic, should it become actually burdensome: whenever any stipulation turns out too annoying for a frontier developer, they can quickly enlist the US administration and their own economic leverage to skirt the Code. A safetyist might say the key outcome was the transparency pathways set in place; I&#8217;m unsure how great the marginal gain over <a href="https://carnegieendowment.org/emissary/2025/10/california-sb-53-frontier-ai-law-what-it-does">SB-53</a> really is, and arguing that this difference warranted all that effort seems extraordinary.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a></p><div><hr></div><p><strong>International policy fares even worse. 
</strong>I&#8217;ve written at length about international policy attempts up to <a href="https://writing.antonleicht.me/p/the-early-death-of-international">last year</a>, which leaves me to discuss the more recent call for <a href="https://red-lines.ai/">Red Lines</a> at the UN. While safety advocates view this as an achievement, it strikes me that most of the powerful decisionmakers on AI haven&#8217;t even heard of it. For good reason: the UN&#8212;whether as a body or a collection of countries&#8212;has no real power it can exert on an issue if a major power doesn&#8217;t want it to. Safety advocates know this, but some view the Red Lines being in place as valuable nevertheless: once political salience arrives, they argue, the Red Lines provide a framework to hold onto. I don&#8217;t have much new to say on this &#8211; I see no way the US government lets any other country dictate what it does on AI. It&#8217;s too polarised an issue, too easy a domestic political win, and too much of a strategic imperative to assert absolute sovereignty over domestic development. A similar fate befalls the <a href="https://superintelligence-statement.org/">call</a> for international prohibition of superintelligence: in both cases the burden of proof is on the safetyists to explain how this ever happens.</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;221f1192-2c9d-40ad-b9e9-f83982eae13f&quot;,&quot;caption&quot;:&quot;In 1922, four years had already passed since the end of World War I &#8211; but a destructive arms race for naval power was still going on. No amount of aligned incentive and reasonable argument had stopped that race, great costs to the allies&#8217; economies notwithstanding. 
A policy window opened only when political incentives shifted: The British could no longe&#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;The Early Death of International AI Governance&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:113003310,&quot;name&quot;:&quot;Anton Leicht&quot;,&quot;bio&quot;:&quot;I write about the political economy of advanced AI.&quot;,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!FPyB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F75422da7-aafa-42ab-8fa6-cf4f0df85cf0_3166x3166.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-03-31T14:36:49.147Z&quot;,&quot;cover_image&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a9c65657-9e64-4996-9628-25b58070fe39_1600x1222.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/the-early-death-of-international&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:160210183,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:30,&quot;comment_count&quot;:6,&quot;publication_id&quot;:3834218,&quot;publication_name&quot;:&quot;Threading the Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p>Most have never publicly tried &#8211; with perhaps the sole exception of a recent <a href="https://asi-prevention.com/">paper</a> by prominent AI safety organisations Conjecture and ControlAI, 
which offers an admirably honest account of what it would take: middle powers in pursuit of a global mission, outright leveraging the threat of sabotaging critical digital infrastructure against an erstwhile ally. The paper correctly identifies the required scope, but in doing so demonstrates the strategy&#8217;s infeasibility: it&#8217;s an all-in play with no off-ramps and great rewards for defection for any of the involved middle powers, and in a politically heterogeneous setting, it seems like an outright impossible sell. If this is the most realistic way to affect development, that&#8217;s bad news for the development-focused strategy.</p><div><hr></div><p><strong>All this will grow even less effective over time.</strong> The specific reason is that legislation leveraging the consumer market for outside effect works best the first time you try it &#8211; sneak provisions in at a low, technocratic level, then slowly build up compliance frameworks and path dependencies that entrench your law and make it last even against future political headwinds. This is perhaps the success story of the GDPR; but with every iteration, the political headwinds arrive earlier, and the entrenchment is less and less resilient to opposition from the markets you seek to regulate. 
The general version of this is that development-focused regulation from abroad works best in a low-salience environment, outside the direct purview of the public and political decision-makers, negotiated and accepted by mid-level technocrats who converge on substantive policy thought.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ZMN7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ZMN7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 424w, https://substackcdn.com/image/fetch/$s_!ZMN7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 848w, https://substackcdn.com/image/fetch/$s_!ZMN7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 1272w, https://substackcdn.com/image/fetch/$s_!ZMN7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ZMN7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png" width="1456" height="383" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/adb406f3-a596-45af-9e0a-18374f90b349_1600x421.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:383,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ZMN7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 424w, https://substackcdn.com/image/fetch/$s_!ZMN7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 848w, https://substackcdn.com/image/fetch/$s_!ZMN7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 1272w, https://substackcdn.com/image/fetch/$s_!ZMN7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fadb406f3-a596-45af-9e0a-18374f90b349_1600x421.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">This is the <a href="https://www.nbcnews.com/tech/tech-news/us-rejects-international-ai-oversight-un-general-assembly-rcna233478">geopolitical backdrop</a> we are working with for the foreseeable future. Both OSTP Director Kratsios&#8217; and President Trump&#8217;s comments at the UN should be required reading.</figcaption></figure></div><p><strong>We&#8217;ve now rapidly left this low-salience environment in two unrelated ways. </strong>First, AI is no longer low-salience: governments around the world have identified it as a geopolitically and economically important issue, and they&#8217;re hesitating to hand it over to peripheral decisionmakers and institutions. For instance, I suspect that neither the American nor the European electorate will be particularly happy to trust the European Union on this one. Second, international technology regulation is no longer low-salience. 
The US administration is keeping a watchful eye on who and what constrains its technology companies, and European powers are increasingly careful in choosing their battles, because while they might be gearing up for conflict on some fronts, they know they have to fear outright retaliation. I don&#8217;t think anything is turning back the clock on this &#8211; even a Democratic administration would not be enthusiastic about asking the Europeans to please continue regulating American AI companies after all. The heyday of governing the US from outside its borders is over.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KJpY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" width="434" height="108.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2288f183-9076-47b5-b248-884f54abe762_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:434,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>In Fact, It Hurts</strong></em></h4><p><strong>Development-focused international work is also 
increasingly unhelpful to the broader safetyist agenda.</strong> In the US and many techno-optimistic countries, AI safety concerns have grown closely associated with the vehicles and alliances safety advocates have chosen abroad. Insofar as the US administration rejects international governance or the American electorate rejects European policy approaches to technology, there&#8217;s increasing opposition to the safety agenda too. Today, safety advocates in America often have to defend against the allegation that they seek to import European and internationalist notions of governance. Being branded as &#8216;European-style&#8217; is not helpful in the current political environment.</p><p>The reputational harm also affects international governance more generally. AI safety has risen in political salience, even if it hasn&#8217;t kept up in political power, and what safetyists say they want sticks in decisionmakers&#8217; minds. International governance was never going to be the main way AI gets regulated. But safetyist organisations publicly announcing that they were using national laws and international treaties to externally constrain the US has made it much harder to get treaties that would also have this incidental effect. Instead of letting a gentle Brussels effect proliferate quietly, the strategy has been broadcast loudly and led to more forceful rejections of any attempt at international governance.</p><div><hr></div><p><strong>The focus on AI development frequently leads safety advocates to work against the narrow national interest of their respective countries.</strong> It seeks to leverage a middle power&#8217;s resources not to pursue that power&#8217;s unique goals, but to pursue a broader altruistic mission. That&#8217;s a problem, because it weakens the safety movement&#8217;s standing in national environments. 
This is especially true today, when middle power leadership frequently views its own attempts to export values through legislation as a costly mistake of the now-bygone &#8216;end of history&#8217;. A local safety movement whose theory of change runs through leveraging local resources for global solutions with, at best, indirect national benefits risks political sidelining in today&#8217;s environment of national interest. The most obvious issue is national regulation, which asks countries to spend their precious leverage against the US on frontier regulation to benefit the world. But it relates to other narrow interests too &#8211; consider how safety advocates have positioned themselves on European compute buildout and frontier model development. They&#8217;ve largely been hesitant to support this, not for the sound reason that it&#8217;s economically unwise, but for the bad reason that it might create another party to the AI race. With friends like these&#8230; </p><p>The Conjecture/ControlAI paper again is admirably honest in this regard. To be clear, I&#8217;m not harping on the paper itself &#8211; it&#8217;s entirely realistic in its suggestions, in that they are the only way to actually <em>do</em> what material like the superintelligence statement seems to <em>want</em>. Don&#8217;t even imagine it was enacted &#8211; just imagine it got traction and became a prominent piece of safety advocacy. It would cause great offense on all sides of the conversation. 
Middle power governments, strategically and fiscally overextended and desperate to retain their standing, with no spare resources to invest in global missions, would be very skeptical of the advice of a movement that wants them to sacrifice their national interests in service of a greater good.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> To the US as the target of this mission, such a strategy paints the safety movement as interested in regulating American economic activity by sneaking past the US electorate &#8211; this will correctly be read as outright offensive. Luckily, this perception hasn&#8217;t yet spread &#8211; but if the safety movement doubles down on this as some viable pathway to multilaterally affect development today, a reputation hit is only one or two media campaigns away.</p><div><hr></div><h4><em><strong>Real Trade-Offs</strong></em></h4><p>You might dismiss all this and say that the odds are long and the costs are high, but AI safety is so important that this is still the most effective work you can do in middle powers. My conviction is that this is not true, and that the economic and strategic questions are themselves of utmost importance. But even if you disagree, I think <strong>political and financial capital for AI safety is better spent elsewhere: on advancing the national and collective AI strategy of middle powers writ large. </strong>This directly trades off against the development-focused approach for two reasons.</p><p>The most obvious axis of trade-off is simply limited talent and funding &#8211; the safety movement is only so big, has only so many capable operatives and only so much breadth in its talent pipelines, and so on, so simply adding the type of work I&#8217;ll suggest to the current portfolio seems unlikely to work.</p><p>The second, more important trade-off relates to credibility. 
Due to the political ramifications of endorsing safetyist domestic regulation, safety advocates lose influence on a range of other issues they&#8217;d otherwise be helpful voices on. They could be leading voices on national sovereignty, AI strategies, beneficial deployment, and downstream resilience; but they carry far less weight than they could, not least because they still operate in the reputational shadow of their attempts to trade away their nations&#8217; interests in favour of the greater good. Some organisations have tried to change this perception, pivoting toward a more sovereignty-focused approach to middle power work. But the reputational effects are great enough that these organisations often operate quietly and dodge affiliation with the rest of the ecosystem. As a result, <strong>the more absurd versions of safety middle power work gain disproportionate attention</strong>, worsening the reputational problems even further. All this is a far cry from an AI safety ecosystem that confidently owns tractable safety-relevant issues in middle powers. 
They could lead this new conversation if they focused on it &#8211; and I believe they should do so.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KJpY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" width="436" height="109" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2288f183-9076-47b5-b248-884f54abe762_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:436,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Why Middle Powers?</strong></h1><p><strong>Even if middle powers will never influence AI development, I believe working on middle power policy is valuable. 
</strong>I also believe this for many reasons that don&#8217;t have much to do with AI safety in the narrow sense, but instead with capturing and proliferating AI&#8217;s benefits throughout the world. But I&#8217;m making a case to safety advocates &#8211; so I&#8217;ll focus on the safety-relevant reasons to work on middle power policy today.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!I2Bz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!I2Bz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 424w, https://substackcdn.com/image/fetch/$s_!I2Bz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 848w, https://substackcdn.com/image/fetch/$s_!I2Bz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 1272w, https://substackcdn.com/image/fetch/$s_!I2Bz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!I2Bz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png" width="1280" height="527" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:527,&quot;width&quot;:1280,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:726672,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!I2Bz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 424w, https://substackcdn.com/image/fetch/$s_!I2Bz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 848w, https://substackcdn.com/image/fetch/$s_!I2Bz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 1272w, https://substackcdn.com/image/fetch/$s_!I2Bz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f8cebe1-3373-413f-9ef6-e25a07f68584_1280x527.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">&#8216;And on AI we&#8217;re co-operating with like-minded democracies to ensure we won&#8217;t ultimately be forced to choose between hegemons and hyperscalers.&#8217;, says Canadian PM Carney in a much-lauded recent <a href="https://www.cbc.ca/lite/story/9.7053350">speech</a>.</figcaption></figure></div><div><hr></div><p><strong>First, middle powers are where harms from AI misuse might manifest earliest and most dramatically</strong>: their governments are less engaged on AI, so they might fail to adopt defensive measures in homeland and national security in time; but criminals will view rich middle power populations as targets for exploitation, terrorism, and blackmail. This makes work on strengthening misuse resilience in middle powers important and promising: if the threat were made clear to middle power governments, they&#8217;d have little choice but to invest in defensive measures with positive spillover effects. 
Get the US worried about misuse, and it might try to regulate the models; the EU can&#8217;t regulate the models, so, guided well, it would have to respond by funding defensive measures instead.</p><p>One promising area is <em><a href="https://vitalik.eth.limo/general/2025/01/05/dacc2.html#1">differential</a>, <a href="https://www.joinef.com/posts/introducing-def-acc-at-ef/">defensively focused</a> <a href="https://ifp.org/the-launch-sequence/">acceleration</a></em>: outpacing potentially harmful AI capabilities by hastening the development of technologies that guard against their harms. This is in the interest of many middle powers, both because proliferation of harmful capabilities seems a foregone conclusion and because there is a political window opening. Many middle powers, alarmed by the ongoing rearrangement of geopolitical realities, are building out militaries and relying less on American and Chinese beneficence. The ongoing <a href="https://www.consilium.europa.eu/en/policies/defence-numbers/">rearmament</a> of Europe (and, to a lesser extent, East Asian countries) provides an obvious context for ambitious projects in this vein. But they need to be framed as linked to national interest. The current pitch suffers from the safety movement&#8217;s reputation for prioritising global over national interests: no middle power wants to bankroll tech that lets US AI development speed ahead. But framed as a reaction to the uncontrollable trajectory of frontier AI development, resilience-focused innovation measures could become a highly successful part of middle power policy.</p><div><hr></div><p><strong>Second, middle powers are where gradual disempowerment and widespread destitution seem most plausible.</strong> In an AI great power, there are policy backstops against disempowerment: taxation and redistribution are possible, and the government can intervene. 
This plays out differently in a middle power &#8211; if AI rips through the workforce, moving revenue and power from domestic workers to US AI labs, there&#8217;s little its government can do. Discussions about <a href="https://econofact.org/factbrief/fact-check-has-the-economic-gap-between-europe-and-the-united-states-increased-in-the-past-decade">growth divergence</a> and the &#8216;Europoors&#8217; are ruefully amusing at 3% growth differences, but become existential at 10+% growth differences. The short version: if a risk of &#8216;millions end up poor and destitute&#8217; is substantial enough to motivate safetyists, it should motivate them to fix the economic trajectory of middle powers as it relates to AI.</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;d6273ae3-e960-44b5-9ad1-1b1c571dbe45&quot;,&quot;caption&quot;:&quot;Isma&#8217;il Pasha was sure he had cracked the code of policy arbitrage. In 1863, he endeavoured to modernize Egypt&#8217;s failing economy and military through massive Europeanisation: schools and railways, boulevards and line infantry. 
Without a European-scale tax base, the plan quickly collapsed: Egypt was placed under humiliating external supervision of its de&#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;AI, Jobs, and the Rest of the World&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:113003310,&quot;name&quot;:&quot;Anton Leicht&quot;,&quot;bio&quot;:&quot;I write about the political economy of advanced AI.&quot;,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!FPyB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F75422da7-aafa-42ab-8fa6-cf4f0df85cf0_3166x3166.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-09-09T12:54:33.325Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!SEXO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/ai-jobs-and-the-rest-of-the-world&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:173158971,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:20,&quot;comment_count&quot;:4,&quot;publication_id&quot;:3834218,&quot;publication_name&quot;:&quot;Threading the Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div><hr></div><p><strong>Third, middle powers getting AI wrong can lead to 
a destabilised world</strong> &#8211; making the path to superintelligence more dangerous. The most salient aspect is the threat of conflict. There&#8217;s already considerable geopolitical volatility, but it can always get worse. The asymmetrical diffusion of new strategic technologies has often triggered dormant conflicts and exacerbated existing ones: parties attack because they believe themselves at a temporary advantage, or conversely because they think the window to compete is closing. If dormant conflicts suddenly erupt as a result of jagged diffusion of advanced AI, that can quickly destabilise everything from supply chains to a shaky US-China peace.</p><p>But it doesn&#8217;t have to come to war. If middle powers feel sufficiently threatened by advanced AI &#8211; either by the economic effects or by power shifting toward AI developers &#8211; they might attempt to halt AI development. Countries with <a href="https://www.state.gov/pax-silica">substantial positions in the semiconductor supply chain</a> have real leverage here. Using it would be economically suicidal &#8211; but if their next few years go badly enough, they might feel they have no other choice. It will be difficult enough to get the transition to transformative AI right under the best conditions &#8211; and the current inadequacy of middle power strategies will destabilise conditions substantially.</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;32a6a02f-d65c-4f22-b5b8-a6c0556dce88&quot;,&quot;caption&quot;:&quot;Building a dam in a great waterway is one of the most high-stakes, dangerous moments in modern engineering. Once the concrete is poured and the river is diverted, we can only watch and face the consequences. 
When, in the 1960s, Italian civil engineers faced pressure to ignore early warning signs in pursuit of sustaining the ongoing economic miracle by h&#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;The Most Dangerous Time in AI Policy&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:113003310,&quot;name&quot;:&quot;Anton Leicht&quot;,&quot;bio&quot;:&quot;I write about the political economy of advanced AI.&quot;,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!FPyB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F75422da7-aafa-42ab-8fa6-cf4f0df85cf0_3166x3166.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-03-05T14:19:13.093Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc4f5598-e3c3-43fa-8611-1e906cc6b0ff_1024x640.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/the-most-dangerous-time-in-ai-policy&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:158422471,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:17,&quot;comment_count&quot;:3,&quot;publication_id&quot;:3834218,&quot;publication_name&quot;:&quot;Threading the Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div><hr></div><p>Fourth, it follows that <strong>contributing 
to national AI strategies in middle power governments should be a safetyist priority itself</strong> &#8211; I believe even if this work reduces these governments&#8217; immediate interest in safety-focused regulation. The above are all safetyist talking points, but they slot into &#8211; and require &#8211; a broader strategic conversation in middle powers: taking seriously the transformative potential of AI, and grappling with the geopolitical and technical implications. The questions that arise are ones safetyists can answer, and the answers will reduce substantial risks from advanced AI. Not only should answering these questions be a safetyist focus area; so should making sure they&#8217;re asked in the first place. It&#8217;s an advantage to AI safety when middle powers think clearly about AI strategy, and so it&#8217;s worth contributing to that strategic clarity itself.</p><p>All this even opens a potential door back to the development focus &#8211; if there is some way to <strong>bring middle powers into a position of strength, they might once again affect AI development</strong> and serve as a real third center of gravity on the question of advanced AI. But the path to this runs through the effective and uncompromising pursuit of their national interest. 
It will not be reached from a position of economic and strategic weakness they are headed for on their current trajectories.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KJpY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2288f183-9076-47b5-b248-884f54abe762_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!KJpY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!KJpY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2288f183-9076-47b5-b248-884f54abe762_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>What Can You Do?</strong></h1><p>The safety movement can contribute to answering these questions: they require expertise, awareness of AI&#8217;s scale, the ability to assemble effective policy operations, and a desire to get this right. 
Few other players combine these factors: geopolitical and national strategy researchers who take advanced AI seriously are rare, organisations willing to host them rarer still, and these few places lack scale and funding. On these criteria, the safety movement is a capable player, and it could start mobilising resources toward this goal.</p><p><strong>To the safety movement&#8217;s credit, this is already happening in some places. </strong>European policy organisations, for instance, are doing substantial work advancing middle powers&#8217; strategic agendas. But none of this is endorsed from the top, none of it has made its way into the mainstream view of what matters &#8211; none of it is reflected in big funding or talent pipelines or presented as a key pathway to safety-minded impact. To grapple with the reputational dimension and deploy resources at scale, a decisive pivot is needed.</p><div><hr></div><p><strong>Existing policy organisations </strong>would not be the main drivers of this process. Some are positioned to pivot, and pivot they should &#8211; including by clearly breaking with their past choices and proposals. The world is changing rapidly, and anyone who says &#8216;we get it now&#8217; and starts pulling for the sovereign fate of middle powers will be welcomed. That said, the safety movement has gotten into trouble for quick pivots and perceived two-facedness in the past, and it&#8217;s probably not realistic for deeply entrenched organisations to reverse course completely. In fact, it&#8217;s probably good if some safety organisations with strong positions on this stay where they are: to catch the true believers in global-mission thinking about safety in middle powers, and to promote internal disagreement. 
It&#8217;s probably good if there are organisations to which new middle power orgs can point and say &#8216;we strongly disagree with them, and we represent a different approach to international safety policy&#8217;.</p><div><hr></div><p><strong>The research ecosystem can do more.</strong> Established scholars already work closely on national sovereignty, but until recently, their work on strategic questions was almost always wrapped in layers of interpretation that contextualised it with some development-focused AI safety point. Giving these researchers freedom and encouragement to pursue these questions without a predetermined takeaway, suggesting that whatever strategic pathway they find is of value, could unlock substantial resources. These people have the ideas &#8211; make it clear it&#8217;s part of the mission and not taboo to get them out there.</p><div><hr></div><p>But the even more promising aspect of the research ecosystem is <strong>talent pipelines.</strong> Until recently, major safety-aligned mentorship programs have mostly focused their project selection in international governance on this development-focused approach. If these programs offered a starting point for middle-power-focused researchers, I&#8217;m confident more people would choose this path. The number of bright, ambitious people who understand AI and want to help their home countries, but end up defecting to US-focused work or contorting into development-focused safety work, is staggeringly high. Give them a home.</p><div><hr></div><p>Much of this <strong>comes down to funding.</strong> Major funders could publicly prioritise these cause areas; launch RFPs around them; and incubate organisations that tackle them. 
In the most ambitious version, they apply the same strategy to the middle power space that they have to the national security conversation, where safety-aligned organisations have made keystone grants to top-tier institutions and cultivated deep expertise between old-school policy hands and capable, entrepreneurial safety advocates who brought cutting-edge knowledge to enrich the discussion. I&#8217;d welcome the same for middle power work. Right now, there simply isn&#8217;t enough space to do work that grapples with AI and its strategic implications for middle powers. It&#8217;s in the safety movement&#8217;s interest to step up, and the world would be thankful for it.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!844s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!844s!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 424w, https://substackcdn.com/image/fetch/$s_!844s!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 848w, https://substackcdn.com/image/fetch/$s_!844s!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 1272w, https://substackcdn.com/image/fetch/$s_!844s!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!844s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png" width="1456" height="235" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:235,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:267228,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/185388441?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!844s!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 424w, https://substackcdn.com/image/fetch/$s_!844s!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 848w, https://substackcdn.com/image/fetch/$s_!844s!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 1272w, 
https://substackcdn.com/image/fetch/$s_!844s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8f28fc73-ecd5-411a-b329-a706d7c6089c_2142x345.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">This is the limited extent of non-US AI policy work requested in Coefficient Giving&#8217;s headline AI governance <a href="https://coefficientgiving.org/funds/navigating-transformative-ai/request-for-proposals-ai-governance/">RFP</a>, for instance.</figcaption></figure></div><p><strong>The safety movement has the people, the institutions, and the resources.</strong> <strong>What it lacks is the right theory of change for middle powers. </strong>The development-focused approach was always a long shot; today it&#8217;s actively harmful. The alternative &#8211; helping middle powers navigate AI deployment, build resilience, and avoid strategic blunders &#8211; is tractable, neglected, and would actually advance safety. The moment for that is now. Seize it with haste.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>By which I mean the broad AI safety ecosystem &#8211; funders, researchers, policy organisations that primarily focus on substantial and catastrophic risks from very advanced AI systems. 
It pains me to say that <a href="https://www.aipanic.news/p/the-ai-existential-risk-industrial">this polemic</a> provides a good overview.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Some safety advocates will say this was them making the best of a bad situation &#8211; that the AI Act would have happened anyways, and they sought to improve it. I doubt this, both because safety advocates played a substantial role in passing it and because their opposition would surely have been counterfactually influential &#8211; but it matters little for the forward-looking evaluation.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Yes, I know that some safety advocates believe the reduction of existential risk is also the primary national priority of any smaller government. But even if that&#8217;s true, no one&#8217;s gotten very far in selling it. 
</p></div></div>]]></content:encoded></item><item><title><![CDATA[The Next Three Phases of AI Politics]]></title><description><![CDATA[Narrow windows worth preparing for]]></description><link>https://writing.antonleicht.me/p/the-next-three-phases-of-ai-politics</link><guid isPermaLink="false">https://writing.antonleicht.me/p/the-next-three-phases-of-ai-politics</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Thu, 08 Jan 2026 11:44:18 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/987e0481-cf1c-47df-a6d7-b024b1c16808_996x635.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>We left last year&#8217;s AI politics unfinished</strong></em>: with an executive order promising legislative action that did not come, amid rising political salience with no policy vehicle to attach it to. Now, we&#8217;re set for a year heavy in politics and light in policy &#8211; and are looking at three upcoming phases of AI politics in the US: attempts and blockades before the midterms, a brief policy window right after, and political chaos once primaries commence.</p><p>This moment is hard to read, and a lot of countervailing factors are at play. Super-PACs and political funding on both sides are ramping up and won&#8217;t show their effects until after the midterm season; political salience is increasing but can fizzle out without popular policy ideas to latch on to. As a result, you can just as easily paint a picture of an upcoming anti-AI techlash as of a Congress paralysed by fear of the AI industry&#8217;s unprecedented political spending. That is how we have ended up with a lot of different factions all thinking that things can only get better for them. 
</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>But while most of the underlying effects are real, they occur at different speeds and time scales. That means the most valuable way to think through them is to look at how their timings line up: when is politics worst, and when is it best? When is it time to play defence, and when to play offence? I think: if you want to get something done in AI policy, spend your time preparing for the window right after the midterms, and get something through before the primaries close the window again. </p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X591!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" width="418" height="104.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:418,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h2><strong>One Last Preemption Push</strong></h2><p><strong>2026 begins with an accelerationist coalition compelled to cash a cheque</strong> that David Sacks has written in the form of <a href="https://www.whitehouse.gov/presidential-actions/2025/12/eliminating-state-law-obstruction-of-national-artificial-intelligence-policy/">EO 14365</a>. This executive order, controversial among the broader American public and within the GOP itself, is the latest attempt at federal preemption of state laws on frontier AI. It came after failed attempts to push preemption through Congress, and after it had become clear that SB-53 and (a weakened version of) the RAISE Act would be signed into law.</p><p>But while the signing of the EO is an intracoalitionary victory for the accelerationists, who have convinced the president in spite of populist right objections, it is not yet a complete policy victory. <strong>The EO is widely understood not to be a standalone achievement. </strong>Experts debate the legality of its provisions, and drawn-out court battles could take away the EO&#8217;s teeth and leave only a vague chilling effect. And politically, even supporters have contextualised it as an immediate stopgap preceding a national legislative framework &#8211; Congress and the American public alike feel AI should ultimately be addressed legislatively.</p><p><strong>So, the reigning coalition is looking for legislative codification. </strong>Just yesterday, OSTP Director Kratsios <a href="https://punchbowl.news/article/tech/white-house-ai-framework/">reaffirmed</a> his intent to produce a proposal this year. 
It seems hard to believe that Congress would simply pass a White House proposal as a standalone law, but it&#8217;s also hard to find another way to get a law done. The admin is short on clever ways to get the framework through: a last-minute attempt to pass preemption as part of the NDAA failed last fall, and there are no other similar must-pass bills left. Now, accelerationists might be tempted to find another convenient vehicle. But would leadership attach contentious preemption to a continuing resolution and risk a shutdown fight? Would GOP legislators jeopardize priority reforms like permitting with a controversial AI rider? The story of the NDAA push suggests that attempts like these would be easily sunk by single influential legislators, and thus remain unlikely &#8211; and more unlikely still every time last-minute attempts burn goodwill.</p><p>What&#8217;s left? <strong>The most promising accelerationist play is to attach their framework to a standalone bill on a specific AI harm, most probably child safety.</strong> That&#8217;s doubly attractive: because it&#8217;s easiest to rally the GOP behind, and because it&#8217;s hardest for Democrats to say no to. Many internal preemption opponents would very much like to take credit for a child safety law. And even the AI companies have realised the importance of doing something on child safety, recognising the associated PR risks. There are a couple of child safety bills in <a href="https://energycommerce.house.gov/posts/cmt-subcommittee-forwards-kids-internet-and-digital-safety-bills-to-full-committee">committee</a> right now, and many of them could make for an attractive preemption vehicle.</p><h4><em><strong>Nothing Ever Happens</strong></em></h4><p>It&#8217;s just not clear any of this is enough to move the Democrats. 
<strong>Right now, Congressional Democrats are in a great position to play for time:</strong> any law they can get right now, they can probably also get next Congress, on even better terms and with much more credit to them. That&#8217;s especially true given previous accelerationist pushes have left them skeptical of good-faith offers. And in the meantime, they can continue to run anti-tech and anti-AI campaigns against the GOP, keeping the issue alive for the elections. It might not win any races, but it can&#8217;t hurt.</p><p>Getting something done on AI this Congress mostly comes down to moving these Democrats. How might preemption proponents try to dislodge them? There are three potential pieces of political leverage:</p><ul><li><p>First, Congressional Democrats would rather not be at odds with tech-affiliated donors and their super-PACs, especially the accelerationist &#8216;Leading the Future&#8217;. By all accounts, Democratic leadership has already been somewhat sympathetic to a preemption deal in the fall &#8211; this could be a reason, and one that might persist through this year&#8217;s congressional calendar.</p></li><li><p>Second, it will be hard for Democrats to argue why they don&#8217;t want to rush on AI legislation. The object-level reason might be simple: they don&#8217;t think they&#8217;ll get any federal legislation beyond what&#8217;s already provided by SB-53 and RAISE. But that can&#8217;t be their public reason, lest they admit they&#8217;re happy to let Gavin Newsom and Kathy Hochul run national AI policy.</p></li><li><p>Third, accelerationists might manage to offer something too good to refuse publicly. If they come out with a substantively strong bill with provisions serious enough to convince child safety advocates, it will be hard to kill. 
Preemption or not, no one wants to go into an election year with a vote against child safety on the record and a PAC in the field that&#8217;s willing to exploit that vulnerability.</p></li></ul><p>Still, I wouldn&#8217;t be too optimistic. Between <a href="https://www.nytimes.com/2025/11/25/us/politics/ai-super-pac-anthropic.html">pro-regulation super-PACs</a> and increased salience, Democrats might not be too worried about the LTF threat; and given accelerationist messaging so far, I don&#8217;t think they&#8217;ll manage to put together a legislative package that&#8217;s good enough to make Democratic opposition look completely unreasonable. And so my best guess is that we&#8217;re headed for deadlock and litigation of the EO, and not much else will happen in policy until the midterms. </p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X591!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" width="420" height="105" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:420,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h2><strong>A Post-Midterm Window</strong></h2><p><strong>The midterms, then, will change things in two ways.</strong> Most obviously, they remove the political barriers to compromise that come from wanting to campaign on an issue for some time. But they&#8217;ll also provide the debate with a lot more data on the politics of AI policy: drawing on increased polling attention and the performance of candidates and spending, we&#8217;ll have much more to go on early next year. On all sides of the debate, analysts are placing a lot of hope in this prospect &#8211; they hope that more data will clarify what they are certain is true: that the public is already on their side.</p><p>I don&#8217;t think we&#8217;ll be afforded the luxury of that sort of clarification. Perhaps the most interesting case study is NY-12, where RAISE Act sponsor Alex Bores is competing in a crowded primary. There has been a lot of safetyist triumphalism about the admittedly puzzling decision of LTF to effectively buy Bores a lot of free media by publicly targeting him and letting him make his case on national television. But still, on base rates alone, Bores is likely to lose in his crowded field, and if he does, the super-PAC still gets its visible win. 
The same effect likely applies to a lot of ostensibly AI-specific electoral indicators: because AI is still not an election-deciding topic, a lot of the signal from the midterms will be drowned in broader political noise that obscures simple analysis.</p><h4><em>The Salience Story</em></h4><p>But while the midterms might not give us much clarity about what version of AI policy is in the voters&#8217; interests, <strong>they will clarify the need for some version of federal AI policy once again. </strong>As is now widely acknowledged, the perceived political salience of AI will continue to increase: yesterday&#8217;s frontier systems are entering mainstream application and being used for impressive and harmful purposes. That generates public attention, media reporting and policymaker interest. And even if all this happens less dramatically than boosters expect, I believe the associated meme has already reached escape velocity &#8211; everyone already &#8216;knows&#8217; AI will be politically big, which can quickly become self-fulfilling.</p><p>Critics of this &#8216;AI salience&#8217; notion sometimes point at <a href="https://news.gallup.com/poll/1675/most-important-problem.aspx">issue polling</a>, which sees &#8216;advances in the capabilities of computers&#8217; still relegated to marginal positions. But I think that is mistaken. There&#8217;s a much-quoted sentiment that &#8216;as soon as it works, we don&#8217;t call it AI anymore&#8217; &#8211; in much the same way, I believe that &#8216;once it&#8217;s salient, we don&#8217;t call it AI anymore&#8217;. Where it will matter, it might instead poll as part of the actual big-ticket issues &#8211; as a symptom of &#8216;tech oligarchy&#8217;, an issue of economic equality, of job prospects, of environmental harms, and so on &#8211; all of which steadily poll as important issues. 
Once that reality shows up in polls around the midterms, and once it&#8217;s paired with the very clear policy-level <a href="https://news.gallup.com/poll/694685/americans-prioritize-safety-data-security.aspx">polling</a> suggesting voters want legislation, Congress will identify AI as an issue it could address but has so far remained silent on. No self-respecting lawmaker will pass on the chance to put their name on a bill regulating something they and the electorate feel is important, and so legislative appetite will increase on the back of the salience discussion.</p><h4><em>A Brief Window&#8230;</em></h4><p>As a result, <strong>a policy window for substantial congressional action on AI may open shortly after the new Congress is sworn in. </strong>Policymakers of all stripes will sketch out their priorities for the term, many of which will be contradictory. But critically, far fewer lawmakers will be satisfied to do nothing on AI. That removes the greatest current barrier to action &#8211; which is that too many people are satisfied to wait. But when everyone wants <em>some</em> action and no one is satisfied with the status quo of a legislature with nothing to say on a transformative technology, a process opens up. </p><p>Any law that emerges from that process would be the result of much triangulation and negotiation, including a host of provisions aimed at giving every important voice a win. Broader preemption, national security provisions, narrow substantive rules on current harms, federal codification of SB-53, and many others could be components; there might just be enough vaguely compatible ideas to get a law through Congress. And once it&#8217;s out of Congress, it seems likely enough that the president would sign it: it would be too much of a political liability for VP and presidential hopeful Vance if the administration vetoed a rare congressional consensus on AI.</p><p><strong>Whether the law that makes it through that window will be good is less clear. 
</strong>I have my concerns: the dynamic I describe is principally driven by a sense of having to do something rather than by the quality of any one specific policy proposal. And the final contours of a deal would be shaped by rounds and rounds of haggling over language, making it easy to lose legislative nuance in the process. The result could easily be unsatisfying provisions from one end of the spectrum &#8211; say, broad preemption without substantive stipulations &#8211; traded for equally myopic provisions that only address incidental current harms.</p><h4><em>&#8230;And How To Use It </em></h4><p><strong>What can you do? </strong>Shaping that prospect is not about coming up with a particularly clever one-size-fits-all legislative framework as much as about <strong>identifying the best versions of bad ideas. </strong>Policymakers will try to push their political priorities through this window, and those priorities will remain fairly immutable; so you have to ask yourself what the best legislation built on each priority would be. What&#8217;s a stipulation on AI and labor that doesn&#8217;t just sound good? What measure on child safety actually helps us get at the underlying and scalable problems of deception and sycophancy? The response to these questions can&#8217;t be &#8216;do these things and also do very clever frontier policy on top&#8217;. That risks clever frontier policy being thrown out of the negotiations when it comes into conflict with other, more immediately politically salient goals. </p><p>To <strong>insulate good policy against these politics, the response to the political driver and the actual policy merit have to be closely interwoven</strong>. In practice, that means <em>isolated</em> frontier safety policy is rarely effective, and you have to link specific areas of frontier policy to specific areas of near-term public salience. 
In <a href="https://writing.antonleicht.me/p/ai-and-child-safety-against-narrow">child safety</a>, that might mean making a case against age gates and for evals-based solutions that get at underlying sycophantic and deceptive tendencies. In labor, that might mean going through the extra effort to make labor market policy actually scalable by defining appropriations mechanisms for expandable safety nets early, and so on.</p><p>These solutions sit between two unfortunate attractors: being happy with piecemeal solutions that don&#8217;t further mid-term policy goals &#8211; which would be dismissive of the fact that good policy windows are rare and require actual progress; and making purely horsetrade-based policy that doesn&#8217;t anchor frontier-focused policy in politically important issues &#8211; which would be dismissive of the true political drivers making AI policy possible in the near future.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X591!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 
1272w, https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" width="408" height="102" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:408,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h2><strong>Politics At Last</strong></h2><p><strong>That window, too, will pass quickly.</strong> I suspect that AI politics will truly intensify once we enter the Presidential primary season &#8211; which you&#8217;d usually expect to start around late 2027, when candidates start building national profiles, testing messages, and considering policies. The incentives for outright politicisation of AI will be much sharper than even in the midterms. That partly has to do with further increases in salience that make more people think they ought to talk about AI. But it also has to do with how primaries work &#8211; they reward candidates who manage to carve out a niche within their own party and distinguish themselves from the mainstream and leading candidates. And current party politics offer some such opportunities:</p><ul><li><p>On the Republican side, frontrunner and current Vice President JD Vance will be stuck holding the bag for the Trump administration&#8217;s record on AI policy, for better or for worse. That leaves a gap for tech-skeptical voices to attack him if the administration remains accelerationist, or for unapologetic techno-optimists to contest his support from Silicon Valley if he ever pivots to a more skeptical position himself. 
Already today, Senator Josh Hawley and Florida governor Ron DeSantis are lining up for the tech-skeptical primary angle.</p></li><li><p>On the Democratic side, AI seems like a likely point of contention between moderates and left-wing populists, with the latter already making headlines through deeply anti-AI views and proposals that intersect with their general distaste for big tech, billionaires, environmental harms and labor disruption. It gets even more complicated because the leading moderate, California governor Gavin Newsom, can&#8217;t completely pivot to an anti-AI position without upsetting his base of donors and supporters in his home state.</p></li></ul><p><strong>Even if you don&#8217;t put much stock in the increasing-salience story, these dynamics make AI politicking very attractive </strong>&#8211; they&#8217;re excellent wedges to drive party bases apart and to secure a foothold in each of the two crowded and contentious primaries. And once that happens, the legislative windows will once again close: when people want to campaign on an issue, they have an incentive not to make policy happen beforehand. Any compromise, especially a bipartisan one, reduces the profile of the issue and calms down the debate. And with majorities as thin as they are these days, even a few electorally motivated defectors can jeopardise any legislative attempts. 
In terms of actionable lessons, there&#8217;s not much to be done with regard to the primaries today &#8211; but the prospect of primary season means that the post-midterm window will be short and precious.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!X591!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png" width="420" height="105" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:420,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!X591!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!X591!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!X591!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!X591!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd6d68960-6f6b-4653-92b8-b723efcb7075_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h2><strong>What Follows</strong></h2><p>We face three phases: attempts to break the blockade through the midterms, a brief window after, and then political chaos through the primaries, making for a narrow opportunity worth preparing for and capitalising on. 
What follows from that depends on where you sit:</p><ul><li><p><strong>For accelerationists, it means that you need to think hard about how to make use of borrowed time. </strong>If you think you can <em>actually </em>win on legislation before the midterms, you need a better lever to move the Democrats. I know I&#8217;m a broken record on this, but I still believe the best path is compromising around deep and narrow regulation for broader preemption. But if you think you won&#8217;t get a law this year, you might want to ease up on the posturing to retain some capital and good faith for the negotiations in the next Congress.</p></li><li><p><strong>For safety advocates and regulation proponents, it seems to spell a comfortable next few months of playing defense. </strong>But those months need to be used well to prepare for what comes after: the political drivers of whatever AI policy push we can expect will not be perfectly aligned with any reasonable advocate&#8217;s policy priorities &#8211; and so there&#8217;s translation and groundwork to be done to develop solutions that harness these political drivers for actually good policy. If this opportunity is squandered, safety advocates might find themselves in support of AI laws that don&#8217;t do much, but confirm their reputation as unabashedly pro-any-regulation.</p></li></ul><p>More generally, this all means it&#8217;s <strong>worth recognising that the most politically opportune moments for policy action rarely guarantee good policy. </strong>In principle, that opens two avenues: try to make good policy more likely in the unlikely moments, or try to make bad policy better in the likely moments. The mistake is optimising for policy quality when politics are prohibitive, or for political viability when politics are already favourable. 
Plan accordingly, or else we&#8217;re headed for a 2026 with little policy and much political theater.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Forging a Pax Silica]]></title><description><![CDATA[Can the Western alliance be rebuilt on techno-strategic leverage?]]></description><link>https://writing.antonleicht.me/p/forging-a-pax-silica</link><guid isPermaLink="false">https://writing.antonleicht.me/p/forging-a-pax-silica</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 17 Dec 2025 14:32:17 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/ec037b92-2c3e-44ee-833b-af6a5f87586c_8733x5442.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>The Trump administration is moving to change the underpinnings of American alliances</strong></em> from shared values to mutual technological leverage. If the US can convince its allies to play their part, a narrow path toward a stable global order might still lie ahead.</p><p>In that vein, last week, Under Secretary of State <strong>Jacob Helberg announced the &#8216;<a href="https://www.state.gov/releases/office-of-the-spokesperson/2025/12/pax-silica-initiative">Pax Silica</a>&#8217; initiative.</strong> It seeks to restructure bilateral alliances around access to critical technologies: an offer to import and partake in US software and AI infrastructure, in exchange for access to allied capabilities in semiconductor manufacturing, critical minerals, and advanced production. 
</p><p>All this comes just a week after the publication of the new <a href="https://www.whitehouse.gov/wp-content/uploads/2025/12/2025-National-Security-Strategy.pdf">National Security Strategy</a>, which provides a stick to go with Pax Silica&#8217;s carrot. It formalised the ongoing American withdrawal from value-based, no-questions-asked commitments to its alliances &#8212; and led to reactions ranging from apprehension to outright panic among Western allies. The headline story is one of hemispheric consolidation to the detriment of the rest of the world.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>But alongside broader American <a href="https://writing.antonleicht.me/p/making-ai-export-promotion-work">export and technology policy</a>,<strong> Pax Silica offers a genuine glimpse of how one might rebuild and redefine the Western alliance</strong> &#8211; of how to make it durable in the face of international technological and domestic political disruption. It forces the alliance to grapple with an unassailable American lead in key technologies, and, though forcefully and a bit uncomfortably, invites allies to consider what their place might be in the face of it. That&#8217;s sorely needed.</p><p><strong>Yet for all the appeal of that prospective endgame, I fear for the midgame.</strong> Forging a new order is a difficult task &#8211; the Pax Romana was not announced via press conference, nor instantiated by administrative decree. It wasn&#8217;t even really Roman <em>policy</em>. 
It flowed downstream of a hegemon&#8217;s reputation and international recognition, of which an embattled America has much less than ascendant Rome had. For a Pax Silica to hold, politics and policy alike still have to fall into place.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!fF5d!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!fF5d!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 424w, https://substackcdn.com/image/fetch/$s_!fF5d!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 848w, https://substackcdn.com/image/fetch/$s_!fF5d!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 1272w, https://substackcdn.com/image/fetch/$s_!fF5d!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!fF5d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png" width="1456" height="428" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/89e63493-be49-4297-883c-10bd75fac032_1892x556.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:428,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1098503,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/181873223?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!fF5d!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 424w, https://substackcdn.com/image/fetch/$s_!fF5d!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 848w, https://substackcdn.com/image/fetch/$s_!fF5d!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 1272w, https://substackcdn.com/image/fetch/$s_!fF5d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F89e63493-be49-4297-883c-10bd75fac032_1892x556.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h1><strong>What&#8217;s to Like?</strong></h1><div><hr></div><p><em>You, O Roman, govern the nations with your power- remember this!<br>These will be your arts &#8211; to impose the ways of peace,<br>To show mercy to the conquered and to subdue the proud.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> </em></p><div><hr></div><p><strong>Trump or not, I feel the end of the old alliance paradigm was somewhat overdetermined. </strong>The Western alliance has long been stuck in an untenable division of labor, with the US increasingly dominating the most innovative and strategic sectors, further and further upsetting the balance of economic and military power and leaving weakening allies on the periphery of power. 
The resulting gap was already starting to take its toll both on European and American capabilities. Now, on the eve of a major technological revolution, the shortcomings of this setup are becoming clearer still; and no self-respecting middle power should have been happy to head into it on hopes of American good graces. </p><p>And so however comfortable the old transatlantic arrangement was at the &#8216;end of history&#8217;, I think it might have struggled to get this new era right. In contrast, Pax Silica offers the beginnings of a more robust paradigm. </p><h4><em><strong>Alliance Stability</strong></em></h4><p>The end state I imagine &#8211; what I take to be the optimistic read of Helberg&#8217;s ambition &#8211; is an <strong>arrangement between the US and its allies that fundamentally hinges on mutual technological leverage.</strong> Deals and arrangements are structured around allies providing something the US needs, in exchange for close participation in the American tech stack. This change away from deeper alliances built on normative commitment cuts two ways: it is more contingent and up for renegotiation, providing less unconditional assurance to allies; but it is more reliable in the face of quickly shifting politics, just as long as allies continue to be able to produce something America wants.</p><p>Many observers draw many kinds of lessons from the last few years, but mine is: <strong>in the face of volatile national politics everywhere, we need to stabilise alliances against topical disruptions </strong>from cultural disagreements and the politics of the day. Neither abstract strategic logic nor underpinning values have proven up to the task. Instead, building a Western alliance that can prevail means backing up our ties with hard leverage and mutual dependency &#8211; which is exactly the thrust of the Pax Silica. 
Nowhere is that more visible than in AI: middle powers <a href="https://writing.antonleicht.me/p/a-roadmap-for-ai-middle-powers">need</a> to find some way to <em>reliably</em> access advanced AI, but the danger of simply bandwagoning with the US is that the US cannot be relied upon to provide its most advanced systems for free. I&#8217;m not reassured by others&#8217; hopes that things would be much different under a hypothetical Democratic administration, which might itself have security-based <a href="https://carnegieendowment.org/research/2024/12/defense-against-the-ai-dark-arts-threat-assessment-and-coalition-defense?lang=en">reservations</a> about upsideless exports to allies. <strong>A reduction of alliances to mutual leverage, then, makes the bandwagoning strategy easier</strong>, not harder &#8211; because it insures the arrangement against unilateral reneging, just as long as middle powers keep up their part of the bargain.</p><h4><em>A Path, of Sorts</em></h4><p>You might say that a transition to this setting shouldn&#8217;t be the result of abrasive and unilateral US-led realignment &#8211; and in fact, I&#8217;ve repeatedly argued US allies should get ahead of the curve and shift their strategies toward a clear source of leverage regardless of US policy. But the fact is, <strong>a new order has failed to emerge naturally</strong> from the post-Cold-War set of alliances &#8211; the current division of labor is simply unworkable, the changes involved in departing from it too painful, and many major allies still stalwartly assume that things might be diplomatically difficult but not materially different.</p><p>Next to US intervention, <strong>AI in particular serves as a forcing function to dramatically change this arrangement</strong> &#8211; to improve it, in my view. 
That&#8217;s for two reasons: First, because AI puts the nail in the coffin of some allies&#8217; approach of pursuing a highly diversified economic structure, then executing on it worse than America does. Particularly in the software and service industry, where America has long outcompeted its allies but has left niches for local competitors, the story might soon be very different: advanced AI systems will sweep through these sectors, and likely lead more and more revenue to accrue to US software companies, far away from the taxable economic activity in middle powers.</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;35d791d9-82ac-4769-b483-cb4bc6d4634b&quot;,&quot;caption&quot;:&quot;Isma&#8217;il Pasha was sure he had cracked the code of policy arbitrage. In 1863, he endeavoured to modernize Egypt&#8217;s failing economy and military through massive Europeanisation: schools and railways, boulevards and line infantry. Without a European-scale tax base, the plan quickly collapsed: Egypt was placed under humiliating external supervision of its de&#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;AI, Jobs, and the Rest of the World&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:113003310,&quot;name&quot;:&quot;Anton Leicht&quot;,&quot;bio&quot;:&quot;I write about the political economy of advanced 
AI.&quot;,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!FPyB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F75422da7-aafa-42ab-8fa6-cf4f0df85cf0_3166x3166.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-09-09T12:54:33.325Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!SEXO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/ai-jobs-and-the-rest-of-the-world&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:173158971,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:20,&quot;comment_count&quot;:4,&quot;publication_id&quot;:3834218,&quot;publication_name&quot;:&quot;Threading the Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p>In that same context of middle powers&#8217; economic structures, <strong>a Pax Silica setup also has the invaluable advantage of a favourable political economy.</strong> It does not ask US allies to become independent and competitive by reaching sovereignty through moonshots &#8211; which would be expensive, speculative, and dangerous, and has so far resulted in a broad portfolio of failing half-hearted attempts. Instead, it asks them to do what they&#8217;re already good at, and integrate it more closely into the US alliance. 
That&#8217;s a much easier sell in domestic political economies, because it comes with at least the illusion of retaining economic structures and sources of national pride in the face of an otherwise disruptive trend. Selling Germany on &#8216;we&#8217;re building AI models now&#8217; seems nigh-impossible; selling Germany on &#8216;we&#8217;re building industrial components powered by US AI in exchange for access to frontier capabilities&#8217; seems like a better pitch.</p><h4><em>Allied Scale for a Wary World</em></h4><p>And second, because <strong>the sharpening AI race is also taxing US resources to an unprecedented extent. </strong>As America&#8217;s best and brightest, concentrated government support, and greater and greater parts of its capital market are poured into enduring supremacy in the AI race, other bottlenecks are becoming more and more pronounced. Upstream inputs into AI capability, like semiconductor manufacturing and high-quality data, are still very strong outside the US; and increasing AI deployment will lead to increasing downstream bottlenecks: robotics, advanced manufacturing, and automatable R&amp;D &#8211; the capabilities that actually allow for the translation of AI performance into real-world impact. The latter becomes especially important because China is particularly good at relieving these downstream bottlenecks &#8211; in the logic of &#8216;allied scale&#8217;, analysts have <a href="https://www.nytimes.com/2025/09/07/opinion/us-trump-china-allies.html">remarked</a> that coordinating allied manufacturing capability would be one way to counter the breadth of the Chinese industrial base. 
</p><p>One way to understand this, then, is that &#8216;<strong>Pax Silica&#8217; might be the Republican word for &#8216;allied scale&#8217;.</strong> I think everyone in the West could find something to like about this.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qJzO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!qJzO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!qJzO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png" width="400" height="100" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:400,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!qJzO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>What Needs to Change?</strong></h1><blockquote><p><em>To plunder, to slaughter, to ravage, they call empire by false names; and where they make a wilderness, they call it peace</em>.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a></p></blockquote><div><hr></div><p><strong>So much for 
the promise of the endgame; now to the perils of the midgame.</strong> For all the material benefits of this new order, I&#8217;m not yet sure that the world is buying what America is selling. My sense is there are two distinct challenges here &#8211; one of them comes from the pace of the transition, and the other from the actual terms of the deals.</p><h4><em><strong>Managing the Pace of Transition</strong></em></h4><p>The transition is the hardest part. <strong>No matter how attractive the substantive parts of a Pax Silica might be, they&#8217;re still a departure from what many US allies perceived as a better time </strong>&#8211; one where they could rely on the US not because they were providing something of value, but because they shared in common values. Moving from the latter to the former fundamentally asks more of middle powers, and breaks with a trust and certainty that the domestic decisionmakers had carried for decades. It also throws the domestic politics of these middle powers into disarray &#8211; again, take Western Europe, where local narratives hold that industry models have broken down in part because the US is no longer happy to foot the defense bill. </p><p>When the phase change in foreign policy comes with costs and inconveniences to allies and partners, they&#8217;ll have a hard time evaluating any new paradigm on its merits, and will instead be drawn to comparisons with the old order. Selling the Pax Silica to someone who was never a US treaty ally before could be the easiest thing in the world, while pitching it to long-standing US allies now would still feel like a downgrade and incur resistance accordingly. US foreign policy cannot be naive about this &#8211; the deals need to be better than they would be on objective merits to get buy-in at scale.</p><p>To make matters worse, <strong>the transition is also politically embattled way above the paygrade of AI policy. 
</strong>In the past, I&#8217;ve argued that the UK-US Tech Prosperity Deal is a model example of bilateral cooperation around critical capabilities for middle powers &#8211; but warned that any sectoral partnership was subject to the overarching volatility of US foreign policy. Just this week, it has been <a href="https://www.ft.com/content/afd45e58-5351-4379-8f7e-5788da3d2e20">reported</a> that US delivery on the terms of the deal has been halted &#8211; apparently because the administration is seeking UK concessions on unrelated matters of trade and foreign policy. This is exactly the kind of thing middle power strategists are worried about: if the US mixes international tech policy with its political desiderata, how can we trust the merits of strategic deals? Any visible examples of such a trend hurt the initiative at large. </p><p>I haven&#8217;t heard a good answer yet &#8211; &#8216;you&#8217;ll just have to deal with it&#8217; is certainly not sufficient. Even if it&#8217;s true, it will inevitably make the middle powers stubborn and drive them into further sovereignty ambitions with no upside for the US. 
If the Pax Silica is a strategic priority, it needs to impose a degree of message and negotiation discipline across the whole of American foreign policy efforts, or it will fail.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jfqo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jfqo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 424w, https://substackcdn.com/image/fetch/$s_!jfqo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 848w, https://substackcdn.com/image/fetch/$s_!jfqo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 1272w, https://substackcdn.com/image/fetch/$s_!jfqo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!jfqo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png" width="1456" height="554" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:554,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1413696,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/181873223?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jfqo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 424w, https://substackcdn.com/image/fetch/$s_!jfqo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 848w, https://substackcdn.com/image/fetch/$s_!jfqo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 1272w, https://substackcdn.com/image/fetch/$s_!jfqo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd738cb7-d527-4108-86af-f3eb60639efb_1598x608.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">A graphic shared by Helberg. It features not a network, but a series of bilateral inputs &#8212; a pitch that might make each single partner fear an asymmetry.</figcaption></figure></div><h4><em><strong>Structuring the Content of Deals</strong></em></h4><p>And then there are two problems on the horizon relating to the actual structure of Pax-Silica-style deals with middle powers.</p><p>The first is that <strong>today&#8217;s deals still take place in a precious microcosm.</strong> None of the Pax Silica members &#8211; and arguably not even the US at large &#8211; understands AI and its input technologies as a strategically critical area just yet, and in pure macroeconomic terms, it still plays a minor role. 
In an emergent sector of no systemic relevance, it&#8217;s fairly easy to convince technocratic operatives of the merits of leverage-based bilateral treaties. The stakes aren&#8217;t all that high, and the domestic political attention is basically non-existent &#8211; the undersecretary level is frequently free to move on tactical merit alone. That will drastically change once AI reaches political and economic salience; once constituents and companies start asking questions about sovereignty, regulatory leverage, safety of access, and security of deployment. Ideas about autarky, skepticism of America, and a barrage of political concerns will muddy the waters.</p><p>Put differently: <strong>it&#8217;s already hard to make a Pax Silica deal about 1% of your GDP; now imagine making a deal about 20%</strong>. In a political environment where approval of both AI technology and the US is low and keeps decreasing, it will be very difficult to domestically justify deals once the technologies involved become more and more relevant. Leadership in middle powers and US foreign policy should both take note, and give allies&#8217; electorates as little reason as possible to politically reject a favourable deal in the future.</p><p>The second problem is that <strong>the US is seeking to reshore the same capabilities it seeks out in allies, </strong>thereby giving the impression that the deals themselves are not stable.<strong> </strong>The pitch for a deal is that an ally can rely on it because the US so desperately needs the imported capacity &#8211; and yet, America is aiming to recreate the same capacities at home. 
That reads as a major threat to these allies, and could derail favourable deals on two levels.</p><ul><li><p>First, it <strong>might make allies think that a deal is fundamentally time-limited</strong> &#8211; that they are on borrowed time until the US recreates the capability at home, and that they&#8217;ll lose access to whatever they&#8217;re getting in return once that happens. That makes it much harder for these allies to do what the US wants of them, which is pivot their economies toward strengthening that capacity at home: if you have to assume that demand is set to collapse once the US has finished reshoring, you can&#8217;t take ambitious bets on ramping up supply.</p></li><li><p>And second, it <strong>might make allies think that cooperating closely with the US is particularly risky because America seeks to </strong><em><strong>extract</strong></em><strong> their advantage.</strong> A deepening cooperation that includes tech transfers might make it easier for US firms to replace allied champions in the future: if a partnership with ASML went deep enough to allow for the manufacturing and maybe even development of EUV at scale in America, the Netherlands wouldn&#8217;t be building a durable niche as much as signing away its advantage. Now this is easier to do in some cases than others &#8211; unless you&#8217;re foolish enough to sign away mining rights to raw materials for cheap, raw materials can&#8217;t really be reshored to the US &#8211; but it&#8217;s a latent risk that will be salient to the involved powers.</p></li></ul><p>In a somewhat tragic symmetry, both these reasons mirror past relations between US allies and China &#8211; allies started exporting machinery to China, just for Chinese companies to recreate the exported products, develop competing ones, and collapse demand. The resulting shock has already disrupted European economies in particular at their core. 
They&#8217;ll need a good reason to think it won&#8217;t happen again.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qJzO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!qJzO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!qJzO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png" width="398" height="99.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:398,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!qJzO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!qJzO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11d66cfd-05e6-4d47-b626-acfe522977e9_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Outlook</strong></h1><p><strong>What does this all mean in practice?</strong> There is optimisation potential in so many areas of this ambitious attempt at a new order &#8211; as befits the scale of the challenge. 
The Pax Silica&#8217;s champions in the State Department need to be careful that its technical implementation does not disregard the strategic sensibilities of promising partners. The broader Trump administration will have to shape some of its international politics around this plan, even at some cost to cultural or trade priorities. </p><p>And someone will have to make the actual case to middle powers that all this is worth going for &#8212; by translating the sometimes jarring sound of US alignment into their strategic parlance and helping them chart a course accordingly. That last part is perhaps the most underrated, and most worth working on.</p><p>The Pax Romana did not emerge from declaration, but from decades of calibrated coercion and credible commitment. The Pax Silica, if it is to mean anything, will require the same &#8211; a tall task for the Trump administration. In many ways, <strong>coming up with a new structure for the free world has been the easy part. The hard part is convincing allies that America&#8217;s word is worth building a paradigm around</strong>, at the exact moment when many are losing faith in it. 
The practicality of this order needs to be proven soon, before the politics of AI derail progress on this narrow path.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Perhaps the first articulation of the Pax Romana, from Vergil&#8217;s enormously influential Roman founding myth, the Aeneid.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Tacitus&#8217; Caledonian war chief Calgacus, offering a somewhat less optimistic outside perspective on the Pax Romana.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Anthropic Against Itself]]></title><description><![CDATA[The frontier AI lab can be a neutral authority, a policy advocate, or a political renegade. But it can no longer be all three.]]></description><link>https://writing.antonleicht.me/p/anthropic-against-itself</link><guid isPermaLink="false">https://writing.antonleicht.me/p/anthropic-against-itself</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Fri, 05 Dec 2025 11:16:43 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/b681c48a-0117-48aa-9105-14e75dc0b2a4_1296x1651.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>Anthropic does many things well. 
</strong></em>It&#8217;s a leading frontier AI developer, a premier source of information on the uses and effects of advanced AI, a powerful voice in policy debates, and a political outlier in a Silicon Valley increasingly aligned with the GOP. But as AI policy grows more complicated, deeper conflicts are opening between these roles &#8211; conflicts that ultimately put all of them at risk.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>In the last few weeks, <strong>some fault lines have started to show</strong>. In quick succession, three separate events left some with an impression of agenda-driven behavior, incurring public attacks on Anthropic&#8217;s broader credibility. First, Anthropic provided industry support for SB-53, a California law mandating increased transparency from frontier developers. Second, co-founder Jack Clark spoke to a broader political strategy: create transparency into frontier systems, then leverage the findings for further policy asks. Third, Anthropic released detailed information on a cyber espionage campaign conducted through Claude Code.</p><p>Critics cried foul at each in a manner that I thought slightly missed the point. It seems to me like each of these decisions was made on its merits by well-meaning individuals. 
The suspicions around the transparency strategy, in particular, seemed somewhat exaggerated: if you genuinely believe something to be dangerous, it follows that more information would advance your regulatory agenda &#8211; no fearmongering required.</p><p>But there&#8217;s a real problem here: <strong>it&#8217;s becoming increasingly difficult to square the many roles Anthropic is trying to play.</strong> If you want to be the neutral lab that can authoritatively call warning shots, it hurts to be on record pursuing controversial policy goals &#8211; people will discount what you say. If you want your support of transparency regulation to count as an industry endorsement, you can&#8217;t run too far ahead of the rest of the industry in publicly communicating your political strategy. And if you want to distance yourself from the Trump administration more than your competitors do, your short-term influence on any of these goals will suffer.</p><p>The underlying tension, then, is that Anthropic is very good at identifying what voices the ecosystem needs, but is beginning to overextend in trying to fill all of them. Taking stock of these roles might help find a path forward. 
In my view, that leads to a clear conclusion: <strong>to maintain its unique value as a trustworthy voice on AI effects, Anthropic might have to cut back on some political positioning.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="416" height="104" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:416,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h1>Four Roles</h1><p>So what are these roles Anthropic seeks to play? The first non-negotiable is that <strong>Anthropic intends to remain a frontier AI developer</strong>. 
I know that some old-school safetyists consider this contingent and perhaps even offensive &#8211; there are some arcane documents suggesting that Anthropic should not be involved in <em>pushing</em> the frontier &#8211; but I think it&#8217;s simply a reality of the day. To stay relevant to any conversation, a company has to stay at the frontier &#8211; particularly when it doesn&#8217;t have the market capitalisation of a tech giant just yet. Anthropic&#8217;s credibility and relevance, its ability to stay in the room, depend on continued frontier performance, and so that&#8217;s a given.</p><div><hr></div><p><strong>Neutral Authority. </strong>Anthropic seeks to be a neutral authority on the effects of advanced AI: an organization with exclusive insights into what AI is doing in the real world, communicated diligently. This is something Anthropic does quite well. The cybersecurity report is a recent example, but Anthropic is also far ahead of the field on other issues, most notably on economic data and transparency in safety research.</p><p>To my mind, this role matters greatly. On both the economic and safety fronts, we consistently lack live information on what is actually happening. Smart policy ideas are routinely bottlenecked by uncertainty around what the actual targets are; good politics are hamstrung by empirical disagreement. 
Data is an antidote &#8211; tracking usage statistics is valuable for calibrating labor policy, and even controversial safety research helps continue the technical discussion openly, grounded in an understanding of actual frontier capabilities.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!nJjs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!nJjs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 424w, https://substackcdn.com/image/fetch/$s_!nJjs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 848w, https://substackcdn.com/image/fetch/$s_!nJjs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 1272w, https://substackcdn.com/image/fetch/$s_!nJjs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!nJjs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png" width="1456" height="398" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:398,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!nJjs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 424w, https://substackcdn.com/image/fetch/$s_!nJjs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 848w, https://substackcdn.com/image/fetch/$s_!nJjs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 1272w, https://substackcdn.com/image/fetch/$s_!nJjs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffe7195da-435e-4e29-9674-bf9ab01becbb_1624x444.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Anthropic as a neutral authority.</figcaption></figure></div><p>Most importantly, <strong>a frontier lab as neutral authority is one of the very few actors that can credibly call a warning shot. </strong>This is a favorite pathway to political salience for many safety advocates: a large AI risk manifests early in a smaller harm or near-miss, and policy action follows. But for that to work, you need an authority: someone needs to call out a near-miss and what could have happened; and even in cases of harm, someone needs to trace it to AI, which is quite <a href="https://writing.antonleicht.me/p/do-you-need-a-wake-up-call">difficult</a> to do in cases of cyber or bio attacks. Frontier developers may be the most able and trustworthy actors to call these shots, given their access to usage information and their status as industry players who have some incentive against crying wolf. 
This becomes even more true as the capabilities of frontier models move from public to private, with the highest capability levels perhaps only being deployed internally within the major labs. In these settings, a frontier lab willing to blow the whistle on a critical risk could make all the difference.</p><p>Playing this role alongside frontier development is already tricky: publishing safety research risks revealing internal methods; sharing usage patterns gives competitors insight into vulnerable market segments. But if it stopped here, Anthropic could probably manage the split. The problem is that Anthropic pursues policy and political goals on top of all this.</p><div><hr></div><p><strong>Policy Advocate.</strong> Anthropic advocates for specific policies. It was the only major industry player that didn&#8217;t reject SB-1047 <a href="https://www.anthropic.com/news/the-case-for-targeted-regulation">outright</a> &#8211; and thereby distanced itself from the tech-right coalition that emerged in its opposition. It <a href="https://www.anthropic.com/news/anthropic-is-endorsing-sb-53">supported</a> SB-53 enough to be associated with State Senator Scott Wiener in ways that have repeatedly led to ill-fated social media back-and-forths. 
It works closely with pro-regulation voices, both in coalition engagement and through high-profile hires of ex-Biden officials that didn&#8217;t go unnoticed in Washington.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-ALp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-ALp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 424w, https://substackcdn.com/image/fetch/$s_!-ALp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 848w, https://substackcdn.com/image/fetch/$s_!-ALp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 1272w, https://substackcdn.com/image/fetch/$s_!-ALp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-ALp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png" width="1456" height="334" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0b986463-db02-411c-b05d-07f7515004ce_1612x370.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:334,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!-ALp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 424w, https://substackcdn.com/image/fetch/$s_!-ALp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 848w, https://substackcdn.com/image/fetch/$s_!-ALp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 1272w, https://substackcdn.com/image/fetch/$s_!-ALp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b986463-db02-411c-b05d-07f7515004ce_1612x370.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Anthropic as a policy advocate.</figcaption></figure></div><p>This can be effective, particularly in a technocratic environment, where Anthropic provides industry cover for laws otherwise dismissed as decelerationist. 
The company tends to position to the right of safety organizations but to the left of industry at large, serving as an indicator of how offensive a policy really is: if Anthropic is on board, it can&#8217;t be that toxic, goes the reasoning behind closed doors. And because this positioning aligns with Anthropic&#8217;s stated mission, some version of it will likely continue regardless of tactical considerations.</p><p><strong>But it does throw the warning-shot logic into disarray: your empirical outputs become less credible when linked to a policy agenda. </strong>This is the kernel of truth in recent debates over Jack Clark&#8217;s remarks: no, there&#8217;s no insidious fearmongering strategy. But if you want a specific policy outcome, and the main barrier is that adversaries don&#8217;t share your empirical read, they&#8217;ll view whatever information you release through skeptical eyes. Given Anthropic&#8217;s incentive to release precisely the most concerning pieces, the value of any released information gets discounted accordingly, which then incentivizes Anthropic to put out even more alarming findings to correct for the discounting.</p><p>It doesn&#8217;t much matter whether Anthropic is actually exaggerating or being selective. The perception is hard to shake, and adversaries can wield it easily. Taking Anthropic data into a room full of skeptics and adversaries might become harder and harder. This is also a feature of increasing politicization: for warning shots to break through a polarized environment, they don&#8217;t simply need to be empirically robust. They need to be ironclad, above reproach. The empirical findings of a political player are neither.</p><div><hr></div><p><strong>Political Renegade.</strong> Anthropic has acted as <a href="https://www.theinformation.com/articles/tech-leaders-flatter-trump-anthropic-takes-cooler-approach">something of a renegade</a> in the current political climate. 
While other tech companies have pivoted to align closely with the Trump administration, Anthropic has kept greater distance from an administration that runs on personal connections and demonstrations of allegiance. They haven&#8217;t yet paid the price in loss of support or government contracts, but they also seem far from qualifying for especially favorable treatment.</p><p>And a still-closer alignment with Democrats, or at least with perceived tech-critical voices, seems likely in the lead-up to the midterms. Whether through individuals or, as per some recent <a href="https://danieleth.substack.com/p/public-first-changes-the-ai-super">speculation</a>, even the organisation itself, I wouldn&#8217;t be surprised to see Anthropic contributions to <a href="https://www.nytimes.com/2025/11/25/us/politics/ai-super-pac-anthropic.html">Public First</a> &#8211; the AI safety super-PAC that has positioned itself as the response to the accelerationist &#8216;Leading the Future&#8217;. Once opponents are able to paint Anthropic as bankrolling the decelerationists, the politics seem likely to get uglier still.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!24Bx!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!24Bx!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 424w, https://substackcdn.com/image/fetch/$s_!24Bx!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 848w, 
https://substackcdn.com/image/fetch/$s_!24Bx!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 1272w, https://substackcdn.com/image/fetch/$s_!24Bx!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!24Bx!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png" width="1456" height="323" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:323,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!24Bx!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 424w, https://substackcdn.com/image/fetch/$s_!24Bx!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 848w, 
https://substackcdn.com/image/fetch/$s_!24Bx!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 1272w, https://substackcdn.com/image/fetch/$s_!24Bx!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20ccb1cc-1744-479f-a287-c46aff393369_1612x358.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Anthropic grappling with the dangers of being a political renegade.</figcaption></figure></div><p><strong>Even today, some cracks are showing. </strong>In recent months, political criticism of Anthropic has grown, resulting among other things in heated social media discussions, the absence of an invitation to a high-level White House AI event, and repeated public criticism from White House AI Czar David Sacks. In the aftermath, Anthropic seemed compelled to clarify, and issued a direct leadership <a href="https://www.anthropic.com/news/statement-dario-amodei-american-ai-leadership">statement</a> reaffirming areas of alignment with the Trump administration. In a subsequent, somewhat atypically voiced <a href="https://www.anthropic.com/news/anthropic-invests-50-billion-in-american-ai-infrastructure">press release</a> on datacenter buildouts, they again noted their interest in &#8216;building in America&#8217;. I suspect such moments will continue to mount: as government involvement in AI rises, skepticism over harms and bubble concerns grows, and dependency on government support for continued buildout increases, more adversarial positioning will be harder and harder to sustain without costs to Anthropic&#8217;s other goals. 
Their on-the-ground government affairs work will be sensitive to that fact.</p><p>I think the easiest way to make sense of this fourth role despite its downsides is to read it as a genuine commitment to principle. But I&#8217;d be remiss not to note it could also be long-term strategy: if you expect a backlash against AI and the &#8216;tech right&#8217;, it could help in 2028 to have positioned differently, avoiding retaliation against perceived Trump allies.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a><sup> </sup>But whatever the purpose, this positioning is costly now. It jeopardizes frontier development by creating friction with an administration that controls profitable export prospects, influence over Anthropic&#8217;s potential collaborators, and all kinds of federal support for AI buildouts. It makes the position as neutral authority harder to maintain, since a partisan stigma undermines some of that neutrality. 
And it makes policy advocacy less effective, since skeptics can frame it as partisan overreach rather than principled concern.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="416" height="104" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:416,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Consolidation</h1><p>Is it coherent for Anthropic to <em>want to </em>play all of these roles? Of course &#8212; they&#8217;re intellectually and perhaps even strategically quite consistent: someone who believed in the promise of advanced AI but worried about the risks would probably act about the way Anthropic acts. 
But pursuing all of them at once strikes me as a <em>tactical</em> mistake in the current environment &#8212; especially as AI politics grows more factional.</p><p><strong>One effect of that trend toward politicisation is that it&#8217;s not enough to be right on the merits</strong>; it matters more and more that your allegiance and motives are legible and salient. Context-switching is poison to that prospect: the one thing you can&#8217;t do is play different roles in different settings, or you&#8217;ll incur mistrust for your every future contribution.</p><p>You could come up with many ways to disentangle these tensions, depending on your views of risks, benefits, and timelines. If you thought immediate policy action was most important, you might do everything to get into policymakers&#8217; good graces and cut down on political positioning. If you thought long-term credibility mattered most, you might recuse yourself from ongoing fights and focus on building an unassailable reputation for neutrality. If you thought the current administration was likely to entrench and tilt the field toward competitors, you might swallow some pride and make peace, and so on.</p><div><hr></div><h4><em>What&#8217;s Next?</em></h4><p>In my view, this all comes down to one very concrete question: can Anthropic maintain its role as a credible source of information while being a live player in policy and politics? My tentative view is that it cannot, especially not under the current administration. The central task is then figuring out how to deal with that tension. You could choose to give up on the prospect of being a <em>neutral</em> authority &#8211; you can still hold yourself to high epistemic standards, still put out valuable research that nudges more obviously neutral players to verify, collaborate with trusted sources, and so on, and maintain the policy agenda while you do. 
I suspect that this might be attractive to Anthropic, who frequently reiterate that they expect transformative effects within just a few years, suggesting that policy crunch time might be right now. But I think I disagree.</p><p>Despite all difficulties, <strong>pivoting back toward a more neutral status of authority could be valuable.</strong> That would mostly consist of doing a bit <em>less</em> of what falls into the third and fourth roles; and perhaps being extra careful not to run too far from the rest of the industry on political funding, making sure that funding vehicles don&#8217;t grow too heavy on a single lab, and so on. Greg Brockman has set a precedent for lab leadership investing in super-PACs with some attached level of plausible deniability &#8211; perhaps at least that is worth emulating.</p><p>I don&#8217;t think we lack shrewd policy actors or politically principled voices as much as we lack a shared basis for all our discussions &#8211; and for better or for worse, Anthropic seems one of the few voices willing and uniquely able to provide this basis. Considering this ability the unique advantage of a frontier lab in pursuit of some higher goal, I think it&#8217;s worth safeguarding and doubling down on. As discussed above, there&#8217;s really no one else who can do this, and it fills one of the crucial gaps in addressing both some of the most concerning risks from internal deployment, and some of the most difficult-to-parse policy challenges in AI labor policy.</p><p>This is doubly true because <strong>I&#8217;m comparatively less optimistic about Anthropic&#8217;s future as a policy voice either way. </strong>I suspect the fault lines between the technology industry and its opponents will deepen as AI policy battles continue, and I&#8217;m not sure there&#8217;ll be a good spot left for Anthropic: who exactly is the audience for &#8216;slightly more thoughtful tech company&#8217; in a polarised environment? 
Perhaps five Senate Democrats, but not the populists on either side who might well drive the discussion. My &#8211; perhaps too cynical &#8211; view of the future is: when AI legislation passes, it will pass more and more as victory in a fight, not as sound compromise; and it will therefore not hinge on the endorsement of an actor between the fronts as much as it might have in the past. Somewhat more pessimistically, I suspect this is already true today: Anthropic&#8217;s attempt to advance a compromise on SB-1047 ultimately failed, and at least my sense is that the success of SB-53 was somewhat politically overdetermined either way. All in all, I think you can&#8217;t do everything for long, and if you have to choose, you should choose the thing only you can do well &#8211; and for Anthropic, that&#8217;s providing authoritative information on risks.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="416" height="104" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:416,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Outlook</h1><p><strong>Of course, this will still be an uphill battle, and perceptions take time to change.</strong> If you&#8217;re optimising for the next few years, there&#8217;s very little upside to any pivot &#8211; no one believes you anyway, so you might as well openly advocate for policy. But if you believe, as I do, that we&#8217;re in this discussion for the long haul, you might as well start now. People change, memories fade, new policy fights become the reference point, and you can slightly redefine your position month by month. And ultimately, we might arrive at a better division of labour &#8211; where Anthropic does what it does best, and the rest of the policy environment fills the gaps left by this transition.</p><p>Anthropic has built something unusual: a frontier AI company that takes its stated values seriously enough to act on them in costly ways. But those ways are coming into increasing conflict as the political environment tightens and the stakes rise. <strong>Something will have to give. 
It&#8217;s worth making the choice on your own terms.</strong></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Yes, other labs have done so as well &#8211; but Anthropic is under greater scrutiny, and so such hires are particularly noticeable with them.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Though that strategy seems uniquely ill-suited to Anthropic&#8217;s motivations, given their professed short timelines to transformative AI. </p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[The Night Before Preemption]]></title><description><![CDATA[Three dynamics to watch as the fighting begins]]></description><link>https://writing.antonleicht.me/p/the-night-before-preemption</link><guid isPermaLink="false">https://writing.antonleicht.me/p/the-night-before-preemption</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 26 Nov 2025 12:44:44 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/84ef72b9-896d-410e-9b82-fc2f6dcbb757_800x577.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>If you want to understand AI policy today, you have to look back at the SB-1047 debate</strong></em>, where a growing safety movement inducted lawmakers and allies into its coalition, an unlikely heterogeneous opposition assembled and stuck together, and many of today&#8217;s most influential voices rose to prominence.</p><p><strong>If you want to know what </strong><em><strong>tomorrow&#8217;s </strong></em><strong>AI policy will look like, you should watch the next few weeks closely.</strong> After a hasty scramble in July and subsequent weeks of debate, the 
fight around federal preemption is now taking off in earnest. There&#8217;s a draft executive order, at least three congressional initiatives, and a rapidly closing window to act. The next few weeks could get messy: they&#8217;ll feature closed-door negotiations about the NDAA, maximalist public rhetoric, $10 million on one side and a coalition from <a href="https://abcnews.go.com/US/inside-magas-growing-fight-stop-trumps-ai-revolution/story?id=127824351">Steve Bannon</a> to <a href="https://www.ms.now/top-stories/latest/joseph-gordon-levitt-ai-superintelligence-ban-congress-rcna242628">Joseph Gordon-Levitt</a> on the other. Coalitions and grudges will emerge, promises broken and kept will change the terrain of AI policy.</p><p>A few weeks ago, I wrote two long pieces about my thoughts on the substance of this issue; arguing <a href="https://writing.antonleicht.me/p/a-preemption-deal-worth-making">first</a> for the policy merits of a deal exchanging narrow frontier AI regulation for broader preemption of state legislation; and <a href="https://writing.antonleicht.me/p/the-devil-you-know">second</a>, arguing for rapprochement between the safetyist and accelerationist factions. Now that the fight is on, I won&#8217;t relitigate the substance &#8211; that&#8217;s now for the politicos.</p><p>Instead, I want to leave you with three observations from the before-times: on the trajectory of political spending, on who can and can&#8217;t afford to play for time, and on the curious silence of the AI developers. 
We&#8217;ll be revisiting all of them soon, no matter how this ends.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h1><em><strong>Here Come the Super-PACs</strong></em></h1><p>If there&#8217;s one lasting effect of this debate beyond legislation, it&#8217;ll have to do with the spending of not one, but <strong>two massive sources of political money</strong>. 
One, already infamous and frequently featured here and elsewhere, is &#8216;Leading the Future&#8217;, an accelerationist vehicle worth $100 million. Yesterday, the existence of its counterpart was made <a href="https://www.nytimes.com/2025/11/25/us/politics/ai-super-pac-anthropic.html">public</a>: a pro-AI-safety super-PAC with close ties to leading organisations in safety policy and enough money to at least offset a lot of LTF spending. What happens now informs where this money will go in the future, and early signs point in a bad direction.</p><p>One of my main reasons to advocate for d&#233;tente between safetyists and accelerationists last month was <strong>the prospect of a bitter fight between these two PACs</strong>. Their combined committed volume of $150 million could seriously change AI policy for the better &#8211; inform policymakers about the merits <em>and</em> the risks of the technology, and form an uneasily unified front against the uglier politics that threaten to derail more sensible approaches. If they instead fight each other, much of that money might go to waste, as their strategies will coalesce around the same key races and candidates, supercharge both sides of a contentious AI policy debate, and ultimately serve to burn a lot of money. This is a prudent reaction to the threat of being unilaterally outspent &#8211; but ultimately regrettable regardless.</p><p><strong>LTF&#8217;s opening salvo has, frankly, made it much harder to avoid this deepened conflict.</strong> In a somewhat puzzling move, they have picked out Alex Bores as <a href="https://subscriber.politicopro.com/article/2025/11/pro-ai-super-pac-targets-ny-democrat-alex-bores-00652148">their first target </a>&#8211; a congressional candidate for NY-12 who, as an Assemblyman, advanced the RAISE Act in New York. 
In doing so, they have made Bores the centerpiece of a national media story, given him as much free national coverage as he wants, as well as the opportunity to position himself as a defender of citizens against big tech. They have also quite plausibly guaranteed the signing of the RAISE Act into law by elevating it into a matter of anti-big-tech resistance. That decision has also further disillusioned accelerationists&#8217; opponents about LTF&#8217;s willingness to engage in good faith: Bores and the RAISE Act are generally perceived as reasonable and moderate, and LTF&#8217;s decision to open hunting season on them anyway did not exactly signal willingness to compromise.</p><p><strong>If LTF postures similarly in this fight, it&#8217;s hard to see a path away from the brink before the midterms. </strong>The jury is still out: LTF <em>will</em> be <a href="https://www.cnbc.com/2025/11/24/ai-pac-trump-congress-midterms.html">spending</a> $10 million in a three-week push for preemption, and one of its strategies so far is calling a major AI safety organisation &#8216;the Germans&#8217; based on X&#8217;s inaccurate location feature. But the actual talking points are not unreasonable: they recognise the need for a federal framework (though they have not proposed any language on it), and have somewhat departed from the squarely anti-regulatory position. It&#8217;s not quite &#8216;balancing the risks and benefits&#8217;, but given where they&#8217;re coming from, it&#8217;s a shift. Such a shift would be prudent: unless you want your shiny new super-PAC to be counter-spent by the safety PAC in every future race (and in the process raising the salience of that race to your disadvantage), there&#8217;s something to be said for not going all out. 
I guess we will see, but I think the political realities have changed enough since July that a slightly more conciliatory position is possible. While there&#8217;s no avoiding the fight, there are ways to get it right: if the issue moves out of the public eye into congress fairly quickly, and a somewhat tenable solution can be found in somewhat good faith, there are still many paths back from the brink &#8212; especially now that the safety PAC creates more obvious incentive for accelerationists to deescalate. But <strong>if there&#8217;s an all-out fight now, the current battle lines will entrench</strong>, and it will be difficult to pivot the PACs away from the safetyist-accelerationist conflict. I still think that would be a shame.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><em><strong>Who&#8217;s Running Out What Clock?</strong></em></h1><p>This entire debate is not happening on neutral ground, but amidst an accelerating trend of AI entering the political mainstream. That makes for a <strong>time asymmetry, where accelerationists face a closing window, while safety advocates think time is on their side.</strong></p><p>The politically salient intersections are increasingly occupied by major political voices &#8211; economic populists on either side speaking to the jobs implications, anti-tech politicians identifying AI as a projection surface for their broader concerns, and so on. Amid that shift, AI technology continues to be unpopular, and its regulation remains popular. Even if you think that polling is the result of coordinated campaigns, it spells a broader shift: heading into an election year, salience and public opinion will make policymakers more likely to engage the issue on its political rather than its policy merits.</p><p>Broadly, that makes for <strong>bad news for the pro-preemption camp.</strong> They might, sometimes convincingly, argue that they&#8217;re not actually trying to preempt laws that would address citizens&#8217; concerns and relevant harms. But the trade-offs are hard to communicate, and the case itself is not particularly intuitive. Accordingly, accelerationist messaging has changed since summer, and now actively invokes the prospect of federal frameworks and trades on US prospects to win the race with China, instead of primarily attacking the prospect of state-based legislation. 
The smart accelerationists have read the room &#8211; and I suspect they&#8217;re also growing frustrated at their less subtle allies.</p><p>More elegant messaging on its own won&#8217;t do. Agreeing to empty preemption is also quickly becoming an electoral liability: media-salient AI harms will happen, regulation will remain popular, and no policymaker wants to be on record as having voted against such legislation.</p><p><strong>The political attack ads write themselves</strong>: stories of harms to children, exploitative businesses, jobs at risk &#8211; interspersed with statements and votes against state laws that purport to address them. This is doubly true now that the pro-safety PAC exists to actually pay for ads like these, which is why leaking the PAC&#8217;s existence strikes me as very effective &#8211; particularly since a reported key figure behind the PAC has described a vote for preemption as an <a href="https://x.com/bradrcarson/status/1991900566247666129">electoral liability</a> akin to a vote for the Iraq War.</p><div><hr></div><h4><em><strong>Using A Closing Window</strong></em></h4><p><strong>But the accelerationists know all this</strong>, and are responding in strategic kind. First by deploying a substantial share of the committed PAC funding <em>right now</em>, for a campaign to get this through, while the safety PAC is still assembling. My suspicion is that the announcement of the $10 million LTF push is what prompted the safety side to leak its PAC before it was fully operational. But as it stands, it&#8217;s still fighting money with the quickly announced (near-term) prospect of money &#8211; that&#8217;s a disadvantage. Second, I suspect this push will see much more White House involvement than previous attempts. Whereas in July, White House resources were tied up in many different places, the admin seems better prepared for a fight this time around. 
In a still-obedient Republican caucus, that can make a world of difference &#8211; at least as long as proposals remain reasonable enough not to offend the President&#8217;s political instincts.</p><p>This is why I think you should understand the<strong> </strong><a href="https://www.transformernews.ai/p/exclusive-heres-the-draft-trump-executive">leaked</a><strong> draft Executive Order as a forcing function.</strong> This EO, a document that compels agencies to prevent state AI legislation through a number of legally contentious mechanisms, is neither plan A, nor is it primarily a desperate attempt to make something stick. By all accounts, the President&#8217;s political preference, too, is to preempt by federal framework instead of empty litigation. To my mind, the EO instead attempts to move congress into action. It flows from the recognition that congress is not sufficiently motivated to move on preemption if the alternative is the cozy status quo of state-level legislation &#8212; that in an environment where preemption opponents were ready to run out the clock, it would have been hard to get anything done in congress. Why else leak the EO while congressional action was still coalescing? </p><p>I think <strong>preemption opponents&#8217; reactions to the EO have missed that point.</strong> Among other things, that has given rise to misguided expectation management: of course, the outcome will not be broad and empty preemption. Everyone knows this, to the point that even accelerationist operatives are conceding as much in public. The EO&#8217;s success is measured, rather, by congressional appetite to tackle the issue instead of waiting it out &#8211; and along these lines at least, it seems to have gone fairly well. Following that logic, I wouldn&#8217;t be surprised to see another (draft) EO early next week, pursuing a similar goal, but slightly weakened so as not to inconvenience Republican lawmakers who might disagree with its specific provisions. 
</p><p>The White House, too, has to thread a needle here: <strong>using the EO as a stick only works if there is legislative language to serve as a carrot</strong>; the issue in last week&#8217;s news cycle was that there was no <em>public</em> language toward which to push congressional Republicans. I suspect next week will be different.</p><div><hr></div><p>To end,<strong> it also bears repeating that a worsening accelerationist position does not necessarily imply an improving safetyist position. </strong>As I&#8217;ve argued before, the relative sway of safetyists within their coalition diminishes as political salience rises. Running out the clock works as a mechanism to <em>stop</em> accelerationists from getting what they want, but it works much less well as a mechanism to actually get what the safetyists want. There is no clear model for passing safety-relevant policy on the back of broader anti-AI political sentiment: the legislative asks that matter for frontier safety are narrow, but the range of politically favourable anti-AI laws is much broader. The next AI policy fight in congress won&#8217;t be yet another version of this same fight. It&#8217;ll happen along different battle lines, and they might not be favourable to any major player today. 
That doesn&#8217;t mean that the safety side should take a loss without a fight, but it does mean <strong>it might be worth taking a pyrrhic victory over adjournment if it&#8217;s on the table.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><em><strong>Where Are The Labs?</strong></em></h1><p><strong>The major AI developers have not been vocal on this issue</strong>: few statements from leadership, few public positions, neither official remarks nor offhand comments on podcast appearances. 
To some extent, that&#8217;s understandable: it&#8217;s a politically volatile moment, and before the chips are down, aligning with one position can be politically risky. OpenAI in particular is currently in no position to exert its legislative influence: after the backstop debacle, it&#8217;s under general suspicion of leveraging its size and importance, and entering a policy conversation means painting a target on any coalition&#8217;s back. But still, the developers&#8217; absence from this debate runs a substantial risk in three directions at once.</p><p>First and most obviously, <strong>OpenAI in particular is susceptible to being painted as part of the preemption push</strong>, because its President Greg Brockman is personally involved in LTF. Media reports frequently portray LTF&#8217;s actions as representative of industry as a whole, leaning both on the Brockman link and a scarcity of public statements to the contrary. I wouldn&#8217;t expect enough nuance to distinguish between OpenAI and other developers in this case, and so I think this association is likewise applicable to GDM, xAI and Meta at least. This is a very risky game for the developers to play: their political opponents will not hesitate to attack them by drawing the line from Brockman to unpopular preemption. Back-channel dealing and closed-door conversations will only do so much to assuage the skeptics&#8217; worries &#8211; as long as they have not stuck out their necks for any particular position, the post-game analysis will be that Brockman&#8217;s PAC money speaks louder than Chris Lehane&#8217;s words. Beyond the obvious political ramifications, this also matters internally: many employees at the labs believe in the original, mission-led approaches and are dissatisfied working for an employer that seems to engage in the standard anti-regulatory playbook.</p><p><strong>Second, the ambiguity cuts the other way, too</strong>. 
Opponents from the accelerationist side have every reason to paint the major developers as secretly pro-regulation, given the shadow cast by Anthropic&#8217;s involvement in the safety super-PAC &#8211; in fact, they&#8217;ve frequently done so. In public, the other labs may insist on their distance from Anthropic&#8217;s policy positions, but in the absence of clear statements to the contrary, skeptics will still make assumptions. That&#8217;s especially true if the safety PAC also receives donations from within OpenAI, which strikes me as highly plausible. Silence in that environment is a canvas for projection, and labs risk ending up in the worst of both worlds: blamed by safety advocates for bankrolling LTF through Brockman, and blamed by accelerationists for providing cover to regulation advocates at industry&#8217;s expense.</p><p>And third, AI developers may also eventually be at risk of losing their position as something more than just another industry lobby. At no small investment of time, talent, and political capital, all major labs have cultivated a role as trusted actors and authoritative voices on policy, whose input isn&#8217;t just sought as stakeholder testimony. I&#8217;m convinced this dynamic is a boon to our policy ecosystem. But that standing is at risk amid increasing politicisation, as congressional offices trust fewer and fewer outside voices and put a growing premium on showing up to the fights.</p><p><strong>All in all, I don&#8217;t think the current ambiguity is doing the developers any favours.</strong> I know that many people leading these companies have strong opinions on these matters &#8211; opinions that often diverge from the standard industry-versus-regulators frame. They might come to regret missing out on the opportunity to voice them on the record, and might be surprised how quickly they&#8217;ll get cast in political roles that have little to do with their own convictions. 
But even if this is the correct risk-averse strategy for labs, I&#8217;m not sure the rest of the debate should let them run it. These are major players with major stakes in the decision and deep insights into its potential ramifications, and we should be interested in their positions. Commentators should press them to take public positions, and policymakers shouldn&#8217;t be content to let them maneuver behind the scenes. If ill preparation keeps them silent on this, the labs&#8217; standing as major and somewhat trusted voices on AI policy is at risk from all three directions.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="452" height="113" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:452,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><div><hr></div><p>Soon, all these questions will take a back seat. 
There&#8217;ll be fights over line items, maximalist rhetoric, and anyone who called for reasonable compromise will look a little stupid as things escalate. But above that noise, those who find themselves before a closing window to act should still keep the trend lines in mind: despite all temptation to treat every battle like the last, this is far from the final AI policy debate.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[How To Grow Too Big To Fail]]></title><description><![CDATA[When AI developers overplay their strategic indispensability, they risk undermining pro-AI policy writ large]]></description><link>https://writing.antonleicht.me/p/how-to-grow-too-big-to-fail</link><guid isPermaLink="false">https://writing.antonleicht.me/p/how-to-grow-too-big-to-fail</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 19 Nov 2025 14:10:21 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/c2e3c34d-83a2-4915-8c02-ebccb67ded7e_2600x1836.avif" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>In the 1770s, the British East India Company went from bad to worse &#8211; growing obligations in London and mismanagement in Bengal marked the end of its profitable heyday. For the British Empire, its collapse would not only have meant financial ruin, but also a marked loss in British Imperial capacity for trade and administration on the new frontier. Britain recognised as much, and quelled concerns with massive subsidies and protective acts. 
Failure of its champion was not an option for the mercantilist government.</em></p><div><hr></div><p><em><strong>OpenAI, in its own fashion, aims just as high</strong></em>. In an attempt to capture the value of the AI revolution wherever it might accrue, its business is rapidly expanding. Searching for a durable moat, OpenAI is pushing into country-level infrastructure deals, chip design, and partnerships all throughout the American economy &#8211; fueling a months-long AI-driven stock market rally. At a valuation of $500 billion, observers were starting to wonder whether OpenAI was on the road to becoming &#8216;too big to fail&#8217;. Bailout or not, the failure of the leading AI developers was beginning to feel unthinkable.</p><p>Then, a few weeks ago, OpenAI CFO <strong>Sarah Friar said the quiet part out loud: </strong>in an <a href="https://www.wsj.com/video/openai-cfo-would-support-federal-backstop-for-chip-investments/4F6C864C-7332-448B-A9B4-66C321E60FE7">interview</a> with the Wall Street Journal, she floated the idea of a &#8216;federal backstop&#8217; &#8211; hinting that the government might step in should OpenAI&#8217;s financial momentum falter. The idea was <a href="https://x.com/sama/status/1986514377470845007">reined back in</a> by CEO Sam Altman, but not before White House AI Czar David Sacks <a href="https://x.com/DavidSacks/status/1986476840207122440">clarified</a> a bailout wasn&#8217;t on the table. That particular chapter was opened and closed in a few days.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>But the story does not end here: AI developers continue to grow, deepening their entrenchment in the broader economy. 
It&#8217;s still in their strategic interest to trade on the prospect of being &#8216;too big to fail&#8217;. The resulting political strategies sit right on the edge: sometimes they&#8217;re in pursuit of a genuinely important policy to boost technological progress amidst intense geopolitical competition, sometimes they&#8217;re in pursuit of drawing regulatory moats in an emerging market. If developers veer into the latter, they risk spoiling the politics of the former. </p><p>So there&#8217;s a triad of questions worth addressing in the aftermath of recent debacles: <strong>what is OpenAI&#8217;s path to becoming too big to fail? Will it work? And what should we make of it?</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h1>Why Become Too Big To Fail?</h1><p>To understand what OpenAI does, it&#8217;s useful to start with what it needs. Why would they want to become too big to fail? I don&#8217;t think it&#8217;s primarily to set up an actual bailout. It&#8217;s always nice to have a fallback, of course. OpenAI makes a lot of risky investments, and it would be great to get bailed out if they fail. But I don&#8217;t think that&#8217;s the main point &#8211; I&#8217;m not sure you ever recapture the spark or regain public and investor trust, bailout or not. </p><p>But short of a bailout, <strong>OpenAI greatly benefits from the impression of being too big to fail. </strong>The first reason is that it makes for a great policy argument. As I&#8217;ve written in some more detail before, AI developers require favourable treatment from the government to stay on track. As written, America&#8217;s frameworks are not conducive to a technological revolution: its legal institutions would hamstring data use and bury developers in litigation, its energy infrastructure could not sustain the necessary buildouts, its permitting systems would not get the chips online in time. Both the Biden and the Trump administration have worked to reduce these barriers in their own ways. 
Without those interventions &#8211; and were they to cease &#8211; AI progress would slow or exfiltrate to other, often rival, countries.</p><p><strong>Keeping this support going gets harder by the day</strong>, as anti-AI voices backed by <a href="https://news.gallup.com/poll/694685/americans-prioritize-safety-data-security.aspx">abysmal</a> <a href="https://www.gallup.com/analytics/695033/american-ai-attitudes.aspx">polling</a> are starting to question the industry and the government&#8217;s support for it. President Trump has frequently been quick to cut loose partners he deems to have grown burdensome. The &#8216;too big to fail&#8217; story helps to ward off that prospect. &#8216;Do you really want to preside over a huge stock market crash&#8217;, goes the story; &#8216;do you really want all these companies we have deals with to go under&#8217;, it continues; &#8216;do you really want to be stuck with the bill if we default?&#8217;, it ends. OpenAI has a lot to gain from growing big enough to raise the cost of ending government support for AI. In leveraging this, OpenAI sometimes flirts with a line: sometimes, it speaks for the entire industry, calling for copyright extensions or datacenter permitting; but sometimes, it appears to ask for special treatment amidst its competitors, such as in the ill-fated backstop saga. The former is necessary, the latter concerning. 
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CSDJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CSDJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 424w, https://substackcdn.com/image/fetch/$s_!CSDJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 848w, https://substackcdn.com/image/fetch/$s_!CSDJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 1272w, https://substackcdn.com/image/fetch/$s_!CSDJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CSDJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png" width="1456" height="360" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:360,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CSDJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 424w, https://substackcdn.com/image/fetch/$s_!CSDJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 848w, https://substackcdn.com/image/fetch/$s_!CSDJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 1272w, https://substackcdn.com/image/fetch/$s_!CSDJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad60d9a0-0416-42aa-b2e9-7fa8b2f5e209_1578x390.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>A second advantage of the &#8216;too big to fail&#8217; impression is that it <strong>surely helps with securing all these deals</strong> and funding arrangements. Datacenter business is notoriously risky, and country-level partners in particular have been burned a lot by assuming hyperscalers would follow through, only for the risk profile not to work out. 
Big funding sources, too, like assurances. If you were weighing a big investment of some kind into OpenAI, but felt dismayed at the bubble talk &#8211; would you not be reassured to open the Wall Street Journal and read that OpenAI was guaranteed a bailout if things went sideways?</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>The Making of a Narrative</h1><p>The impression of being &#8216;too big to fail&#8217;, partly a natural outgrowth of political economy and partly a crafted posture, rests on three pillars.</p><p>The first is a <strong>deep connection of the AI boom to the enormously favourable stock market</strong> of recent months. 
The rally is driven by tech stocks running hot on the prospect of a new industrial revolution and the reality of unprecedented infrastructure buildouts. This President in particular takes political pride in a favourable trajectory &#8211; and for good reason: while stock market performance is not a perfect predictor of political sentiment, a stock market <em>crash</em> is political poison for any incumbent. If OpenAI were to falter, faith in the overall AI rally could soon follow, and policymakers will do their best to avoid that.</p><p>The second is <strong>OpenAI&#8217;s deep, deal-based entrenchment with powerful actors</strong>. The most-noted part of this landscape is the many private-sector deals involving creative financing schemes to enable datacenter projects, buildouts all along the semiconductor supply chain, and preferential adoption deals with American legacy corporations. But a lot of foreign governments are likewise putting their faith in OpenAI, banking on the OpenAI for Countries initiative to deliver them everything from software to large-scale infrastructure. Either way, deals create stakeholders invested in OpenAI&#8217;s survival. If need be, they might make that case to the government as well &#8211; and suddenly, policymakers are not only looking at the prospect of a tech company going under, but at a large section of the American economy heading into a downturn. 
Deals can create linked vulnerabilities, and thereby further advocacy on OpenAI&#8217;s behalf.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zlII!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zlII!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 424w, https://substackcdn.com/image/fetch/$s_!zlII!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 848w, https://substackcdn.com/image/fetch/$s_!zlII!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 1272w, https://substackcdn.com/image/fetch/$s_!zlII!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zlII!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png" width="1456" height="884" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:884,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zlII!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 424w, https://substackcdn.com/image/fetch/$s_!zlII!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 848w, https://substackcdn.com/image/fetch/$s_!zlII!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 1272w, https://substackcdn.com/image/fetch/$s_!zlII!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60ee269b-e661-4a77-9707-0b08600aeb7f_1578x958.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The third pillar is the <strong>increasing value of AI as a fundamental utility</strong>, both as an economic input and a strategic resource. This is where some nuance is called for &#8211; because the fundamental argument is right: it&#8217;s in the unmistakable interest of America and the West that US firms remain at the frontier of AI development, and that there exists a funding and regulatory environment that allows us to see this technological leap through, even if it comes with some financial and potential risk. Even financial derisking &#8211; &#8216;backstops&#8217; &#8211; has an important place in that.</p><p><strong>I don&#8217;t think it&#8217;s nefarious lobbying for AI developers to suggest the same</strong>, though of course the incentives align nicely. The trickier question is whether needing AI means needing OpenAI. Not directly, to be sure, as David Sacks has pointed out. 
But indirectly, it is difficult to see how OpenAI could fail and the rest of the industry still survive, both as a consequence of the collapse of the deal landscape and of general economic sentiment. OpenAI&#8217;s contribution to this third pillar, if anything, has been its ability to become near-synonymous with the AI industry as a whole. That makes disambiguation that much harder: even if you stick to Sacks&#8217; tenet of &#8216;we don&#8217;t care about OpenAI, just about AI&#8217;, it&#8217;s hard to see how you could make that practical distinction.</p><p>Taken together, these three pillars provide a vibe and an argument to go with it: skeptics will be assuaged by the sense that &#8216;surely, the USG won&#8217;t let OpenAI fail&#8217;, and OpenAI can continue to wield the implicit threat of &#8216;better make sure to keep us afloat, or else&#8230;&#8217;.</p><h4><em>The Quiet Part Out Loud</em></h4><p><strong>The Friar incident is a major setback to that trajectory. </strong>That&#8217;s why I struggle to believe it was some sort of clever trial balloon to test how far one could push this. OpenAI was trading on the <em>implicit</em> notion that it was too big to fail. But in the wake of Friar&#8217;s comment, even unrelated OpenAI proposals &#8211; such as a call for derisking chip manufacturing or expanding AMIC credits to the AI supply chain &#8211; were quickly <a href="https://x.com/GaryMarcus/status/1986798628552515647">miscast</a> as bailout plays by uncharitable critics, tainting sound and helpful suggestions. Making the notion this explicit also invited clarifications &#8211; in this case, latent <a href="https://x.com/RonDeSantis/status/1984975933862707661">opposition</a> from within the GOP led the White House to clarify a bailout wasn&#8217;t on the table, and Sam Altman to walk back the very idea.</p><p>No harm done? 
Not quite, because a <strong>bailout would now be much more difficult to achieve </strong>&#8211; and the observers who were supposed to be assuaged by the prospect have taken note of that shift. The bailout has grown more unlikely for multiple reasons: Now that a senior OpenAI executive has publicly mentioned a bailout as an option, it&#8217;ll be harder for OpenAI to insist that they&#8217;ve always argued in good faith and never <em>planned</em> for a bailout, should it become necessary. Now that critics are alarmed, it&#8217;ll be harder to subtly further the impression of &#8216;too big to fail&#8217;. And now that the administration has committed against a bailout, <strong>much helpful ambiguity has been lost.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CZnr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CZnr!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 424w, https://substackcdn.com/image/fetch/$s_!CZnr!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 848w, https://substackcdn.com/image/fetch/$s_!CZnr!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 1272w, 
https://substackcdn.com/image/fetch/$s_!CZnr!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CZnr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png" width="1456" height="213" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:213,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CZnr!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 424w, https://substackcdn.com/image/fetch/$s_!CZnr!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 848w, https://substackcdn.com/image/fetch/$s_!CZnr!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 1272w, 
https://substackcdn.com/image/fetch/$s_!CZnr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e8bd870-f2f5-47d7-9bd4-7edd7a3f2049_1630x238.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Why It Might Not Work</h1><p>Let&#8217;s nevertheless assume people mostly forget about the Friar mishap and things cool down again. Is OpenAI&#8217;s strategy working out? Is the growing impression that it might be &#8216;too big to fail&#8217; justified? I think there are three factors that still greatly complicate this story.</p><p>The first reason is: <strong>large firms aren&#8217;t given bailouts because of their market capitalisation, but because of their political relevance</strong>, either in terms of jobs or of strategic value. In the 2008-2009 case of the automakers, the story was jobs: Michigan is a politically important state, the automakers employed many workers, and the bailout was very good politics. Comparatively few people work for OpenAI, and most of them in electorally peripheral California. The Lockheed bailout in 1971 is of the strategic value kind: Lockheed was critical to military-industrial capacity, and couldn&#8217;t be allowed to fail. You might make the same argument for OpenAI or the AI industry at large today. But if a bubble burst, the sentiment on that would likely shift, too. Believing in the strategic upshot of AI requires believing in claims of future progress in capabilities and adoption; but a bubble burst that would require a bailout could be read as evidence to the contrary. &#8216;If AI was important enough to warrant bailing out, why did it not even meet its revenue projections?&#8217;, the story would go.</p><p>The second reason is: <strong>many assets tied up in the &#8216;AI bubble&#8217; are relatively easy to reabsorb. </strong>If things went sideways, the value currently tied up in AI would not evaporate. 
Much of the capex is prospective to begin with: it&#8217;s in forward commitments for datacenters that haven&#8217;t been built, purchasing contracts for GPUs that haven&#8217;t shipped, and talent acquisition that hasn&#8217;t been paid out. Even the assets that have already taken physical shape &#8211; the current generation of datacenters &#8211; are fungible and absorbable. I&#8217;m sure we&#8217;d find a second-best use case for all the talent and especially all the compute even without OpenAI.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> Granted, once the valuation of all these assets had been cooled down a bit by a market correction, the reabsorption into the overall economy wouldn&#8217;t be <em>perfect </em>by a long shot &#8211; a lot of the valuation of the infrastructural and talent assets in particular would be lost. </p><p>But a bubble bursting would primarily be bad news for the committed institutional investors; it does not spell a widespread loss of real assets in the way that past sectoral failures would have. If the automakers had collapsed, the still-depreciating manufacturing plants&#8217; value would actually have plummeted, and they could only have been repurposed at great cost. 
An AI bailout isn&#8217;t as necessary to ward off a similar loss in non-fungible assets.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!SEMW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!SEMW!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 424w, https://substackcdn.com/image/fetch/$s_!SEMW!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 848w, https://substackcdn.com/image/fetch/$s_!SEMW!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 1272w, https://substackcdn.com/image/fetch/$s_!SEMW!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!SEMW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png" width="1456" height="812" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:812,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!SEMW!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 424w, https://substackcdn.com/image/fetch/$s_!SEMW!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 848w, https://substackcdn.com/image/fetch/$s_!SEMW!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 1272w, https://substackcdn.com/image/fetch/$s_!SEMW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79a7fec6-b1e3-4aaa-b4ee-ba5e456fd8c3_1466x818.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>And the third reason is: <strong>AI and its developers are still very unpopular. </strong>A bailout, of course, would likewise be unpopular. So unpopular, in fact, that it could become a liability to a White House that delivered it. Already today, Presidential hopefuls and dissident voices <a href="https://writing.antonleicht.me/p/three-fault-lines-in-conservative">within the GOP</a> are positioning as anti-AI; Josh Hawley and Ron DeSantis are perhaps the most visible examples. An economic populist platform on the left is likewise just one sharp pivot removed from a very promising <a href="https://writing.antonleicht.me/p/ai-and-jobs-enter-populism">anti-AI strategy</a>; and I suspect the midterms will give us plenty of data to reinforce these strategic directions. The Trump administration, and presumptive candidate Vance in particular, is already visibly tied to AI through its policies and relationships to tech executives &#8211; but an all-out bailout is more visible still. 
If AI continues to be as unpopular as it is, no shrewd politician would want to be on the record for sustaining it even where the market would not.</p><p>For all these reasons, <strong>I&#8217;m skeptical that a bailout in the current market and political climate was ever really in the cards.</strong> But that&#8217;s the easy part &#8211; recall that OpenAI&#8217;s primary hope was always to trade on the prospect of the bailout more so than to secure the bailout itself. This is perhaps where the Friar incident hurts the most: implicitness was a boon in this debate. Now that people are making the bailout conversation explicit, the gaps in the pitch for the bailout are starting to show, the political economy is showing its teeth, and the implicit prospect is becoming so much harder to keep alive.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Where This Leaves Us </h1><p>I&#8217;m not excited to collectivise the risk of OpenAI&#8217;s specific business strategy. But I&#8217;m also wary of the political fallout hindering important avenues of government support. As so often, we&#8217;re left with a potentially unsatisfying middle road, on which it&#8217;ll be important to keep two things in mind at once:</p><ul><li><p>There&#8217;s a real incentive for OpenAI to become too big to fail, and we should work to <strong>calibrate the market so that governments do not bear the risks</strong> of <em>OpenAI&#8217;s individual business. </em>Given the chance, major players will always be tempted to kick away the ladder once they&#8217;ve gotten to the top, and doubly so in a business as politically entrenched as AI. That&#8217;s normal, but being watchful of this natural trend is both good and the measure of a healthy policy ecosystem.</p></li><li><p>But we can&#8217;t cross the line <strong>into overscrutinising the idea of supporting and derisking AI progress</strong> and buildouts as a whole. We need smart policy mechanisms to do this &#8211; otherwise we won&#8217;t be able to sidestep the often-atrophied mechanisms of Western democracies to stay in control of this technological revolution, and we won&#8217;t be able to proliferate this technology throughout the world in time to ensure an equitable and stable order. 
The government simply has a role to play here, and it should be enabled to play it well.</p></li></ul><p>All sides can contribute to this &#8211; as always, our worst political instincts are to dunk on our pick of power-hungry AI companies, Luddite politicians, crony capitalists or captured regulators. It bears repeating that this is seldom helpful, and takes us further away from the nuance we need to make difficult disambiguations like the above.</p><p><strong>But an immediate burden to do better falls on OpenAI and fellow developers</strong>. They already greatly benefit from a USG that has the leeway to derisk AI buildouts as a whole. Support for AI is already effectively support for OpenAI, especially if OpenAI is optimistic about its relative position in the AI race. Anything OpenAI does to suggest that government support is a tool to secure its <em>relative</em> competitive position, rather than to advance national progress, actively undermines that broader project. That&#8217;s ultimately not to OpenAI&#8217;s advantage. As AI grows, so will the temptations for AI developers to exploit their proximity to power for short-term gain. 
To keep AI policy on track, they need to resist.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>I do wish someone would do some research on what exactly the best and most likely use of current compute buildouts would be in case there was a market correction.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[AI & Jobs: Leverage without Labor]]></title><description><![CDATA[Would humans flourish under full automation?]]></description><link>https://writing.antonleicht.me/p/ai-and-jobs-leverage-without-labor</link><guid isPermaLink="false">https://writing.antonleicht.me/p/ai-and-jobs-leverage-without-labor</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 05 Nov 2025 13:33:45 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/3d2e61d4-f873-4343-bfc6-995f2051b8c0_1599x1068.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Serious people increasingly voice serious aspirations for AI that will do all human work. That means <strong>the issue on the table is &#8216;full automation&#8217;:</strong> if AI displaces most human work, will our institutions still enable human survival and flourishing?</p><p>The right way to frame this question may be <em><strong>leverage</strong></em><strong>: do humans retain the power to defend favourable arrangements? 
</strong>Look around, and you&#8217;ll find optimists telling you yes, in the absence of material scarcity, there&#8217;ll be enough to go around and little need for hard leverage; and pessimists telling you no, the need for labor is what staves off the prospect of a permanent underclass. Between these two, I&#8217;ll argue for another view: our <strong>institutions would handle the immediate aftermath of full automation quite well, but irreversibly deteriorate </strong>in the absence of mutual leverage and common manners<strong>. </strong>To avoid that fate, we ought to close the growing gap between the pace of automation and institutional adaptation.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>Compared to many pressing issues in AI policy,<strong> </strong>this essay considers a problem further down the road.<strong> </strong>For the foreseeable future, I believe the frontier will remain jagged and comparative advantage will persist &#8211; we might well run into many other problems before we ever have to start worrying about full automation. Many thoughtful people realise the same and argue that therefore, this debate is not worth having. But I disagree: <strong>leading developers and thinkers consider full automation possible, and are already planning for it</strong>. 
They either outright <a href="https://www.mechanize.work/">advocate</a> for full displacement, or aim to relegate human work to a smaller, less important <a href="https://youtu.be/zwnVUiwObl8?si=4XK_GYJTTOrr8r1c&amp;t=1476">part</a> of the economy, closer to a feel-good channel for welfare payments than a trade of economic value for commensurate wages.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> In an era of surprising technological progress, some of it driven by these very voices, I&#8217;m inclined to take them seriously enough to discuss full automation today.<br><br>No matter whether full automation is ultimately technically <a href="https://www.mechanize.work/blog/technological-determinism/">inevitable</a> or <a href="https://blog.cosmos-institute.org/p/technocalvinism">not</a>, we can imagine many different paths toward it &#8211; paths we choose today. So we ought to ask: <strong>do we find the current default story of full automation desirable?</strong> To get a better sense of what that story looks like, I suggest we consider three clusters of views.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JNdb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JNdb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 424w, 
https://substackcdn.com/image/fetch/$s_!JNdb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 848w, https://substackcdn.com/image/fetch/$s_!JNdb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 1272w, https://substackcdn.com/image/fetch/$s_!JNdb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JNdb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png" width="528" height="273.6949152542373" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:734,&quot;width&quot;:1416,&quot;resizeWidth&quot;:528,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JNdb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 424w, 
https://substackcdn.com/image/fetch/$s_!JNdb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 848w, https://substackcdn.com/image/fetch/$s_!JNdb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 1272w, https://substackcdn.com/image/fetch/$s_!JNdb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8c16e0e-6d70-4154-8da9-d97d1120a6f5_1416x734.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">The &#8216;Peasant Family at the Table&#8217;. What future does it face?</figcaption></figure></div><h1><strong>The Thin Veneer</strong></h1><p>The first view is what I&#8217;ll call the <strong>&#8216;thin veneer&#8217; view of labor and society.</strong> It holds that <strong>the distribution of wealth and power stems from hard leverage</strong>. Democratic institutions, welfare systems and charity are but a thin veneer over this system of leverage &#8212; they exist because if they did not, humans would protest by withholding economic contributions. They could strike, lay down their work, cease consumption, and so on. Indeed, the view goes, this is often how they arose in the first place &#8211; rights were <a href="https://www.economist.com/by-invitation/2025/09/18/two-scholars-ask-whether-democracy-can-survive-if-ai-does-all-the-jobs?taid=68cd8e6df44288000116956c&amp;utm_campaign=trueanthem&amp;utm_medium=social">ceded</a> to groups that had the power to take them. This thin veneer view is not <em>only</em> about labor, but also about violence: the implicit threat is not just of strike, but also of violent revolution. But with the rise of increasingly autonomous weapons that shift the balance of power away from the many toward the few, the <em>economic</em> part of leverage seems more material right now. This first view runs with the implications of that, and fears the changes in leverage dynamics will upset our social contract.</p><p><strong>On the &#8216;thin veneer&#8217; view, good outcomes after full automation seem highly unlikely.</strong> Once governments and capital-holders &#8211; anyone directly involved in the AI economy &#8211; do not need workers anymore, why enable them to have any kind of good life? 
Perhaps they&#8217;ll still pay them a subsistence amount, perhaps not &#8211; but they definitely have no incentive to provide for anything that comes with even a marginal cost. The thin veneer view also applies to the softer version of full automation, wherein humans hold on to increasingly irrelevant jobs that make up a smaller and smaller percentage of the overall economy. That&#8217;s because that scenario still implies a substantial relative decrease in human workers&#8217; share of the economy, and thereby a necessary comparative decrease in their leverage. To the extent that the thin veneer advocate thinks the current welfare of workers is downstream from their current leverage, they&#8217;d also think a decrease in that leverage comes with a commensurate decrease in welfare. Note that this ignores deflationary effects from AI &#8211; more on that later.</p><p><strong>What does the &#8216;thin veneer&#8217; view make of the power of humans in their role as consumers</strong>, not workers? Surely, the AI economy requires <em>inputs</em> to continue running? Yes, but not necessarily from a breadth of humans &#8211; consumption can instead come from a smaller and smaller group of capital owners and stakeholders in the AI economy. Welfare payments likewise don&#8217;t confer leverage through consumption: counterfactually, you could just withhold them, not levy the taxes required, and approximately the same amount of consumption would happen anyways.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> A similar reason makes it so that &#8216;universal basic compute&#8217; does not change the equation: if all the owners of distributed compute hold a valuable resource, but no other leverage, can you not just expropriate them? 
This is the natural conclusion of the most cynical view &#8211; that as the materialist balance of forces changes, so do the prospects of most who get by on work today.</p><p><strong>Readers will notice that there&#8217;s something distinctly Marxist about this materialist view</strong>. Depending on your ideological couleur, that can motivate different conclusions: perhaps you think that AI is merely the obvious endgame to capitalism, the continuation of existing class struggle by different means &#8211; and so it requires all the good left-wing responses you&#8217;ve been asking for anyways. Or perhaps you think that the thin veneer advocate makes the same claim to inevitability that Marx himself made when he observed the weavers and was quite certain that the revolution was bound to happen any year now. I think you&#8217;d be getting at something true either way: that an impending material imbalance might cause serious problems, but that we might just be able to carry on anyways.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uuYW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uuYW!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 424w, https://substackcdn.com/image/fetch/$s_!uuYW!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 848w, 
https://substackcdn.com/image/fetch/$s_!uuYW!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 1272w, https://substackcdn.com/image/fetch/$s_!uuYW!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uuYW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png" width="664" height="284.2988505747126" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:596,&quot;width&quot;:1392,&quot;resizeWidth&quot;:664,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uuYW!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 424w, https://substackcdn.com/image/fetch/$s_!uuYW!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 848w, 
https://substackcdn.com/image/fetch/$s_!uuYW!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 1272w, https://substackcdn.com/image/fetch/$s_!uuYW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4f48462-dda1-4820-a6d2-72f4f4024c06_1392x596.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">In 1996, the South Korean government&#8217;s <a 
href="https://www.brookings.edu/articles/the-labor-market-policy-and-social-safety-net-in-korea-after-the-1997-crisis/">attempt to curtail</a> labor rights was met with a four-week general strike. The government gave in.</figcaption></figure></div><h1><strong>The Deep Utopia</strong></h1><p>The <strong>polar opposite view is one of &#8216;deep utopia&#8217;:</strong> underneath the current layer of scarcity-driven arrangements, the human interest in mutual welfare and redistribution runs deep &#8211; that <strong>if we had enough to go around, we&#8217;d make sure it went around. </strong>Mechanize, a company producing essays and RL environments, prominently espouses this view: in a recent <a href="https://www.mechanize.work/blog/life-after-work/">post</a>, they argue that we can and should automate all jobs, and hope humans will survive and flourish on welfare and charity amidst abundant AI-produced goods.</p><p>Their argument rests on one sociological and one economic premise. The sociological is to <strong>assume a strong trend of increasing charity and welfare continues.</strong> This is the point Mechanize make: they correctly identify that as society has progressed over the last centuries, we&#8217;ve redistributed and allocated to charity more and more of our resources. If we just continued to do so in the future, as AI-driven growth fills our treasuries, existing channels of redistribution seem set up to deliver material abundance for all. <strong>I&#8217;m not sure what would be a convincing </strong><em><strong>structural</strong></em><strong> reason</strong> <strong>for this assumption</strong>, and I tend to favour the leverage-forward view for its ability to make some causal sense of that trend. 
But it is hard to argue with the shape of a steadily increasing graph, and to the utopian view&#8217;s credit, the trend has persisted even through what should have been marked shifts in relative leverage, such as the shift of threat of violence toward smaller, technologised militaries. You would need some extraordinary evidence to be sure it would break away as radically as the thin veneer view implies, and I&#8217;m not sure this evidence exists.</p><p>The utopian view is also helped along by an<strong> economic premise, which assumes AI has a deflationary effect on the basket of goods that make up a &#8216;good life&#8217; today. </strong>Because widespread deployment and integration of highly efficient AI systems into supply chains might make it much easier to produce goods and provide services, it would be much, much cheaper to maintain today&#8217;s standard of living in the future. As a result, the argument goes, you could sustain everyone&#8217;s welfare even through very small payments. I think this claim is right on a very abstract economic level, but it glosses over some important details &#8212; <strong>increasing material abundance has not always implied a practical drop in the price of one specific lifestyle. </strong></p><p>That&#8217;s for a couple of reasons: baskets of goods change, abundance makes us improve products in ways that also make them more expensive, and Baumol effects make some prices very sticky. The life of a 1900s coal miner or medieval peasant would perhaps be cheap to buy today, but you can&#8217;t actually buy it; in the same way, there might not be a way to get by at our current levels of welfare in the future. Beyond that, I&#8217;m also quite uncertain whether the most effective marginal deployment of AI systems in a compute-constrained world of full automation would really be creating material abundance for displaced workers, who surely hold fairly little market power as consumers.
</p><p>All in all, I do think there&#8217;s some truth to the utopian view that the thin veneer view ignores: <strong>historical trends carry substantial momentum, and it might be easier to feed humanity than you think</strong>. But I&#8217;m quite doubtful that this effect gets us to the point of &#8216;paying for a flourishing humanity is a trivial side effect of AI-driven growth&#8217; all on its own.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Glwq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Glwq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 424w, https://substackcdn.com/image/fetch/$s_!Glwq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 848w, https://substackcdn.com/image/fetch/$s_!Glwq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 1272w, https://substackcdn.com/image/fetch/$s_!Glwq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Glwq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png" width="636" height="439.4340659340659" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1006,&quot;width&quot;:1456,&quot;resizeWidth&quot;:636,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Glwq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 424w, https://substackcdn.com/image/fetch/$s_!Glwq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 848w, https://substackcdn.com/image/fetch/$s_!Glwq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 1272w, https://substackcdn.com/image/fetch/$s_!Glwq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68dafb18-e349-4900-9270-ccc6b6d06271_1600x1106.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft 
pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">If the <a href="https://ourworldindata.org/grapher/social-spending-oecd-longrun?time=earliest..2022&amp;country=CAN~JPN~USA~ITA~DEU~FRA~GBR">trend</a> in public social spending as share of GDP continues after full automation, we might just be okay. <em>If.</em></figcaption></figure></div><h1><strong>The Thick Veneer</strong></h1><p>That brings us to the <strong>&#8216;thick veneer&#8217; view. </strong>It holds that <strong>our values and institutions run deeper than just economic leverage, but not all the way to the core</strong>: once we remove the delicate balance of mutual leverage, our arrangements will <em>eventually</em> slip away. 
And by the time they do, we won&#8217;t have the leverage to get them back &#8211; so we should proceed with utmost caution.</p><p>This view concedes, first of all, that <strong>we might be fine for some time</strong> after full automation. Through sheer institutional inertia and values retained from the pre-automation era, our institutions might well work to deliver equitable and favourable results to many for quite some time. <strong>But that might slip away.</strong> Laws, redistribution channels, habits of charities and rules of taxation are themselves highly contingent. The underpinnings that would keep these institutions going past the simple inertia of the initial years are shared norms and beliefs &#8211; &#8216;habits of the heart&#8217;, perhaps. The inverse means that old institutions can deteriorate as we forget their purpose and lose the norms and circumstances that gave rise to them.</p><p>The <strong>norms that make redistribution durable rest on social conditions that full automation could eliminate. </strong>Today&#8217;s redistribution works because most people participate in labor markets, face similar risks, and can imagine themselves in each other&#8217;s positions. Even in 2025, there&#8217;s still something of a sense of &#8216;there but for the grace of God go I&#8217; &#8211; insurance and welfare still feel somewhat like shared risk pools rather than one-way transfers to a different class. Even those who never need these protections at least share the basic experience of work, the participation in societies that derive pride and purpose from labor, and the ensuing sense of basic desert.</p><p>But on the full automation view, only a few will be capital holders and live players who still hold substantial leverage over what happens. They are supposed to live lives not only of material abundance, but of importance and purpose, in the knowledge that they&#8217;ve escaped the &#8216;permanent underclass&#8217;.
They might look upon the rest of humanity with sympathy and compassion, but surely not with the same amount of respect and kinship they felt while lines between classes were still more obviously blurry. How long before the few start wondering <em>why exactly</em> they should continue the trend of charity and redistribution? Before they find something better they&#8217;d like to do with the money, some political wedge that has them take offense at spending on the many, some frustration at those they might see as freeloaders and at how unproductively they spend their time? If our society changes like that, <strong>it&#8217;s very easy to forget the reasons why we set up our carefully calibrated social institutions to begin with.</strong></p><p><strong>The deflationary argument implies this won&#8217;t happen</strong>, because the amounts are so small and marginal. I wouldn&#8217;t bet on it, even if I believed the deflation case: recent history suggests even small amounts of very helpful spending might actually be highly susceptible to outright cuts once they lose the leverage to back them up. For just one example, the richest country in the history of the world has cut its global aid payments out of a similar impulse: no matter where you come down on the cuts to USAID, they&#8217;re squarely evidence that the instinct of &#8216;why should we bankroll this&#8217; does not stop at small sums or large reported positive effects.</p><p>And the insidious part is: <strong>if we do get to that point, it&#8217;ll be hard to turn back.</strong> Current institutions may not be solely the product of mutual leverage, but we&#8217;d surely require leverage to rebuild them and to enforce the costs that this would imply. Today, functioning democracies have a backstop: whenever our manners slip, whenever our arrangements deteriorate and lead down a markedly less equitable path, the people can still change course.
Without remaining leverage, how could those on the losing end of a new arrangement ever make their voices heard? Meaningful democratic enfranchisement, with a government that can and will actually make substantial changes in response to votes, might deteriorate in much the same way. How would we keep it honest? It sure won&#8217;t be strikes or boycotts &#8211; that, the thin veneer view clearly demonstrates. And if the few no longer think the many contribute to society, I&#8217;m not sure they&#8217;ll remain very sympathetic to uprisings and protests. The <strong>backstops that would bail us out if we lost our ways still seem based on leverage.</strong></p><p><strong>A likely outcome seems to be slow deterioration</strong>: cutbacks on welfare and charity, communicated with the attitude of &#8216;you better take what you can get and be grateful for it&#8217;, and paired with a manifest lack of leverage to change that arrangement back. Does that mean full automation ends in disaster, full stop? No. But it means that I&#8217;m skeptical that current institutions are enough, or that past trends persist, to guarantee outright flourishing. Tocqueville gives us a good notion of the manners that make this veneer thick, but vulnerable nonetheless:</p><div class="pullquote"><p>I am convinced that the most advantageous situation and the best possible laws cannot maintain a constitution in spite of the manners of a country; whilst the latter may turn the most unfavourable positions and the worst laws to some advantage. &#8212; Alexis de Tocqueville</p></div><h4><em>Implications</em></h4><p><strong>I favour the &#8216;thick veneer&#8217; view</strong>, in the sense that I think it&#8217;s what would happen if we implemented full automation today without changing our societal arrangement at all. 
I find the plainly cynical reading of human nature as unconvincing as the bright-eyed one, just as I find the reductive materialist view as unconvincing as the naive extrapolation of societal trends. Society is far less of a ruthless optimiser one way or the other than you might give it credit for from San Francisco. Even if a radical technological change &#8211; like massive progress toward full automation &#8211; manifested from one day to the next, society would be quite slow to update our arrangements to reflect that fact. We&#8217;d be as slow to weed out the sudden inefficiencies as we&#8217;d be to plug the sudden holes in our welfare state, and I assume that much of society would slowly grind along for quite some time. But the underlying shifts in mutual leverage and shared manners would ultimately, if slowly, rise to the top, and things go wrong &#8211; a whimper, not a bang.</p><p>This <strong>prospect of delayed deterioration implies a need for risk aversion</strong>: even a tentative believer in the deep utopia would be well-advised to act in accordance with the thick veneer view. If the adverse effects of full automation come slowly, pervasively and insidiously, we have very few live signals to react to &#8211; especially if full automation moved at a hastened speed, and societal reaction did not.
By the time we found real-world evidence that the utopian view did not work out, it might already be too late to fix the underlying societal structures that we had hastily changed.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="438" height="109.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:438,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>What to make of this?</strong></h1><p>I don&#8217;t think the &#8216;thick veneer&#8217; future is a <em>necessary</em> conclusion of full automation: even if we couldn&#8217;t choose to avert full automation in the face of economic pressures, surely we can nudge its trajectory. 
If our institutions keep up, we could prepare them to get this &#8211; unlikely &#8211; scenario of full automation right. Today, I think our best course of action comes down to a <strong>dual strategy</strong>: <strong>we need to make sure the pace toward full automation does not exceed the pace of expanding institutional capacity to handle it.</strong></p><p><strong>First, we should scrutinise, and sometimes hinder, plans to accelerate toward full automation. </strong>There is a speed that is manifestly too fast &#8211; one that most people in the world have a justified grievance with because it would strip them of their leverage without giving them enough in return. I think if you gave them the outline and odds of the deal the utopians offer, they wouldn&#8217;t want to take it. They might be right, but even if they weren&#8217;t, they should get a say. Moreover, there is a pace of automation that exceeds all capacity for institutional adaptation. Before we reach it, we should slow down. We can do so intelligently and elegantly, by carving out incentives for human roles to remain vital along the slower parts of the jagged frontier, by requiring humans in the loop only where they add to that loop, and by boosting and supporting augmentation far enough that automation does not outcompete augmented workers. If we don&#8217;t, public discontent could manifest in much less elegant policy that adds frictions where we&#8217;d want lubricants and prohibits the use of AI systems where they&#8217;d do a lot of good &#8211; if you think requiring drivers in Waymos was bad, just wait a couple of years. We should instead find a better way to regulate our pace in accordance with the will of the affected.</p><p><strong>Second, we should think ahead toward institutional capacity for a post-work future.
</strong>The thin veneer view explains to us all the ways in which our current institutions are insufficient: they do not confer durable ownership absent leverage, they lack any mechanism for an international component to charity and redistribution, and they often fundamentally operate on the premise that being out of economically valuable work is a <em>temporary</em> phenomenon before or after a stint in the labor market. That premise needs reexamination, and many of the issues need addressing. There are many conceivable ways to do this, and far too few people are working on it. As to what I think, I&#8217;ve written about all this in much <a href="https://writing.antonleicht.me/p/ai-jobs-and-the-rest-of-the-world">greater</a> <a href="https://writing.antonleicht.me/p/ai-and-jobs-two-phases-of-automation">detail</a> <a href="https://writing.antonleicht.me/p/ai-and-jobs-politics-without-policy">before</a> and will spare you the repetition &#8211; the gist is: a good solution would need to be somewhat <a href="https://writing.antonleicht.me/p/ai-jobs-and-the-rest-of-the-world">global</a>, somewhat durable, and try to retain most humans&#8217; ability to make valuable contributions. We&#8217;re nowhere near a proposal that gets any of that right.</p><p><strong>Between these two options, I think we should avoid slowing down AI progress whenever we can.</strong> First, because I believe AI systems promise tremendous welfare, wealth and growth, and I think we should build their most advanced versions sooner rather than later, all else being equal. But second, for a political reason: fighting full automation by aiming at preventative prohibition doesn&#8217;t seem viable. Any attempt to halt AI research altogether is going to face a very difficult political economy &#8211; both as it relates to the domestic loss of autonomy and productivity, and as it relates to foreign competition.
The same goes for attempts to steer the technical paradigm away from full automation and toward narrow capabilities and augmentation, insofar as the former is much more efficient. Slowing full automation through negative advocacy is a costly zero-sum game; hastening institutional capacity or developing actually competitive <a href="https://workshoplabs.ai/">alternatives</a> faces a far better political economy.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0wYI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!0wYI!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 424w, https://substackcdn.com/image/fetch/$s_!0wYI!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 848w, https://substackcdn.com/image/fetch/$s_!0wYI!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 1272w, https://substackcdn.com/image/fetch/$s_!0wYI!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0wYI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png"
width="566" height="299.3269230769231" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:770,&quot;width&quot;:1456,&quot;resizeWidth&quot;:566,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!0wYI!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 424w, https://substackcdn.com/image/fetch/$s_!0wYI!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 848w, https://substackcdn.com/image/fetch/$s_!0wYI!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 1272w, https://substackcdn.com/image/fetch/$s_!0wYI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3b7f7514-8fdb-4d41-89e2-d74aca90c2c7_1600x846.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">The <a href="https://www.sanders.senate.gov/wp-content/uploads/10.6.2025-The-Big-Tech-Oligarchs-War-Against-Workers.pdf">politics</a> of full automation are already on a knife&#8217;s edge. </figcaption></figure></div><p>But we can&#8217;t build our way out of every conundrum. In particular, <strong>accelerationists should be mindful that politics enforce a natural speed limit here. </strong>If you go too much faster than the institutions and the public, they will lash back and violently halt your project &#8211; and leave you with little to show for it in terms of &#8216;hastening inevitable technological marvels&#8217;. As fun as it reads on X, it&#8217;s not actually a great omen if your company makes it to the top of a Bernie Sanders <a href="https://www.sanders.senate.gov/wp-content/uploads/10.6.2025-The-Big-Tech-Oligarchs-War-Against-Workers.pdf">report</a> before its first product.
</p><p>That means <strong>accelerationists in particular should get on board with improving the institutional setups</strong> we need to navigate the post-automation world well. They used to be better at this, but I&#8217;ve recently been disappointed to see that they seem to be giving up on this mission &#8211; whether it&#8217;s Sam Altman <a href="https://www.wsj.com/tech/ai/universal-income-tech-executives-a16eb2d0">moving away</a> from redistributionary messaging like his UBI experiments, or whether it&#8217;s the Mechanize crew posting essays on why everything will turn out great anyway. How fast you can scale your dream of full automation depends on how fast we all manage to scale our path toward institutions that can handle it. An accelerationist, out of naked self-interest, should do more to help.</p><p>This is a tremendously complicated issue, and I still believe that our technological trajectory does not, in fact, point at full automation. But if it does, you should recognise we&#8217;re headed for a discontinuous shock to the incentive structures that underpin the setup of modern societies. <strong>Even the most confident extrapolation of history does not reliably make it across this event horizon</strong> &#8211; no matter whether your reading of choice is of the utopian or dystopian flavour.
So if this is the path we&#8217;re on, we should walk it at a pace that makes it possible to still consider where we step.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Note that such roles would be different from sticky &#8216;human preference&#8217; jobs that would be a genuine high-leverage bottleneck for the post-AI economy. Jobs arising from comparative advantage would sit somewhere in between, depending on how costly it would be to replace them with AI work.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Correcting, I guess, for different flowthrough speeds of money &#8211; but surely, that&#8217;s easier addressed through interest rates and taxes than through doling out money to consumers.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[When Should Nations Sell Their Data?]]></title><description><![CDATA[Service economies might have to treat data as a strategic asset beyond privacy politics]]></description><link>https://writing.antonleicht.me/p/when-should-nations-sell-their-data</link><guid isPermaLink="false">https://writing.antonleicht.me/p/when-should-nations-sell-their-data</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 29 Oct 2025 14:07:23 GMT</pubDate><enclosure 
url="https://substack-post-media.s3.amazonaws.com/public/images/a77ecca9-4d2f-4957-91ef-8b051fc68d10_1480x1394.avif" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>In recent decades, the measure of an economy&#8217;s progress has been the growth of its service sector:</strong> the ability of its population to work out of gleaming skyscrapers instead of smog-filled industrial plants, the degree to which its growth is divorced from the material constraints of physical labour. Strongholds of the fastest growth often still bear marks of that departure: I write today&#8217;s piece from the 60th floor of a Hong Kong skyscraper that rises from the vestiges of a now almost-irrelevant industrial harbour, against the backdrop of a buzzing services industry that seems sure to have left its humble physical roots behind.</p><p>AI, however, threatens to flip that logic. It seems that AI systems will master the cognitive domain before the physical, and thus displace white-collar before blue-collar work. That means many <strong>economies that thought they were doing particularly </strong><em><strong>well</strong></em><strong> in adjusting to the 21st century&#8217;s logic now find themselves at the greatest peril. </strong>Most countries outside America hold no large stake in AI progress itself, but their labor markets are still vulnerable. They lack leverage and leeway, as AI-driven revenue might well accrue outside their capitals, on US shores, and outside the income tax bracket, in corporate tax havens.
Economies like Singapore, the UK, or the Nordics, whose economic success relies on highly educated service labor, are acutely exposed to having their workforce displaced and their growth stymied.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p><strong>This piece tells one story of how they might still pivot.</strong> The threatened service economies still hold a key asset: the valuable data required to hasten the advent of transformative, disruptive AI agents. In some AI futures, leveraging this data at scale would be one promising path to steer clear of the most disruptive futures for service economies. But <strong>by political consensus and legal default, even private individuals are limited in selling and using highly valuable data</strong> &#8211; not to speak of firms and nation states. This is the consequence of a strategic mistake: existing data governance was built for a bygone era in which data was best understood and governed by its relation to personal rights.</p><p>Yet countries would be justified in changing their approach. The data embodied within their skilled workforce is not accidental, but a cultivated asset &#8211; developed through costly investments in education, talent attraction, and structural preconditions for a thriving services sector. That makes it nations&#8217; right and responsibility to leverage this asset well. If they want to avoid ending up on the losing side of AI diffusion, they need to find a way to transform it into a lasting advantage instead. 
To do so, they would have to <strong>relinquish the view of extensive data privacy as a sacrosanct right, and craft policy to time and shape the sale of their fleeting data treasures.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!haPl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!haPl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 424w, https://substackcdn.com/image/fetch/$s_!haPl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 848w, https://substackcdn.com/image/fetch/$s_!haPl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 1272w, https://substackcdn.com/image/fetch/$s_!haPl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!haPl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png" width="1456" height="373" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:373,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!haPl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 424w, https://substackcdn.com/image/fetch/$s_!haPl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 848w, https://substackcdn.com/image/fetch/$s_!haPl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 1272w, https://substackcdn.com/image/fetch/$s_!haPl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74d03fa1-5e24-46ee-8330-5b4d872c45eb_1600x410.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Singapore&#8217;s high-rise office buildings are emblematic of its economic ascent. But today, they also represent a concentration of risk.</figcaption></figure></div><h1><strong>Technical Underpinnings</strong></h1><p>This future I describe is, of course, technologically contingent. As so often in AI, what I suggest is not guaranteed to be a worthwhile strategy, and a technical shift tomorrow could invalidate all that follows. Consider it a speculative deep dive &#8211; into a policy shift that would become necessary on one quite specific trajectory. </p><p>Two claims in particular are loadbearing for what I describe: the notion that advanced AI systems might soon at least temporarily displace services jobs in many economies; and the notion that data will be valuable to build these systems. 
The first has been discussed elsewhere in <a href="https://budgetlab.yale.edu/research/evaluating-impact-ai-labor-market-current-state-affairs">much</a> <a href="https://digitaleconomy.stanford.edu/publications/canaries-in-the-coal-mine/">greater</a> <a href="https://epoch.ai/gradient-updates/consequences-of-automating-remote-work">detail</a> &#8211; I believe it&#8217;s not settled, but likely enough not to warrant further explanation in this piece.</p><p>The second is the <strong>role of data</strong> <strong>in enabling automation</strong>. At the moment, there seem to be two conceivable pathways to advanced AI agents. The first makes do without much specialised data &#8211; it builds general systems that first become so intelligent and capable that they can excel in any domain, and then descend onto specific task profiles. That might happen because their intellectual capabilities generally apply across domains, or because their prowess in software engineering specifically causes a degree of self-improvement that allows for indirect generalisation. But lately, there are many reasons to think we&#8217;re not on this first path: general returns to scale seem less efficient than narrow reinforcement learning, economic usefulness seems to hinge on many last-mile problems around reliability that are best fixed by doing very specific work, and much economic value seems to be locked behind tricky broad diffusion issues that aren&#8217;t obviously bridged fastest by building very capable agents.</p><p>Instead, <strong>the path to an agent that&#8217;s good at something might be to put a lot of manual work into making it good at that thing. </strong>There are many ways in which data from a given economic sector is very helpful in making models significantly better at tasks within that sector. A particularly popular contemporary approach is constructing bespoke reinforcement learning environments used to make models really good at a given task profile. 
This is the foundational approach of much-noted recent start-ups like Mira Murati&#8217;s Thinking Machines or ex-Epoch employees&#8217; Mechanize. This current version of the data-leveraging story is one instance of a more general point: Data controlled by leading white-collar businesses might be very usable for making AI agents better.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!W_z7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/efcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>How To Sell Your Data</strong></h1><p><strong>What kind of data might that be?</strong> I think there&#8217;s a wide range: It can be very narrow, sophisticated data on workflows: SOPs, recordings of employees, or specific inputs they make on specific tasks. 
It can be slightly broader, results-heavy data: precise outputs from spreadsheets to computer code, or specific deliverables in the context of the original task. Or it can be high volumes of somewhat relevant data &#8211; email logs, huge servers full of somewhat pertinent documents, all of which contain some nuggets of information on best professional practice.</p><p>Depending on what data turns out to be the most useful, you could imagine different markets emerging: AI developers purchasing it directly, middleman companies purchasing, collecting, and curating data for developers, or more deployment-focused companies fine-tuning models for specific, very narrow use cases. Legacy services companies themselves might become quite good at using their proprietary data to bootstrap narrow AI ambitions. I think there&#8217;s a good argument for this market becoming quite important. Just look at the heights to which the infrastructural bottlenecks for AI have driven the Nvidia stock price. If data turns out to be the analogous informational bottleneck, it might likewise become quite valuable &#8211; for some time, at least.</p><p>Still, it&#8217;s essential <strong>not to overstate the ultimate importance of this data.</strong> You might think the above means a restrictive approach to data can save labor forces from displacement altogether. But there are several problems with that thinking: generalisation-based approaches might still catch up, and lower-quality data on similar tasks might be acquired elsewhere, or even generated and curated from the ground up. 
Data would be valuable because it provides an important boost in a neck-and-neck race toward economically viable agents; but not because it&#8217;ll remain a binding constraint for the foreseeable future.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!W_z7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/efcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Treating Data as an Asset</strong></h1><p>Returning to the question of national strategy, I believe these trends should have countries recognise that <strong>data is a national strategic asset of service economies.</strong> This is a controversial, in many places heretical view: Data is still widely understood as individually-owned, 
individually-controlled, removed from transactional and strategic considerations. A similar view underpins the infamous European GDPR, as well as many analogous laws developed in the spirit of the fabled &#8216;Brussels effect&#8217;. Sometimes, these policies do leave leeway for the choices I describe; in many cases, their most pervasive feature is a political attitude to &#8216;keep one&#8217;s hands off of data&#8217;. My strong sense is that policymakers in many of the economies I describe shy away from a leverage-forward view of data.</p><p>I believe retaining this position sets countries up for failure. Professional services data is downstream of all the work you&#8217;ve invested in building a high-skilled workforce. That has likely been a national effort &#8211; requiring investment in education, tax incentives, and costly trade-offs around migration, among other things. The old deal was that this effort would pay dividends over decades, because it boosts your economy and yields high income and corporate taxes. But in the future I described above, that might no longer be the case. The only way to capture the returns of your investment in a quality workforce might now be to cash out on the data this workforce provides. That makes <em>doing something</em> imperative to your national balance sheet &#8211; from a strategic point of view, you simply cannot allow yourself to default on the mortgage you&#8217;ve taken on your workforce. Given the investments you&#8217;ve made, you&#8217;re more than justified to wield regulatory power accordingly.</p><p><strong>This implies a need for national consolidation.</strong> Leaving decisions around how to process this data to companies surrenders control over the asset in ways that are likely to lead to worse outcomes due to a lack of coordination. 
That is because competitive pressures incentivise early defection.<strong> Service economy markets typically feature multiple competitors with very similar business offerings &#8211; consultancies, banks, accounting firms, agencies, and so on</strong>. If one of them sells, the value of the remaining data can quickly plummet &#8211; even more so if there is no broad market for data pipelines, but only a few sources. And there is no guarantee of reasonable reinvestment: For national economies to cash out in a way that confers enduring economic viability, the returns from selling data must be reinvested in some way. Perhaps that&#8217;s through receiving an enduring stake in the agents deployed as a result, perhaps it&#8217;s just through investing that money into compute, education, and buildouts required for an economic pivot. Perhaps it&#8217;s even by leveraging the data into individualised, <a href="https://workshoplabs.ai/">augmentation-forward AI systems</a>. It&#8217;s by no means guaranteed that incidental company-by-company sales contain a viable path toward useful reinvestments.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!W_z7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/efcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>A New Kind of Data Policy</strong></h1><p>Moving toward strategically sound leveraging of service economy data requires a different approach to data governance. <strong>Practically, there are ways to centrally leverage this kind of data</strong>, and consolidate attempts to sell it. One way is simply to introduce channels of coordination, bringing companies together to consolidate their decision-making and reduce defection risks through mutual commitments. But that requires additional provisions to ensure useful reinvestment &#8211; perhaps by making sure tax codes appropriately capture the revenue shift from workers to data sales, or by securing reinvestment pledges from companies. Economies with currently strict data protection laws have a different route: they can keep their strict laws on the books, and instead strategically choose where to allow sales.</p><p>The bigger underlying problem is not practical, but dogmatic. <strong>Enabling governments to make decisions on this issue breaks with an established view of data, </strong>which has long been aimed exclusively at safeguarding individual rights. This view has been entrenched in many jurisdictions, Europe in particular, far beyond its genuine political support. Powerful lobbies of data protection attorneys, themselves created by burdensome data regulations, hold significant sway over data-related decision-making. 
And political decision-makers have grown fixated on communicating privacy concerns above all else when it comes to data. Pivoting toward capitalising on data is a deeply unpopular position in that environment.</p><p>It will be difficult to make the political case based on the concerns outlined above. Unprompted, the ideas I&#8217;ve discussed require an awareness of AI trajectories that seems unrealistic in most countries; and they&#8217;re set to run into motivated and well-funded opposition. But waiting for a more favourable political economy will not do: once the displacement begins and the political window for action opens up, systems are already good enough &#8211; so companies already have less need for the data. This piece aims to motivate the challenge more so than to develop the mechanism, but I think two angles remain underexplored:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CfLJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CfLJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 424w, https://substackcdn.com/image/fetch/$s_!CfLJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 848w, https://substackcdn.com/image/fetch/$s_!CfLJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 1272w, 
https://substackcdn.com/image/fetch/$s_!CfLJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CfLJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png" width="1456" height="591" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:591,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CfLJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 424w, https://substackcdn.com/image/fetch/$s_!CfLJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 848w, https://substackcdn.com/image/fetch/$s_!CfLJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 1272w, 
https://substackcdn.com/image/fetch/$s_!CfLJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F101aaa50-53ce-432c-bd41-078c7f08aa96_1600x649.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Services as percentage of GDP (<a href="https://howmuch.net/articles/role-services-around-the-world">HowMuch</a> // <a href="https://data.worldbank.org/indicator/NV.SRV.TOTL.ZS">World Bank</a>) </figcaption></figure></div><h1><strong>Searching for the Goldilocks Zone</strong></h1><p><strong>To finish this exploration, let&#8217;s assume for a 
moment you&#8217;ve found a way to consolidate decision-making over a relevant amount of data. What to do? </strong>The primary challenge will be selecting the optimal time to sell.</p><p>It&#8217;s tempting to <strong>realise the broad strokes of the above argument, overreact, and sell too early. </strong>Before bottlenecks manifest, the low-hanging fruit runs out, and alternative approaches run aground, there will be little interest in breaking through the very high barriers to accessing privileged data in large swaths. At a time when you&#8217;re not quite aware of how big and disruptive the impact of AI on services jobs will be, it might be tempting to sell your data at a reasonably low price &#8211; you might think it&#8217;s an easy way to cash in on some of the AI boom, take some easy money, and carry on with your old economic paradigm. Doing so before you understand that the returns might need to be sufficient to restructure entire economic sectors is a recipe for economic downturn.</p><p>For more obvious reasons,<strong> it&#8217;s easy to sell too late.</strong> Competing approaches are one factor &#8211; alternative, perhaps synthetic, data sources that circumvent the need for your data, inefficient solutions that are still cheaper than paying you too much; really anything that impatient AI builders might resort to if they can&#8217;t get old-world governments to budge. Another factor is international competition: especially when your economic structure provides services very similar to those offered in other countries, these rivals may secure the sale first and make your data much less valuable.</p><p>Getting the timing right requires two factors that are currently quite scarce in national governments. The first is keen awareness of the input factors for data value: you need to closely track the relevant technical developments, as well as what other countries are doing, to find a suitable moment. 
The second is the ability to execute rapid turnarounds. The government must be able to move <em>quickly</em> on deals and agreements &#8211; aware enough to assess the viability of specific exchanges on their merits in a way that doesn&#8217;t cause costly delays. That&#8217;s a tall ask; realistically, it requires named ownership and frameworks in place to conduct this assessment well before the fact. If a government &#8211; or a private would-be seller of data &#8211; only gets around to shopping for political approval once a deal is on the table, the delays might well be prohibitive.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!W_z7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/efcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!W_z7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!W_z7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fefcfd74b-c821-4fb9-b9ff-7a21e79ca1a1_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Outlook</strong></h1><p>Pivoting to treating data as a national asset might turn out critically important for 
service economies. <strong>Given the disastrously difficult political economy, it&#8217;s worth starting early. </strong>As such, I think the best thing to do right now is to build two things: First, data governance structures that actually allow you to make strategic choices around leveraging data if you ever want to. A by-default data policy that forecloses strategic coordination and productive use is risky &#8211; under a permissive regime, you can quickly miss the window to act. Conversely, a data policy that entrenches an understanding of data as only a matter of individual rights can be a major obstacle to effective data use. Chipping away at that old understanding will be a long and politically arduous task, and perhaps worth embarking on now.</p><p>And second, awareness of the issue and its determinants. <strong>The timing question depends on plenty of moving technical and economic parts</strong> &#8211; what goes into leading systems, what rivals are selling, and what the price is for what data. Countries &#8211; especially middle powers committed to the mantra of &#8216;winning on AI diffusion&#8217; &#8211; currently lack the capacity to track these trends. But the issue of data should really register as another entry in the increasingly long list of reasons to change that: by default, or by just trusting your private sector to develop these capabilities, countries <em>will</em> fail to recoup the investment in their workforce.</p><p>Service-focused economies have been dealt a tough hand by the current AI trajectory: their structures are at risk, and they have to time their jump carefully. In one realistic future, their fate will hinge on building data governance that enables them to sell at the right moment. 
<strong>Success can only be found ahead of the curve &#8212; which means starting the work on strategic data policy now.</strong></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[The Future of AI Export Deals]]></title><description><![CDATA[AI developers can build moats by making country-level exports work for everyone]]></description><link>https://writing.antonleicht.me/p/diffusion-deals</link><guid isPermaLink="false">https://writing.antonleicht.me/p/diffusion-deals</guid><pubDate>Thu, 23 Oct 2025 12:49:58 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/56038b8b-755f-46c8-95c6-edce5ee1ce4d_770x508.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>These days, when you glance at the top technology news, the odds are decent that you&#8217;ll find announcements of increasingly contrived partnerships between AI developers and atrophied legacy institutions. In the ensuing whirlwind of news updates, diagrams of investment flows, and hasty takes on what this means for &#8220;the bubble,&#8221; I think we&#8217;ve missed the most interesting category of deals: between major AI developers and governments. Just on the OpenAI side, the last month has seen announcements of <strong>Stargate Argentina</strong>, a <strong>country-level</strong> deal with Germany, and a narrower <strong>Stargate UK</strong> hidden away in a broader <strong>UK&#8211;US</strong> tech deal. 
Today I want to offer some thoughts on why these deals are underrated, and how to make them work.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>I believe country-level deals matter for two reasons: First, they could become an<strong> increasingly important part of </strong><em><strong>company</strong></em><strong> strategy</strong>: compared to fleeting arrangements with other software companies, country-level deals potentially offer longer-term commitments, exclusive capture of some important tier-2 markets, and inroads with governments that will matter as the strategic relevance of AI technology increases. By creating sticky, durable deals with national governments, AI developers can potentially build huge and lasting moats at a time when commodification is a looming threat.</p><p><strong>Second, country-level deals are a major variable in getting international diffusion right.</strong> In many countries, private and public sector alike often underappreciate the economic and strategic relevance of AI, leading to underinvestments and misallocations that could strip their governments, citizens and businesses of critical capabilities. Few actors have both the incentive and the leverage to fix this &#8211; but AI developers pursuing country-level deals do.</p><p><strong>The best version of these programmes is exceedingly good</strong>: they stabilise the market against the volatilities of Silicon Valley, and they provide for the equitable distribution of groundbreaking AI capabilities. 
<strong>Their worst version is exceptionally bad</strong>: they sell useless computing infrastructure to countries that can&#8217;t run it, and discredit AI developers and their technologies the world over. Of course, much of the difference will be up to public policy, and I&#8217;ve written about both the <a href="https://writing.antonleicht.me/p/datacenter-delusions">importers&#8217;</a> and <a href="https://writing.antonleicht.me/p/making-ai-export-promotion-work">exporters&#8217;</a> side before.</p><p>But many of the most important questions come down instead to firm-level program design. So, this piece rethinks the debate and policy implications around developer-side strategy, starting with a sober assessment of how far apart the AI industry and many importers are today.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h1><strong>Where Things Stand</strong></h1><p>Right now, these programs are still in their strategic infancy. They also overlap with preexisting hyperscaler initiatives, where developers have long built datacenters in other countries to reduce latency or navigate data governance. To delineate: this post deals with country-level deals, i.e. agreements aimed at the provision of AI capabilities, struck principally between American AI developers on one side and foreign governments on the other.</p><p>For many reasons &#8211; some of which will soon receive further examination on this blog &#8211; OpenAI in particular has been excited to engage in just about any deal. The resulting portfolio of announced &#8216;OpenAI for countries&#8217; deals is a good overview of the strategic pathways these programmes can in theory go down. Other developers have also pursued export ambitions, but they&#8217;re harder to categorise: Google&#8217;s expansions, for instance, are deeply entwined with their standard hyperscaler business. 
So for the paradigmatic case, let&#8217;s start with a <strong>look at OpenAI&#8217;s announced export deals</strong>:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dy98!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dy98!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 424w, https://substackcdn.com/image/fetch/$s_!dy98!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 848w, https://substackcdn.com/image/fetch/$s_!dy98!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 1272w, https://substackcdn.com/image/fetch/$s_!dy98!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dy98!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png" width="1456" height="812" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:812,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:439259,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/176896683?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dy98!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 424w, https://substackcdn.com/image/fetch/$s_!dy98!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 848w, https://substackcdn.com/image/fetch/$s_!dy98!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 1272w, https://substackcdn.com/image/fetch/$s_!dy98!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd95beb0d-8ae9-44ff-82e6-79b2c3e8466e_1593x888.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>We can glean three general directions for country-level deals from this:</p><ul><li><p><strong>Sovereign compute deals </strong>aiming to export enough infrastructure to run inference and training at scale, for a broad range of application and purposes. This is the Stargate UAE model: enough chips to do basically anything, with a well-understood interest on the side of UAE firm G42 to provide inference for the region and train its own models down the road.</p></li><li><p><strong>Narrow compute deals </strong>aiming to export enough infrastructure to run inference for specific applications and use cases. 
This is perhaps the Stargate UK model, which provides for only 20,000 H100 equivalents and earmarks them for specific purposes in the public sector.</p></li><li><p><strong>Software-focused deals</strong> that mostly create specific conditions to deploy or fine-tune existing software products in coordination with local stakeholders.</p></li></ul><p>This piece is mostly about the first two &#8211; deals that include some version of compute. Selling software is nice, and it opens inroads for future deals, but I struggle to see how, in itself, it makes for a particularly sticky export. It&#8217;s still a little too early in the technology&#8217;s development to determine where exactly switching costs usually accrue, but it&#8217;s my sense that the infrastructure layer is by far the most robust bet: it&#8217;s the most capital-intensive, the most deeply entrenched, and the hardest to resell and repurpose for other uses. If you buy a datacenter, you&#8217;re largely stuck with it. So I&#8217;ll focus the rest of this piece on export schemes that include some infrastructure component, and think about where their strategic refinement might go.</p><p>So <strong>what do you have to get right to build a version 2.0? </strong>I think it&#8217;s most useful to think about how to design export programs in terms of tensions between conflicting desiderata. 
I think the following three trade-offs map the terrain well.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1>Three Trade-Offs</h1><h4><em><strong>Switching Costs vs Sovereignty</strong></em></h4><p><strong>From the perspective of a developer, you&#8217;d want to deeply entrench your exports </strong>in a way that makes switching <em>away</em> from them down the road more difficult. 
There are some ways to achieve this, but the most obvious one is through infrastructure. Infrastructure deal components create switching costs on two levels: first, the contractual stipulations on the use of the exported infrastructure may explicitly require the use of your models and no others. Second, you may export infrastructure that is particularly good at running your models, so that switching would cause inefficiencies and incompatibilities even without any objectionable contractual features.</p><p>The second element is an underrated reason for AI developers to get their hands on &#8216;in-house silicon&#8217;, i.e. proprietary chips specifically developed to run their own models. Google is the obvious leader in that space; current generations of their TPU chips seem to provide genuinely competitive performance. This is also a lens through which to view the OpenAI-Broadcom deal: next to general diversification away from Nvidia, it specifically enables OpenAI to export strongly path-dependent infrastructure bundles.</p><p>But all these <strong>techniques to drive up switching costs</strong> <strong>trade off against importers&#8217; wish for sovereignty. </strong>Many countries in the world have an appetite for a reliable supply of AI capabilities that does not leave them susceptible to foreign leverage or restrictions. Right now, OpenAI can get away with calling their Stargate-tier offerings &#8216;sovereign&#8217; compute because they&#8217;re far more sovereign than a ChatGPT Business subscription and a handshake with Sam Altman. 
But they&#8217;re often not seriously sovereign in the sense in which a country might want a sovereign supply of, say, energy or food: at a minimum, they require maintenance and the continued provision of models. And the higher you drive the switching costs, the less sovereign your exported solutions will be &#8211; infrastructure that leaves the buyer de facto dependent on a single developer&#8217;s exports constitutes a substantial dependency. Developers will have to thread the needle between designing exports sticky enough to be profitable and sovereign enough to be viable for buyers.</p><p>Next to the question of importer demand, the sovereignty trade-off also raises the question of how closely to align your program with the US government.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>US Alignment vs US Alignment</strong></em></h4><p><strong>The US government wants to be associated with AI exports</strong>. It fits both the broad goal of &#8216;dealmaking&#8217; and the narrow goal of &#8216;AI export promotion&#8217; aimed at maximising American tokens served. That gives developers a threefold incentive to steer close to the USG in their export efforts: they receive explicit financial and diplomatic support in the context of export promotion; they benefit from importers&#8217; interest in remaining in the US&#8217; good graces by making a deal; and they can be assured the USG won&#8217;t make life harder for them around compute export limitations. Because the export promotion framework is not finalised just yet, we don&#8217;t <em>quite</em> know how broad the administration&#8217;s interest in calling deals &#8216;export promoted&#8217; is &#8211; but there are some <a href="https://x.com/jacobhelberg/status/1978061674507567593">early signs</a> that most AI-forward infrastructure deals will be invited to fall under the umbrella.</p><p>On the other hand, <strong>close alignment with the USG is risky in dealing with many major importers</strong>, especially with regard to the sovereignty question outlined above. After the last few months of trade policy, some countries have grown wary of American leverage, and have in fact stated it&#8217;s their policy to diversify away from US influence over their supply chain. AI developers on their own might be able to claim they&#8217;re &#8216;exporting sovereignty&#8217; &#8211; I think this has worked somewhat decently to justify marginal datacenter deals in the UK and Norway, for instance. 
But the more overtly involved the USG is, the less viable this is. The OSTP Director and the AI Czar have repeatedly and publicly stated that making the world run on American exports is their strategic priority; if they then laud and push a given deal, it&#8217;ll be hard to convince importers that it actually helps their sovereignty. So on some occasions, it might be in developers&#8217; and the administration&#8217;s interest to <em>reduce</em> the affiliation of a deal with the USG.</p><p>How will developers balance that trade-off? Next to the target market, it also <strong>depends on their general relationship with the administration:</strong> if that&#8217;s <a href="https://www.anthropic.com/news/statement-dario-amodei-american-ai-leadership">rocky</a> to begin with, you might think you don&#8217;t have much to lose from disassociating just a bit &#8211; or even that the public record of your fraught relationship makes you more trustworthy in the eyes of skeptical importers. But if your relationship with the administration is strong and you work to maintain that, it&#8217;s hard to believe any import deal would be worth risking that relationship. 
The administration can also do its part &#8211; by strategically allowing some developers to position far away from it to ensure that American exports happen even to US-skeptical importers.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aLiS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png" width="450" height="112.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!aLiS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!aLiS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F566456e0-59af-40aa-9c88-7155cf4ad07c_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4><em><strong>Present vs Future Demand</strong></em></h4><p><strong>How do you sell AI to a largely uninformed world? </strong>You almost necessarily face a difficult trade-off. 
First, you want to sell capabilities that these countries want and need right now: there&#8217;s an actual seller&#8217;s market in some parts of the world, and you want to get in on that. But often, current demand is mistaken: many countries don&#8217;t actually <em>get</em> AI, and have no real plan for what to do with the datacenters you&#8217;ll build for them. Over time, their demand might shift substantially &#8211; and in five years, they&#8217;ll want something else entirely and perceive you as not having delivered on capabilities they actually need. There&#8217;s also a future market obscured by importers&#8217; ignorance: many countries that today still cling to unrealistic ideas of local champions or toothless adoption strategies will soon be in the market for large-scale AI imports. Servicing that future demand is much harder, because it involves identifying and eliciting it through arduous bilateral work; but it&#8217;s also much more attractive, because it promises far more lasting demand.</p><p>One specific version of this point is this: There are <strong>two competing approaches to selling technology to countries</strong>: providing a <strong>&#8216;turnkey solution&#8217;</strong>, a standardised product that provides for a set of capabilities; and <strong>regional customisation</strong>, where you create bespoke solutions and integrations by regionally deploying manpower and development resources. Both of these ideas borrow from highly successful past Chinese exports &#8211; the former from infrastructure, where China has been able to gain ground by offering immediate ready-to-go solutions for computing, transportation or energy, and the latter from software, where China has not just exported basic software products, but deployed engineers to deeply integrate them into existing regional software ecosystems, such as in Southeast Asia. This has entrenched the Chinese software stack across much of the region. 
In principle, both approaches have their merit &#8211; but there&#8217;s a natural temptation to make exciting deals and gravitate toward the easy solution today, at the cost of much deeper market capture.</p><p><strong>Trying to thread the needle between the two </strong>and selling robust infrastructure with real option value is perhaps the most durable pathway. It&#8217;s also what developers today will claim they&#8217;re doing when pushed on this trade-off. But it won&#8217;t always work. Different use cases frequently require different scales and types of infrastructure &#8211; for instance, a country that first wants large clustered training capacity and ends up needing low-latency decentralised inference capacity for manufacturing can&#8217;t be serviced through one robust project across time. There are many such divergences: between training and inference, between clustering and decentralisation, between frontier capabilities and narrow industrial integrations, between broad general-purpose exports and targeted public sector capabilities, and so on. At some point, you&#8217;ll have to choose &#8211; and it&#8217;s very risky to leave that choice to misinformed importers alone, lest you fail to keep up with their future demand. 
<strong>The most successful export initiatives might first help importers to develop the best sense of future demand, and work with them to meet it.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vhIs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vhIs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 424w, https://substackcdn.com/image/fetch/$s_!vhIs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 848w, https://substackcdn.com/image/fetch/$s_!vhIs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 1272w, https://substackcdn.com/image/fetch/$s_!vhIs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!vhIs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png" width="1456" height="778" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:778,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:157188,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/176896683?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vhIs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 424w, https://substackcdn.com/image/fetch/$s_!vhIs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 848w, https://substackcdn.com/image/fetch/$s_!vhIs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 1272w, https://substackcdn.com/image/fetch/$s_!vhIs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc69b348-c4c5-44e3-bc69-b7aff6837f69_1899x1015.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Exports to countries led by AI developers. The map is still empty; we&#8217;re still early.</figcaption></figure></div><h1><strong>Three Outlooks</strong></h1><p>These trade-offs carry threefold implications: for developers, for the American government, and for importers.</p><p><strong>On the developer side</strong>, the trade-offs might best be thought of as a pick-and-choose list of features to configure your export program around. 
There are a couple of obvious permutations of choices that line up nicely with preexisting strengths and weaknesses of some developers:</p><ul><li><p><strong>In the trade-offs between sovereignty and switching costs, some developers are simply better suited to capture some markets than others.</strong> Developers with good and abundant in-house silicon &#8211; perhaps Google &#8211; are well set up to create deep infrastructure-layer lock-in that will feel convenient, but not sovereign. Developers that lack this kind of compute &#8211; think Anthropic and perhaps some newcomers &#8211; can make a virtue of necessity and position themselves as partners of choice to countries with a particularly strong appetite for (the illusion of) sovereignty.</p></li><li><p><strong>Developers ready to go today can capitalise on their head start by saturating markets</strong> with substantial and somewhat informed current demand, taking quick wins to spin them into path dependencies, while developers that will still need some time to ramp up can take the time to conduct deeper bilateral negotiations, figure out personalised solutions and create deeper inroads with more regard for future demand. Or some developers might steer close to the US government and enter markets where strategic backing is valuable or necessary, while others keep their distance and expand into more US-skeptical markets instead.</p></li></ul><p>If you care to extrapolate some of these lines of thinking, I think you&#8217;ll agree that <strong>the incentives line up nicely with some of the leading developers&#8217; profiles.</strong> If they follow down the branches of their comparative advantage, I think we&#8217;re due for a productive division of the world of exports. But in the face of short-term market incentives it&#8217;ll be hard to resist competing for the lowest common denominator of cashing in on today&#8217;s most boosterish &#8216;sovereignty&#8217; ambitions instead. 
If runway allows at all, developers would do well to consider the advantages of creating a durable market instead.</p><p><strong>Second, there are lessons in these observations for the USG </strong>and its export promotion program: looking at where the market incentives point helps identify which parts of exports don&#8217;t organically happen under the current policy paradigm, and thus might need some more specific support. Two interventions seem promising:</p><ul><li><p><strong>Let some exporters stray away from obvious US affiliation. </strong>When faced with a choice between a country importing American AI outside the export promotion umbrella and the same country importing Chinese AI or launching its own sovereignty ambitions, the USG should still favour the former: it still depresses the Chinese stack, still retains indirect leverage via US regulation of exporters, and so on. And so allowing some developers plausible deniability to launch their own export programs that allow for a lot more sovereignty and a lot less USG affiliation ultimately serves the American strategic interest in capturing even markets that would otherwise lean towards importing from adversaries.</p></li><li><p><strong>Focus promotion efforts on strategically valuable, but economically less attractive packages. </strong>Building bilateral relationships with medium-sized markets is really hard and takes a lot more time than selling to someone who has already opened their sovereign wealth fund checkbook. Deploying engineers to integrate the application level into local ecosystems is much more burdensome than building a datacenter and calling it a day. It would be easy for the USG to subsidise the easiest versions of exporting, see them happen, and call it a win. 
But making inroads in countries that require more massaging should be a <a href="https://www.thefai.org/posts/the-closing-window-to-win-part-i-american-ai-leadership-requires-a-global-strategy-for-full">priority</a> &#8211; a good export program makes things happen that otherwise would not. That means focusing financing and diplomatic support on exactly these difficult deals, especially those that require demand elicitation and customisation at scale.</p></li></ul><p><strong>Third, thoughtful importers should make themselves more attractive targets for export deals.</strong> That doesn&#8217;t require aggressive spending or submissive negotiation as much as it requires strategic clarity. Because developers should be very interested in creating deals around lasting demand, but are stretched too thin to elicit that demand everywhere, simply having a clear strategic view of what capabilities you need is already an advantage for an importer: it signals to the exporters that going for a deal with you has strategic value. A similar logic applies to convening consortia; if your government can get together a group of strategically clear-eyed buyers with operationalised capability asks, the exporters have something to work with. </p><p>Ultimately, <strong>if you&#8217;re a middle-of-the-road importer</strong> without a huge sovereign wealth fund or gigawatts of energy infrastructure to deploy,<strong> you&#8217;re not the first port of call </strong>for an ambitious exporter by default. A cheap way to keep up is to <strong>realise what an exporter&#8217;s ideal deal landscape looks like, and then work to create it. </strong>Most of that is being able to tell a convincing story about why an export deal confers a lasting market advantage on a developer, and a decent strategic upside on the USG.</p><p>To get the best version of AI exports, we&#8217;ll need all three parties to an export deal to play their part. 
Ignoring the market incentives is a surefire way to fail at that: the obvious risk is that AI exports collapse into plucking only the low-hanging fruit, failing to proliferate capabilities well and leaving an opening for a Chinese export product down the road. Making international diffusion work requires seeing where the developers&#8217; incentives pull, and designing policy around that.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[The Devil You Know]]></title><description><![CDATA[A second look at the tech right]]></description><link>https://writing.antonleicht.me/p/the-devil-you-know</link><guid isPermaLink="false">https://writing.antonleicht.me/p/the-devil-you-know</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Thu, 16 Oct 2025 15:19:57 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/750b3b36-5d69-49ea-81b8-d6536e1fba82_8881x5526.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>AI politics feels on a knife&#8217;s edge.</strong><em><strong> </strong></em>A week after I <a href="https://writing.antonleicht.me/p/a-preemption-deal-worth-making">argued</a> that safety advocates and the &#8216;tech right&#8217; ought to strike a deal around federal preemption and frontier safety, the politics seem as volatile as ever. I&#8217;m reassured by the depth and breadth of positive response to Dean Ball&#8217;s proposal and my endorsement, and really do feel that movement is possible. But still, much strategic uncertainty remains, and it continues to erupt in rhetoric. 
On the safety side, people are wary of the tech right and hope for its imminent political failure. On the tech right side, people wonder: why compromise, when we hold all the cards?</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>So a week later, <strong>some on both sides mistakenly think themselves above a deal.</strong> As safetyists weigh what to make of Sriram Krishnan&#8217;s measured <a href="https://x.com/sriramk/status/1978470229056364797">articulation of grievances</a>, and as the tech right considers heeding its allies&#8217; warnings to make a deal, they both have a devil on their shoulder. Its destructive logic goes, &#8216;let&#8217;s take the fight and take our chances with the next guys&#8217;. Against that voice, I&#8217;ll defend and expand my case for deals and d&#233;tente in two arguments:</p><ul><li><p>First, that <strong>the tech right is already politically embattled</strong>, and would suffer from having to focus its stretched political resources on fighting the safety movement.</p></li><li><p>Second, that <strong>AI policy under the Trump administration is better with the tech right around</strong> &#8211; especially compared to any realistic alternative.</p></li></ul><p>At the end of a more vicious fight well into midterm season, we could end up with worse AI policy for everyone. 
We would see the tech right supplanted by a politicised populist alternative with a poor grasp of the technology and its promise &#8212; which in turn spells worse international diffusion, less AI-driven growth and progress, and a push of safety-relevant development further into secrecy and beyond public oversight.</p><p><strong>There is still another path.</strong> It requires some careful rapprochement: a safety movement that looks past the rhetoric to understand how it can prove its detractors wrong; and a tech right that realises that neither its political position nor its policy goals are served best by fighting safety advocates. Progress on this path begins with understanding the tech right&#8217;s current position.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!u5az!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><h2><strong>The Political Arithmetic of Tech-Right Influence</strong></h2><p><strong>You, too, might have heard the tales of the tech right&#8217;s ascendancy</strong>; of a successful capture of the US government and the subsequent passing of tech-friendly policy at every step. And much of that <em>is</em> true: The tech right elements within the White House, David Sacks in particular, have held on far longer and far better than observers had predicted &#8211; though Sacks&#8217; impressive media presence also plays a part in that. And by all accounts, important decisionmaking on technology policy has successfully been centralised around a few tech right decisionmakers at Commerce and the White House. From that center, they have defied pushes for stricter controls on exports to China, authored the AI Action Plan, and executed a controversial mercantilist foreign policy that sees US datacenters raised in the Emirati deserts and global leaders rushing to get in on tech deals with the President.</p><h4><em>Soft Power Only</em></h4><p>But <strong>this past presence and influence is still different from sticky political power. </strong>The tech right&#8217;s pitch to the President and MAGA&#8217;s political leadership has been to provide expert talent, commercial ties, economic credibility and an intellectual underpinning (as well as campaign funding, but that matters less on the margins than many think). This contribution was rewarded with outsized say on politically marginal, comparatively low-salience policy. This is a rare dynamic; the Christian right or jobs populists, for instance, come with a contingent of voters in states important for Republican majorities. 
That implicit leverage makes them powerful in Congress, means that giving them wins can be spun as electorally important, and makes them essential in election years. In comparison, the tech right by default is strongest when furthest from electoral math: there aren&#8217;t all that many software engineers, and they all live in San Francisco &#8211; you&#8217;re not going to win California because of the tech right, and Twitter gets no electoral votes.</p><p><strong>That contribution logic explains why the tech right is primarily empowered to ask for low-salience policies in return for its support:</strong> Higher-salience policy wins require fights or deals with politically more influential groups. For crypto, this is good news: no one other than people with crypto cares a lot about crypto, and so the broader coalition is mostly happy to let the tech right do whatever it wants. This <em>used to be the case</em> for AI as well, but AI is becoming very political very quickly. Intersections with many salient areas are starting to emerge: jobs and child safety are the current big-ticket items, but others will follow.</p><p>As a result, the <strong>modus operandi of the tech right in AI policy has often been to take marginal wins </strong>where they are available, marginally shift policy decisions on issues of lower salience, and carefully pick fights where they pose no outsized risks: around export controls contra an atrophied national security community far out of favour with the admin, or around AI buildouts foreign and domestic that fit well into a general deregulatory, business-friendly, deal-forward political agenda. That might be one of the reasons why it didn&#8217;t go to bat for the moratorium in July: as soon as the debate reached high salience, it seemed likely that the pro-moratorium side would lose. 
Because the tech right ducked out early, the crusade didn&#8217;t continue, and the anti-moratorium momentum evaporated before it could go searching for a culprit.</p><h4><em>Why the AI Safety Antagonism?</em></h4><p><strong>This dynamic, I believe, is also instructive for understanding the kinds of fights the tech right </strong><em><strong>does</strong></em><strong> pick. </strong>Many have been puzzled by Sacks&#8217;s repeated decrying of effective altruists and AI safety advocates, even though you&#8217;d think the more obviously luddite tendencies of jobs- and current-harms-related populists should offend him even more. In fact, the AI safetyists are perhaps the least left-aligned, most pro-technology camp among the tech right&#8217;s adversaries &#8211; so why do they seem like the primary target? One explanation is surely hard-to-parse intra-Silicon-Valley dispute. Another is the set of genuine mistakes, overreaches and hasty political alignments the safety movement has committed in the past, about which I&#8217;ve <a href="https://writing.antonleicht.me/p/ai-safety-policy-cant-go-on-like">written</a> <a href="https://writing.antonleicht.me/p/the-real-sycophancy-problem">extensively</a> <a href="https://writing.antonleicht.me/p/a-moving-target">before</a>.</p><p>But an underrated factor is this: there are no effective altruists in the GOP, no AI safety senators &#8211; so <strong>identifying safetyists as the culprits is politically safe</strong>. If Sacks turned the same rhetoric against more obviously technophobic ideologies, this would inevitably prompt an intra-party conflict. If he categorically dismissed concerns around jobs and mental health the same way he dismisses concerns around frontier risks, powerful Republican lawmakers would turn their scrutinising eye to the tech right&#8217;s presence in the President&#8217;s orbit. The tech right has an interest in avoiding that, and thus looks for enemies outside of the coalition. 
I understand that this explanation might not pacify safety advocates &#8211; but I do think it adds important strategic nuance for interpretation.</p><h2><strong>Political Headwinds Can Rise To A Storm</strong></h2><p>This general political logic means that, as AI becomes more and more salient, the tech right&#8217;s influence is at risk. 
The maneuvering space afforded by dodging around the most salient issues is shrinking by the day, as mainstream politics bleed into AI policy and mainstream policymakers begin taking over the discussion. Most obviously, <strong>for any little issue in AI policy, you&#8217;ll soon find vocal opposition on the other side. </strong>Once increasing adoption spells even momentary labor market impacts, lawmakers will fight you on statist, job-protectionist grounds. Once datacenter buildouts&#8217; impact on electricity prices becomes visible, local lawmakers will fight you on cost-of-living grounds. And while the tech right can dodge away from any one single policy fight, at some point there won&#8217;t be much policy to make any more without taking losing fights. That trend will put the tech right in more and more direct opposition to other members of the coalition. In that conflict, the safetyists are yet unaccounted for &#8211; their sway and funding could strengthen the resistance the tech right faces at every juncture.</p><h4><em>A Burden Come 2028</em></h4><p><strong>There&#8217;s also the potential of more direct electoral political upheavals</strong>: the election in 2028 casts a long shadow over the present day. For the GOP&#8217;s electoral prospects, the tech right &#8211; not as a policy platform, but as a group of people &#8211; could quickly turn out to be a burden. If AI becomes more and more important, it makes for a glaring vulnerability in the GOP&#8217;s 2028 pitch: With the tech right in good graces and visible positions, the party will inevitably be cast as having sold out American workers to tech billionaires. 
Tech-right-specific angles will abound &#8211; casting the protagonists as out-of-touch, as transhumanists and successionists.</p><p>There are two potential drivers for this trend: It might be a Democrat line of attack, especially if the populist left clinches the candidacy, which could then force GOP leadership to visibly distance itself from the tech right. Or it could come from inside the party, as the presence of the tech right opens presidential hopefuls up to an anti-AI primary challenge. Already today, Senator Josh Hawley seems to be gearing up for an even more <a href="https://www.hawley.senate.gov/icymi-senator-hawley-warns-ai-threatens-the-working-man-at-natcon5/">explicitly anti-AI</a> next stage of his political career. If the administration, and with it the likely presidential candidate Vance, stays associated with the tech right, it might just be quite vulnerable over an administrative record of supporting and endorsing it. That is a much stickier risk for the tech right than any policy position: you can dodge away from losing policy battles, but if <em>who you are</em> becomes the problem, little room to dance remains.</p><p><strong>The midterms might be one obvious catalyst for that trend to pick up speed. </strong>Right now, AI as an issue is only suitable for extraordinarily vague polling, owing to its low salience. But post-midterms, a lot more information about the political viability of different platforms will come into view; campaigns start thinking strategically about issue viability, and start considering which policies have been boons and burdens. My suspicion is that AI as a technology, economic input and cultural artifact will remain deeply unpopular, and correlations around that fact will start showing up in the data more prominently.</p><p><strong>If all this comes to pass, the GOP coalition can be vicious to its own. 
</strong>Next to all these reasons of electoral strategy and coalitionary dynamics, the tech right is currently still keeping a lid on latent animosities among the rest of the GOP coalition, with many members suspicious of their purity of faith and allegiance. By all accounts, David Sacks in particular has done very well to maneuver around the White House, its senior staff and the MAGA faithful. But spurred by political volatility, these things can all still come crashing down suddenly &#8211; just ask the many well-connected administration officials that found themselves &#8216;Loomered&#8217; on the tail end of falling out of favour with the core MAGA movement.</p><h2><strong>The Limits of PAC Money</strong></h2><p><strong>The tech right, too, is reading this writing on
the wall. </strong>Perhaps partially in response, it has decided to &#8216;PAC up&#8217; &#8211; it is now armed with a near-unprecedented vehicle for political spending, a $100 million super-PAC called &#8216;Leading the Future&#8217; (LTF). This will be a force to reckon with, no doubt. But it might not be effective in staving off the most important political threats, especially if it is spent on fighting safetyists.</p><p><strong>People like to compare LTF to Fairshake</strong> &#8211; the super-PAC that has seen astonishing successes in crypto policy by following a playbook of aggressive, broad, highly political spending that advanced Congressional crypto champions and disincentivised would-be critics from interfering. But in large part, Fairshake worked so well because the price of acquiescing to its demands was always very low: few policymakers had genuinely strong feelings on crypto, and even more importantly, there was never a big electoral incentive to go against Fairshake. So with just a few high-profile victories, Fairshake managed to create a sense of fear that no one was incentivised to challenge. </p><p>But <strong>replicating the Fairshake playbook in AI policy will be difficult</strong>. For one, salience is so much higher: voters will actually care about many AI-related issues. That makes the trade-off much less one-sided, because the threat of PAC money alone might not be enough to stave off anti-AI positions if they&#8217;re politically lucrative enough. <strong>To move the needle on AI, you actually have to spend on the races you care about. </strong>That approach is far more limited: You can target some Democrats for defeat, but that doesn&#8217;t solve the problem of intra-party pressures. Targeting Republicans is harder, because it presumably restricts your spending to primaries if you don&#8217;t want to get into coalition trouble by bankrolling Democrats. 
And it can backfire, because markedly anti-tech figures can get a lot of political mileage out of portraying themselves as &#8216;targeted&#8217; by big tech. This issue is even further complicated by the real prospect of counter-money; push too hard, and you&#8217;ll find that a lot of political money cares about the tech issue &#8211; and you might find your favoured candidates going up against both populist-right and tech-Democrat money. </p><p>Now, rumour has it the super-PAC will devote quite some time and effort to targeting AI-safety-aligned policymakers and initiatives. That sounds more likely to be effective, but I do not see how it addresses the tech right&#8217;s main challenge, which comes from the political pressures within its own party. In fact, it further exposes the tech right: by deepening fights that are peripheral to the main challenge, and by inviting safety advocates to focus on defeating the tech right. You might still think this is worth it, if you thought the safetyists were an existential threat. But as I&#8217;ve argued in depth <a href="https://antonleicht.substack.com/p/a-preemption-deal-worth-making">last week</a> &#8211; and Dean Ball has <a href="https://www.hyperdimensional.co/p/the-future-and-its-friends?r=1iq90y&amp;utm_medium=ios&amp;triedRedirect=true">stated</a> much more eloquently &#8211; that does not need to be the case, and so the tech right could still pivot away.</p><p>In short: if you see PACs as the main vehicle to save the tech right, and fighting safety advocates as a waste of PAC money on peripheral threats, you&#8217;d also consider the deepening conflicts with safetyists a liability &#8211; and might think about how to free up precious money and resources from that conflict. 
But right now, the tech right is at risk of trying the Fairshake playbook where it will not work, and beating yesterday&#8217;s enemies instead of tomorrow&#8217;s.<strong> That will not be enough to address the deeper political threats.</strong></p><h2><strong>As Good As It Will Get</strong></h2><p><strong>What should we make of this trend? </strong>Beyond animosities on X, intelligent observers do have some substantial reasons to disagree with tech right statements and policies on exports to China, on frontier risks, and on the prospect of fast progress toward advanced capabilities. 
Seeing the political future laid out above, they might sense a window to strike and supplant the tech right. But they would do well to investigate the alternatives first.</p><h4><em>Who Else If Not The Tech Right?</em></h4><p>Right now, <strong>I am not convinced the &#8216;tech right&#8217; would be replaced by something better</strong>: not by the standard of wanting to get AI right very generally, and not even by the standards of ardent safetyists.</p><p>That is in large part because <strong>the tech right&#8217;s most likely replacements are populist types with a spurious-at-best grasp of the technology </strong>and its ramifications. To understand why that&#8217;s the realistic alternative, look back at the conditions that <em>would</em> see the tech right pushed out: a political environment that is supercharged by economic, cultural and social anxieties around AI. In that scenario, the administration would be incentivised to make visible concessions to this politically motivated crowd &#8211; if not outright through shifting the issue to the direct jurisdiction of Miller-type MAGA faithful, then at least by ceding much more influence over the issue to congressional Republicans with a track record of inane rhetoric on AI. At minimum, they won&#8217;t be willing to engage the technical discussion in the same way that the tech right has.</p><p>Put another way: the only currently realistic way that other &#8216;reasonable&#8217; forces could take over the tech right&#8217;s mantle within the administration would be if that change was prompted by a low-salience shift in technocratic preference.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> But due to shrewdness, money and influence, the tech right will not be easily displaced in a low salience environment &#8211; so in any environment in which the tech right is displaced, AI policy could well get worse. 
That&#8217;s for reasons of policy and of political dynamics.</p><h4><em>Reasons of Policy</em></h4><ul><li><p><strong>International diffusion.</strong> With its market-share-driven push to sell AI systems abroad, the administration has committed itself to a fundamentally export-friendly approach to AI trade. It is hard to overstate just how contingent this outcome was &#8211; especially as the Trump administration is moving to retreat from some of its deeper-integrated trade relations. It makes for bright prospects for many middle powers, who can now, in principle, buy frontier AI capabilities. This incentivises them to find their own economic contribution to an AI-driven economy, but does not put them on the death ground that would have been implied by a highly securitised or isolationist paradigm. If you compare this AI foreign policy with many other areas of the administration&#8217;s trade policy (which might take its place if the tech right were removed), I think you&#8217;ll find it extraordinarily mutually beneficial. I rate this issue very highly for reasons I&#8217;ve described in greater depth elsewhere &#8211; permissive exports cut the world in on AI-driven progress and growth, and they lock out China from its path to AI-strategic victory through international diffusion. Looking at the track record of the broader GOP coalition on these matters, a populist replacement would almost certainly reverse this stance.</p></li><li><p><strong>Domestic economic diffusion.</strong> For many reasons, I believe it is important to diffuse AI technologies through the American economy quickly and seamlessly, even at the cost of some transitory disruptions. I of course think this is good and valuable for often-explained reasons of growth, progress and American competitiveness that don&#8217;t need restating. 
But I also believe this because of the labor market politics specifically: hastening diffusion of augmenting and productivity-boosting AI technology is essential to insulate the labor market against displacement, either from full automation or from augmented workers outside US borders. I do not believe that most job-concerned members of the populist right appreciate this nuance: from everything they have said and shown, I strongly suspect they&#8217;ll veer toward friction-inducing regulation that comes back to hurt the workforce down the road.</p></li><li><p><strong>Frontier safety.</strong> This is the trickiest one &#8212; and it depends greatly on how much you think safety would be helped by slowing down deployment in general: if you do, you&#8217;ll favour having broadly anti-AI forces in charge. I don&#8217;t share that view, and instead would point out some other factors: For one, the <a href="https://www.whitehouse.gov/wp-content/uploads/2025/07/Americas-AI-Action-Plan.pdf">AI Action Plan</a> is the playbook for the coming years, and it genuinely <a href="https://thezvi.substack.com/p/americas-ai-action-plan-is-pretty">delivers</a> on some safety-relevant topics. I understand it&#8217;s hard to look past many of the theatrics, and plans are not yet policy &#8211; but it does count for something that OSTP has put pen to paper and the Action Plan (and not the Techno-Optimist Manifesto) is what came out. It&#8217;s not enough on safetyist metrics, but I believe it counts as evidence of future potential. 
In addition, the tech right does have a fundamental understanding of the technology and its drivers &#8212; which might count for a lot when reacting to serious safety-relevant developments.<br><br>Now, I understand that safety advocates are excited about the populist forces on the basis of safety-focused proposals in the <a href="https://www.hawley.senate.gov/hawley-blumenthal-introduce-bipartisan-ai-evaluation-legislation-to-put-americans-first/">Hawley-Blumenthal bill</a>; and that the &#8216;AGI-pilled&#8217; nature of the CCP Select Committee has revived some hopes in a national-security-led approach to frontier safety. But congressional self-promotion through floating unlikely bills is a categorically different beast from actual lawmaking. Right now, lawmakers like talking about AI in these terms <em>because</em> it&#8217;s filling a gap the admin is leaving, <em>because</em> the political salience and upside is high, and because they face fewer constraints. But if they moved from that role into the very different incentive space of actually <em>governing</em>, I&#8217;m not sure how their incentives would shift. Until then, my high-level view remains that empowering populists based on their current, incidental commitments always carries a risk. If there were an alternative path to frontier safety compromise with the tech right, it would be worth taking instead &#8212; and I still think there is.</p></li></ul><h4><em>Reasons of Political Strategy</em></h4><p>Next to the policy reasons, you should be mindful of the strategic counterfactual.</p><p><strong>First, I do not think you are sure to get H20 export restriction back</strong> even if you supplant the tech right. This is a big crux: If you push many of the most reasonable voices in frontier AI hard enough to justify their dislike for the tech right, the conversation ends up at the H20 decision and fears of subsequent B30A export permissions. 
I&#8217;m a bit <a href="http://antonleicht.substack.com/p/the-strategic-case-for-h20-chip-exports">less sure</a> of this myself &#8212; but let&#8217;s grant the argument for now. While Nvidia might initially have found a buyer for the pro-export argument in the tech right, I don&#8217;t think the influence still runs through these channels. Nvidia is the most valuable company in the world, its business model drives a sizable part of US economic and stock market growth, and its continued success is essential to staving off volatile economic developments. By all accounts, Jensen Huang has leveraged this position to deepen his relationships with top political leadership. By economic necessity and political relationship, the influence of Nvidia now extends deep into the Oval Office, well past the niche areas of AI and tech policy. Tomorrow&#8217;s OSTP would face the same constraints, and might not be so likely to restrict inference exports to China after all. </p><p>And lastly, <strong>the tech right sustains an important equilibrium: As long as it&#8217;s around, the ruling coalition will remain fundamentally divided on matters of domestic AI policy. </strong>However far the administrative consolidation proceeds, congressional Republicans, driven by electoral incentives, still skew more sympathetic to the populist case against AI. That puts any legislation, but also much executive action, at the end of a tug of war. That gives others &#8211; dear readers &#8211; room to maneuver, politics to make. It means you can sometimes tap into one camp to derail the efforts of the other; it means things take longer so you can mobilise opposition; and it means there is some incentive for coalition-building outside strict GOP party lines. 
The replacement of the tech right elements would align the majority of electorally motivated congressional Republicans far more closely with the White House, creating a unified front and force for legislation that makes policymaking from the outside that much more difficult.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> <strong>Even if you dislike the tech right, I suggest you appreciate its contribution to a malleable environment.</strong> I&#8217;m not sure if all of this ultimately reconciles any ardent critic with the tech right. But I do think it should inform your judgement on the risks and rewards of plotting to remove them.</p><h2><strong>Back From The Brink</strong></h2><p>All in all, I think the above logic reaffirms last week&#8217;s point: a deepening fight between safety advocates and the tech right works to both sides&#8217; detriment. The most immediate implication of that is still this: there is mutual incentive for safety advocates and the tech right &#8216;accelerationists&#8217; to pursue a narrow <a href="https://www.hyperdimensional.co/p/be-it-enacted">deal</a> exchanging federal preemption for frontier safety regulation. But it might be valuable to sketch out the mechanics of the underlying rapprochement in greater detail.</p><h4><em>What Safety Advocates Would Need To Do</em></h4><p>On one hand, <strong>there will soon come points where others in AI policy will have opportunities to seriously weaken the tech right </strong>&#8211; to hasten the trends I describe, and to empower the populist forces. This is particularly true of safety advocates, whose political power within this narrow trade-off is set to rise: with increased public salience and skepticism, safety advocates will find themselves momentarily more influential in their coalitions, and able to point money, public outrage and political attention at the tech right. I don&#8217;t know that much policy will come of it, but it would definitely cause real damage.</p><p><strong>Past conflict will motivate safety advocates in particular to deepen that fight</strong>, and the promise of narrow success will do the rest. 
But even if you like the odds, the ugly fight might not be worth the outcome: even in low-level skirmishes today, battle lines solidify in ways harmful to the safety movement, such as when Twitter fights break out and fairly uncontroversial legislation in California gets caught in the crossfire. Still, an all-out PAC-funded fight around the midterms would be a good way to waste both sides&#8217; political power on mutual neutralisation.</p><p>Whether for that reason or the substantive policy concerns above, <strong>I&#8217;d hope that the safety advocates show some restraint</strong>: do not overreact to social media outbursts that, however offensive, have many explanations; work toward deals that can actually deliver on narrow frontier safety priorities; and resist hastily entrenching factional battle lines between parties, companies and issues that do not need to harden yet.</p><h4><em>What The Tech Right Would Need To Do</em></h4><p>But for this to even be remotely viable, the tech right, too, would need to move. I know this publication frequently asks a lot of safety advocates &#8211; but I can&#8217;t reasonably suggest they refrain from attacking the tech right while the tech right remains on the hunt for AI safetyists. To be able to act on this advice, safety advocates will need more solid evidence that the tech right&#8217;s attacks are contingent: right now, they see themselves at risk of becoming the frog carrying the scorpion across the water.</p><p>So by the same token, I think that <strong>tech right hostilities pointed at the safety ecosystem specifically are counterproductive</strong> &#8211; not least because they invite increasingly destructive retaliation. From many conversations I&#8217;ve had since laying out my case for the preemption deal last week, I&#8217;ll reaffirm that the willingness for compromise and disarmament on both sides is there &#8211; <em>please </em>do not discount that fact based on Twitter vibes alone. 
But if the tech right does not heed the writing on the wall and instead turns more vicious under pressure, the paths toward reconciliation will close. Yes, the tech right can turn LTF and its public profile against safety advocates and lock them out of the halls of power, and in some narrow sense, this is a position of power &#8211; the dynamics have to reflect this asymmetry.</p><p>But against the political headwinds it&#8217;s facing, <strong>the tech right needs to reconsider who its primary enemies are. </strong>Well-established friends of the tech right &#8211; <a href="https://x.com/neil_chilson/status/1976618436995293410">Neil Chilson</a> and <a href="https://www.hyperdimensional.co/p/the-future-and-its-friends?r=1iq90y&amp;utm_medium=ios&amp;triedRedirect=true">Dean Ball</a>, for instance &#8211; have suggested criteria for meaningful distinction. If not my warnings, the tech right should heed its friends&#8217; calls for caution. Doing so means realising the safety movement is, in many meaningful ways, different from the pro-regulatory left-wing forces at the gates. And it means searching for ways to reconcile the movement&#8217;s best ideas with the tech right&#8217;s own agenda, instead of for reasons to dismiss them out of hand. I read <strong>Sriram Krishnan&#8217;s recent <a href="https://x.com/sriramk/status/1978470229056364797">post</a> as a first step enumerating how the safety movement would have to change.</strong> It&#8217;s worth taking this post in particular very, very seriously, both as a matter of substance and of communicated openness. Safety advocates should give serious responses, and the tech right should take these responses seriously. From the tech right&#8217;s position of relative power, this will be all safetyists can get for now. 
I think it&#8217;s a better start than any alternative, but I do understand why some would disagree.</p><h4><em>D&#233;tente </em></h4><p><strong>The safety movement and the tech right can both unilaterally choose to carry each other deeper into a political fight that will see them both marginalised.</strong> This essay puts to both of them the extraordinarily difficult ask of not throwing another blow at a staggering opponent. Getting overexcited at the prospect of beating old enemies can very quickly lead to missing the bigger picture, and overindexing on Twitter rhetoric risks obscuring the pitfalls of upheaval. An existential conflict around the tech right&#8217;s future might feel impossible for some, inevitable for others, and satisfying for many. It&#8217;s not.</p><p>There is no safe path for the tech right to get rid of the safetyists without compromising its own position, and no safe path for the safetyists to replace the tech right with something better. <strong>And so I believe both sides are stuck with the devil they know, and better make the best of it.</strong></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Or, I suppose, by a security-relevant warning shot shifting ownership of the AI issue back to NSC and DoD, but you can&#8217;t plan for that.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" 
target="_self">2</a><div class="footnote-content"><p>Perhaps this changes with the midterms. But hoping for the midterms as a core political strategy has many flaws.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[A Preemption Deal Worth Making]]></title><description><![CDATA[Making the best of borrowed time]]></description><link>https://writing.antonleicht.me/p/a-preemption-deal-worth-making</link><guid isPermaLink="false">https://writing.antonleicht.me/p/a-preemption-deal-worth-making</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 08 Oct 2025 14:15:14 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/6ac83954-e26c-4360-8c55-b20f7fb51a59_2196x1812.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>The default trajectory of AI politics is grim.</strong> As political salience rises, entrenched groups with a spurious grasp of the issue will grow in power, exposing the once-sophisticated field of AI policy to vociferous debate and cheap politics. Yesteryear&#8217;s major AI policy factions &#8211; the so-called &#8216;accelerationists&#8217; and &#8216;safetyists&#8217; &#8211; can quickly become marginalised within their respective coalitions. As their influence wanes, less and less policy will be based on a realistic view of this transformative technology. <strong>In striking a preemption deal, the field could find a chance to change that fate. 
</strong></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>Right now, faced with the threat of marginalisation, <strong>accelerationists and safetyists should have more in common than they&#8217;d like to admit.</strong> They share a fundamental understanding of the technology itself, and a desire to see its potential realised. Yet rather than reconciliation, we&#8217;re headed for escalation: on one side, an accelerationist camp faces increasing political headwinds, but is now armed with $200 million in super-PACs set to snipe away at safety advocates. On the other side, a safety movement is forced to mount expensive defenses and undertake increasingly risky attempts to garner public salience. Both sides carry a mistaken sense that they can win the ensuing fight. In reality, they&#8217;ll both lose to the broader political dynamics of increased salience, which will render them marginal elements of their respective coalitions.</p><p>Their resources would be better spent on keeping the frontier AI policy conversation on track, and there is a rare window to do just that. Dean Ball, not an orthodox safetyist by a long shot, has laid out the <a href="https://www.hyperdimensional.co/p/be-it-enacted">case and mechanics</a> from his view last week. I agree with its general thrust: <strong>the practical first step is to move toward a deal on preemption.</strong> That deal should attempt to <em>trade broader preemption of state AI legislation</em> for <em>narrow frontier safety measures</em>. It might or might not happen in this Congress. 
But here is why we owe it to our many shared goals to try.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!89l5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!89l5!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 424w, https://substackcdn.com/image/fetch/$s_!89l5!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 848w, https://substackcdn.com/image/fetch/$s_!89l5!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 1272w, https://substackcdn.com/image/fetch/$s_!89l5!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!89l5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png" width="1456" height="580" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:580,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3275133,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/175621704?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb36247af-4611-41d6-b5ec-ebf5a8340a90_2196x1812.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!89l5!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 424w, https://substackcdn.com/image/fetch/$s_!89l5!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 848w, https://substackcdn.com/image/fetch/$s_!89l5!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 1272w, https://substackcdn.com/image/fetch/$s_!89l5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc558685d-1778-4cdb-9a25-2e0951d0f639_2196x875.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">&#8216;The Peace Negotiations between Claudius Civilis and Quintus Petillius Cerealis&#8217;</figcaption></figure></div><h1><strong>Political Trends and Undercurrents</strong></h1><p>My conviction in this proposal comes from my certainty that things in AI politics are about to get a lot worse for everyone on the current spectrum of frontier policy. I suspect that warrants some explanation. Two trends drive that effect: First, <strong>as AI enters people&#8217;s lives, they start to notice and care about its effects.</strong> Some of these effects register more obviously than others; and the effects that intersect with voters&#8217; deepest concerns are most salient. In recent months, this has included the threat to human jobs and to child safety. 
It has not included any issue closely or obviously related to frontier safety.</p><p>The second effect is less transparent, but already taking root: <strong>policymakers notice the electorate&#8217;s anxiety around new technology and play to their fears</strong>, connecting them to more salient issues. They portray even current AI as a threat to dignity, jobs, health and safety and exaggerate harmful trends. This will get worse as party politics settle: the populist right is getting ready to tap into anti-tech sentiments, and the left wing of the Democratic party could find AI to be a promising campaign issue if salience trends continue.</p><p>Not all of these trends have to do with frontier AI policy. Many of them will manifest as discussions around application-layer product design, or rules on adoption of prosaic AI systems far downstream. But they will still inform policy windows on frontier action. To understand how, <strong>we should distinguish between two levels of debate: the narrow frontier debate and the broader AI politics</strong> we&#8217;re heading for.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!u5az!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" width="400" height="100" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:400,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Two Levels of Debate</strong></h1><p><strong>The first level of debate is the one along the old battle lines: safetyists against accelerationists. </strong>This is a constellation of forces that has not been at the heart of the most recent fights, but still features prominently in the people&#8217;s minds &#8211; perhaps because these were the battle lines of the fight around SB-1047, the first major policy skirmish in frontier AI policy. In this narrow debate and compared to where we were in California two years ago, the trends above absolutely favour the safetyists: safety regulation fundamentally feels like regulation, and anti-AI sentiment favours regulation. This is particularly clear where safetyists don&#8217;t advocate for one narrow policy, but oppose deregulation, which is where we were in the moratorium fight a few months ago. No one who cared about any of the harms wanted broad preemption with nothing in return, so everyone could be rallied to defeat it. Accelerationists may soon find it harder to resist regulatory momentum, as the coalition against blanket deregulation grows stronger. 
That&#8217;s why accelerationists should and perhaps do seek a deal on preemption today &#8211; this is the most publicly powerful they&#8217;ll be for some time.</p><p>But then there&#8217;s the <strong>second level of debate: AI policy that goes beyond anything to do with frontier safety</strong> and instead has much more to do with mainstream sentiments on tech. However, this broader salience does not imply that frontier AI policy is getting more likely. It&#8217;s very easy to get child safety groups and labor unions to agree that a 10-year moratorium is a bad idea, because it preempts everyone&#8217;s favorite idea. It will be much harder to get them behind specific policy asks &#8211; especially since, as AI policy gets more specific, asks increasingly diverge. There is no good reason for an organisation or policymaker who is mainly concerned about, say, labor impacts to endorse a policy that addresses frontier risks when there is also a version of the policy available that just narrowly responds to the most salient aspect of the issue. </p><p>I&#8217;ve written about this in much more detail with regard to <a href="https://writing.antonleicht.me/p/ai-and-child-safety-against-narrow">child safety</a> last week. The high-level point is: <strong>Most AI regulation that comes from this broader high salience really does not help with frontier risk. </strong>For any salient area like jobs or child safety, there&#8217;s an abundance of mediocre and easily-dodged policy that cannot conceivably move the needle on something as complex as frontier safety. Especially now that CAISI already exists and minimal transparency standards are already on the state books, there is very little natural convergence.</p><p>As ever, frontier safety advocates will need their own points of leverage to squeeze their ideas in on the margins. But the trends above do not confer this leverage on them: there are no obvious mechanisms by which the political salience of frontier safety policy increases. 
So I remain convinced that neither <a href="https://writing.antonleicht.me/p/do-you-need-a-wake-up-call">warning shots</a> nor <a href="http://antonleicht.substack.com/p/dont-build-an-ai-safety-movement">movement-building</a> will suffice.<strong> Counting on salience to directly increase the odds of frontier policy is mistaken.</strong></p><div><hr></div><h4><em><strong>Against Marginal Contributions</strong></em></h4><p><strong>Safetyists might defend remaining a member of a powerful coalition like so:</strong> &#8216;The marginalisation point might be true, but it&#8217;s the best we&#8217;ve got. By hanging on and aligning with the current harms coalition, we retain some say in its policy positions. That helps nudge policy that is chiefly about other things the right way, so that it&#8217;s also helpful for frontier safety.&#8217;</p><p>For instance, the argument goes, when a child safety law is being passed, safetyists could make sure it empowers CAISI more generally; or when a jobs law is passed, safetyists could make requiring greater transparency into developers&#8217; business practices part of the deal. But this undersells the price of nudging policy. Passing any law is a complex process, and any line of a deal that does not have substantial leverage behind it is liable to be cut at any time. In triangulating lobbied interests and policymaker idiosyncrasies, many things quickly fall by the wayside. If a bill is close to the finish line, and cutting a frontier safety concession is necessary to get industry on board, or to avoid an idiosyncratic policymaker jumping off, can safety advocates really be sure that other groups will go to bat for them and make a sacrifice or risk progress at large? I see few reasons to believe that today: safetyists are capable and well-funded, which makes them a valuable coalition member &#8211; but as the balance of power continues to shift, this contribution will matter less and less. 
<strong>Being a comparatively lower-leverage member of the coalition is unlikely to translate even to marginal policy wins &#8211; </strong>especially given the super-PAC dynamics I&#8217;ll discuss below.</p><p><strong>The distinction between the two levels of debate is a frequent source of confusion: </strong>Many safety advocates who think they might be headed for better policy prospects overindex on the fact that safetyism&#8217;s relative power is rising compared to accelerationism&#8217;s. They see that accelerationists&#8217; power will decrease, and see no reason to give them a win by accepting broad preemption now. But that does not account for the fact that the influence of this entire sub-debate on AI policy as a whole is radically diminishing. As a result, <strong>safetyists overestimate their likely gains from public salience</strong>. Safetyists and accelerationists alike should instead <strong>consider the shift of the broader debate as a forcing function</strong>: Because both camps risk getting swept away soon, they can and should compromise now and get narrow safety legislation in exchange for some preemption on the books.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!u5az!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" width="400" height="100" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:400,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>The Other Forces Bearing Down</strong></h1><p>Before fleshing out what that might look like, <strong>there are two important exogenous factors: the upcoming midterms, and the upcoming influx of money into the debate through super-PACs. </strong>Both already cast their shadow today: the former makes safetyists hesitant to strike a deal, and the latter might make accelerationists miscalculate.</p><div><hr></div><h4><em><strong>Midterm Prospects</strong></em></h4><p>The first is the midterms. <strong>Some safety advocates are excited about the prospect of safety-sympathetic Democrats winning</strong>; others feel fatalistic about an even more split Congress. I don&#8217;t want to discuss partisan politics at length; I&#8217;ll just point out that even if Democrats are sympathetic, a partisan bill is unlikely to pass the Senate and become law anyway. My main concern with any midterm-focused strategy is instead that the 120th Congress will be sworn in on January 3rd, 2027. For one, that&#8217;s a long time from now if you measure it in global compute supply, AI model generations or progress toward the automated software engineering that safety advocates dread. 
But more importantly, it&#8217;s much further down the road of politicisation. The midterms themselves can make AI politics a lot more vociferous; maybe because anti-AI rhetoric around whatever issue is then current already emerges as a promising campaign strategy before the election, maybe because political strategists will identify its potential right after and make politically tailored action a strategic priority in the lead-up to 2028. Either way, <strong>after the midterms, the relative power of the accelerationist-to-safetyist spectrum will be lower, the politics trickier, and the windows for frontier policy smaller</strong>. I think it&#8217;s not worth rolling the dice on that prospect.</p><div><hr></div><h4><em><strong>A Tale of Two PACs</strong></em></h4><p>The second is the recently announced super-PAC &#8216;Leading the Future&#8217; (LTF), a joint vehicle of a16z and OpenAI&#8217;s Greg Brockman &#8211; squarely an accelerationist vehicle. Meta&#8217;s state-level American Technology Excellence Project might act in a similar vein, but for a simple model of what happens next, I&#8217;ll focus on LTF. With an opening salvo of $100 million, it is exceedingly well-funded, and will reportedly draw on the highly successful tactics of the crypto super-PAC Fairshake. It&#8217;s by many accounts very likely that LTF will default to setting its sights on fighting safetyists.</p><p><strong>This super-PAC can rapidly erode safetyists&#8217; standing in Congress.</strong> It&#8217;s hard to overstate how devastating the effects of that might be. In the AI policy setting leading up to the midterms, all LTF really needs to do is to call Congressional offices and say &#8216;I have $100 million of PAC money, and I&#8217;m happy to spend it on ugly primaries and heavy attack ads &#8211; remember Fairshake? 
So, we just wanted to make sure you were not talking to any of the following groups&#8217;, at which point they&#8217;ll simply read a list of policy organisations that have been vocally supportive of moratoria in the past. And because frontier safety is clever and reasonable, but not particularly salient, it simply doesn&#8217;t make sense for most offices that receive that call to keep in touch with safety organisations.</p><p><strong>Safety advocates sometimes argue that public salience insulates against PAC tactics. </strong>This is clearly true: Fairshake has worked so well because nobody really cares about crypto, so defying the PAC is never worth the risk for a lawmaker. But money in politics has diminishing returns when it brushes up against higher public salience, and is usually not enough to get policymakers to pass deeply unpopular policy. That is a very good reason why LTF can&#8217;t prevent policies that draw on very high public salience.</p><p>But that likely makes it even worse for the safety movement. On the level of organisations, it means that LTF can&#8217;t actually keep offices from talking to groups that directly stand for high-salience issues like child safety or labor. And on the level of policy, imagine a big AI regulation Christmas tree bill that deals with some salient issue and also has frontier-related provisions tacked on. LTF might not be able to defeat that bill altogether, because it draws on too much salience. But it can pick off single clauses that are not as politically vital. The frontier-related provisions are a prime target for that. 
On the safetyist logic of being a marginal member of a powerful coalition, LTF means you&#8217;re very unlikely to get your marginal contributions through.</p><p>This makes for a particularly insidious secondary effect: <strong>it can wedge the safetyists away from their broader coalition.</strong> In the worst case, increasingly powerful groups around child safety or jobs could come to believe that being close to frontier safety organisations might attract accelerationist lobbyists on the other side &#8211; lobbyists that perhaps know to stay away when the easy-pickings safetyists aren&#8217;t in the room. How long after that realisation until more and more coordination calls happen without a safetyist in the room? <strong>None of this is happening just yet, and it can be averted</strong> &#8211; paradoxically, prudent action on preemption might actually keep safetyists in the room by satisfying LTF&#8217;s most reasonable demands and deflecting the rest of its spending elsewhere. Increased salience insulates other policy agendas against LTF, but makes safetyists comparatively even more vulnerable.</p><div><hr></div><h4><em><strong>The Limits of Counter-Money</strong></em></h4><p>The second response safetyists sometimes give is that <strong>they might counter money with money. </strong>Forgive me for being vague, but the overall rationale will be familiar to many readers &#8211; see this <a href="https://firstscattering.com/p/diminishing-returns-in-dc">post</a>, for instance. Deploying safetyist money is a good way to reduce the immediate influence of LTF in some direct respects: it can help fight ugly elections, champion individual policymakers, and reassure on-the-fence lawmakers who feel threatened. But it can&#8217;t offset the amount or breadth of money LTF can deploy. At any realistic ratio of accelerationist to safetyist money, safetyists cannot be everywhere the accelerationists are. 
They can&#8217;t offer to protect everyone, and there will be plenty of lawmakers who would simply prefer their elections not to become a battleground of AI money &#8211; no matter who eventually wins. Most of all, the lingering threat of a super-PAC with plenty of room to escalate its funding and a track record imported from the Fairshake days cannot truly be cancelled.</p><p>But much more importantly, using any potential safetyist money for mitigation is a highly inefficient use of resources. This is money that could be used to stay in the room as the politics intensify, money that could be a valuable token to contribute to any future coalition that emerges. In the face of tomorrow&#8217;s politics, <strong>both accelerationist and safetyist money could surely find much better uses than escalatory infighting</strong>. Both camps should endeavour to free up money from a cycle of defensive and offensive spending &#8211; a deal does just that. The funding will be sorely needed to stave off many of the worst impulses that will result from a broader debate on AI politics.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!u5az!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png" width="400" height="100" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:400,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!u5az!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!u5az!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!u5az!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!u5az!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1ce0779-dbc7-421e-a23f-46852bbf704d_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>The Path Forward</strong></h1><p>What, then, is the alternative?<strong> </strong>On the face of it, the proposal on the table is about a specific policy deal: <strong>Federal frontier safety laws in exchange for a preemption of some state AI legislation. </strong>The safety laws would get to be deep, the preemption would get to be somewhat broad, i.e. relate both to frontier safety and <em>some</em> other issues that seem particularly likely to produce a burdensome patchwork. </p><p>That deal has something that safetyists like, in that it would regulate frontier AI at the federal level. And it has something that accelerationists like, in that it would preempt the worst instances of a state-level patchwork. Whatever headwinds the future holds, <strong>this would frame the next years of AI policy</strong> &#8211; in essence establishing an acceptable backstop for both sides that otherwise seems under threat. Getting it right will take some triangulating.</p><div><hr></div><h4><em><strong>Process &amp; Progress</strong></em></h4><p>None of the progress toward this deal can happen publicly, at first: committing to the idea of a deal is tricky territory. First for coalitionary reasons: if the deal falls apart, but safetyists have shown willingness to negotiate with accelerationists, they&#8217;ll pay a price within their coalition. Second for reasons of negotiation capital. 
Even now, safety advocates might read the fact that Dean is floating the idea of a deal as a sign that the accelerationists aren&#8217;t strong enough to just push through preemption alone. A similar public push from the safety side might lead the accelerationists to read the safetyists as desperate. <strong>Both camps can and should stay subtle about this prospect right until it can actually happen.</strong></p><p>But a process can start behind the scenes. Less coalitionary-committed safety organisations can communicate openness. People can get in a room and hash out some details. Safety-affiliated policy researchers can make a public counterproposal, and suddenly we have an option space between that and Dean&#8217;s suggestion on the table. And then negotiations can begin, people can talk to their favorite Congressional offices, and the people that get in rooms gradually become more senior and take things more seriously. And perhaps, if it looks like the stars align, sponsors can declare and Congress can start moving. More coalitionary-bound safety organisations or a16z would never have to publicly commit: they can voice mild alibi reservations, but call offices and signal they aren&#8217;t fighting anything, in much the same way that the accelerationists ultimately shrugged and accepted SB-53. It&#8217;ll cost them in their respective coalitions, but perhaps stops short of visibly throwing established allies under the bus. Quietly, away from the spotlight,<strong> a still-nimble policy environment, backed by the threat of super-PACs, might just get this through before the midterms.</strong></p><p><strong>The frontier safety element would have to be a bit more extensive</strong> than Dean suggests &#8211; I see no other way. One of the main contributions of Dean&#8217;s proposal is codifying SB-53, and that will not be enough to get safetyists to risk their coalition: the risk of it being preempted currently does not seem high enough to make safetyists move. 
Accelerationists are seeking substantial concessions here &#8211; and so I suspect that at minimum, something on the scale of fairly hands-off pre-deployment testing or minimal entity-based regulation would need to be included. </p><p><strong>Safetyists, in turn, must commit to keeping the scope of regulation narrower than the scope of preemption.</strong> Any favour-currying Christmas-tree attempt to tack on the standard current-harms language makes the deal much less enticing for the accelerationists. The frontier safety elements must be narrow and strong; the preemption must be selective, but broader than just narrow frontier issues. This sets a mutually acceptable frame for the 120th Congress &#8211; where we&#8217;ll all see each other again to debate new federal laws on all the things the deal has not covered, with both sides&#8217; worst-case outcomes already avoided because we seized the moment today.</p><div><hr></div><h4><em><strong>A Second-best Aftermath</strong></em></h4><p>But of course, even given a successful deal between safetyists and accelerationists, <strong>a law might very well not happen. </strong>Congress is highly unpredictable, and frequently just does not get things done even when interests align. The current-harms issues are also already a political sleeping giant. That giant might well wake up, and plenty of lawmakers might catch on to what&#8217;s going on and try to stop state-level preemption. I&#8217;m less sure of this than others for two reasons: the safetyist-accelerationist deal can be expanded ad hoc to rein in the pet issue of a lawmaker or two while still being mutually beneficial; and some concerns can be staved off by offloading them into parallel, unrelated but ongoing federal action on issues like child safety. </p><p>But still, there remains a risk. Some safety advocates see that risk and conclude it&#8217;s not worth the downsides for coalitionary cohesion. 
I think if the maneuvering is done right and the scoping is done carefully, the only coalition-critical commitments would not need to be made before the prospects seem clear &#8211; so we can return to the risk assessments then. </p><p>But more importantly, as I&#8217;ve argued, coalitionary cohesion alone doesn&#8217;t buy policy wins anyway. And on the other hand, <strong>even a failed attempt can move us toward a helpful realignment of battle lines.</strong> Even an attempt to cooperate on this, some reestablished channels of coordination, and a demonstration that safetyists aren&#8217;t necessarily accelerationists&#8217; worst enemies, would go a long way &#8211; especially in the context of the PAC dynamics discussed above. Some big underlying assumptions can be defused that way: that safety advocates are the most ardent opponents of any preemption, that they believe any regulation is good regulation, and that they&#8217;re captured by partisan and group-specific politics. A credible effort could defuse all three. </p><p>In practical terms, that would save us from ugly fights and unnecessary spending: safety efforts would no longer be solely committed to fending off LTF, and LTF would no longer have to go hunting for safetyists. Both sides might actually be well-advised to release the other from their mutual battle: in staving off the worst of AI politics, <strong>they might find themselves on the same side of the fight </strong>more often than not.</p><div><hr></div><p><strong>Those few of us with a serious understanding of the scale of this technology are on borrowed time.</strong> Uglier politics are coming, but we can still frame them today: we can get a lower bound for frontier safety and for policy patchworks on the books before salience has taken root. And we can take a step away from the brink and avoid fighting each other in an increasingly marginal corner of AI policy. 
<strong>We have much to gain and little to lose.</strong></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[AI & Child Safety: Against Narrow Solutions]]></title><description><![CDATA[How to navigate a political forcing function]]></description><link>https://writing.antonleicht.me/p/ai-and-child-safety-against-narrow</link><guid isPermaLink="false">https://writing.antonleicht.me/p/ai-and-child-safety-against-narrow</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 01 Oct 2025 13:29:35 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/d1f6cbc4-c535-4919-9ce5-e10d0453d353_960x697.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>The tragic case of a teenager&#8217;s <a href="https://www.nytimes.com/2025/08/26/technology/chatgpt-openai-suicide.html">suicide</a>, allegedly helped on by ChatGPT, has put the issue of child safety center stage in AI politics. </strong>In the just-emerging political conversation around AI policy, issues like these have explosive potential. Policy  will be made around &#8216;political flashpoints&#8217; &#8211; moments when AI intersects with more salient topics, connecting abstract policy debates to tangible harms that command public attention and move constituencies.</p><p><strong>Child safety is one such flashpoint. </strong>After a few months of latent concerns about chatbot use by children spurred on by controversial company policies, the conversation has now broken through. 
Recent instances of allegedly AI-assisted suicides have resulted in a high-profile lawsuit, national attention, policymaker concern, and hasty reactive commitments from OpenAI. This won&#8217;t stop any time soon: Teenagers are early and enthusiastic adopters of chatbot assistants, with less natural apprehension to intimate and personal conversation. AI developers seem keen to capture this market, and <a href="https://www.reuters.com/investigates/special-report/meta-ai-chatbot-guidelines/">some even seem ready</a> to tolerate troubling usage patterns, perhaps to secure <a href="https://hsph.harvard.edu/news/social-media-platforms-generate-billions-in-annual-ad-revenue-from-u-s-youth/">valuable</a> engagement.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rVFR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rVFR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 424w, https://substackcdn.com/image/fetch/$s_!rVFR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 848w, https://substackcdn.com/image/fetch/$s_!rVFR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 1272w, 
https://substackcdn.com/image/fetch/$s_!rVFR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rVFR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png" width="1456" height="565" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:565,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!rVFR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 424w, https://substackcdn.com/image/fetch/$s_!rVFR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 848w, https://substackcdn.com/image/fetch/$s_!rVFR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 1272w, 
https://substackcdn.com/image/fetch/$s_!rVFR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F097e1d64-11e2-4a41-a9d2-0d597937abd5_2024x786.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><figcaption class="image-caption">Some recent headlines.</figcaption></figure></div><p><strong>It&#8217;s hard to overstate just </strong><em><strong>how</strong></em><strong> salient the resulting harms to children are.</strong> They make for heart-wrenching media stories and very concerned parents. 
As opposed to many other cases of harm, they also can&#8217;t be as easily dismissed by implying it was really the user&#8217;s fault &#8211; for good reason, children are usually not held responsible for sustaining only beneficial usage patterns. <a href="https://doi.org/10.1017/9781009257954">Social media</a> is perhaps the most obvious example: the times policymakers have really moved &#8211; whether through hearings, proposed legislation, or any detailed scrutiny &#8211; have frequently come after some story linking social media to harms to children. We can learn from that: both to take the political momentum seriously, and not to fall into the same resulting policy traps.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p><strong>It would be easy, tempting, and ultimately mistaken to channel child safety concerns into a narrow solution. </strong>Policymakers might be tempted to isolate the child safety issue &#8211; say, by introducing age gates, child-specific modes and the like. AI developers are also incentivised to go down that route: they think that finding narrow fixes makes the pro-regulatory political momentum go away. To that effect, OpenAI has announced plans to <a href="https://openai.com/index/building-towards-age-prediction/">automatically detect users&#8217; age</a> and finetune content accordingly, and has introduced &#8216;<a href="https://openai.com/index/introducing-parental-controls/">parental controls</a>&#8217; in the meantime. I strongly suspect that they are currently lobbying to shape child-safety-focused regulation along the lines of these hastily drawn-up commitments. 
By default, we might be heading for a narrow child safety policy that captures the political momentum. But <strong>I think that&#8217;s neither effective policy nor prudent use of a promising political window.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!8S37!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!8S37!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!8S37!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!8S37!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!8S37!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!8S37!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!8S37!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!8S37!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!8S37!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!8S37!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png 1456w" sizes="100vw"></picture><div></div></div></a></figure></div><h1><strong>The Failure of Narrow Solutions</strong></h1><p>On the policy level, child-safety-specific interventions are quite unlikely to work. 
Attempts to make contentious online services adult-only are not new: they&#8217;ve been applied to social media apps, pornography and gambling for years now.</p><p><strong>They have failed across every online platform that has been compelled to try them</strong>. A couple of compounding effects <a href="https://www.newamerica.org/oti/reports/age-verification-the-complicated-effort-to-protect-youth-online/">produce these failures</a>. The first is outright circumvention &#8211; most basic versions of age gates can simply be <a href="https://www.ucd.ie/newsandopinion/news/2021/january/27/ineffectiveagerestrictionsmethodsareputtingchildrenatriskonsocialmedia/">circumvented</a> by lying about your age, changing accounts once your account has been flagged as a minor, or badgering parents with arguments about peer pressure, school use and whatnot until they give you an adult account. Children resist being treated as children and find ways around restrictions. The second is redirecting traffic to even worse platforms, as when states like Louisiana restricted pornography access, <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5165280">routing traffic away</a> from compliant mainstream sites to even less credible and responsible alternatives. You could imagine much the same effect <a href="https://www.techpolicy.press/when-age-assurance-laws-meet-chatbots/">shifting traffic</a> from large developers&#8217; chatbots to less sophisticated products with fewer guardrails. When the UK recently introduced its Online Safety Act, VPN downloads <a href="https://www.bbc.com/news/articles/cn72ydj70g5o">exploded</a>, with substantial evidence that minors drove much of that <a href="https://www.yahoo.com/news/articles/more-age-verification-fallout-artist-173813729.html">surge</a>. 
Even walled gardens like YouTube Kids <a href="https://par.nsf.gov/servlets/purl/10212017">often devolve</a> into offering inappropriate and unsettling content &#8211; they become lower-value products, removed from the careful oversight usually afforded to flagship lines. They&#8217;re also removed from parental scrutiny, because the &#8216;specifically for kids&#8217; label assuages parents&#8217; concerns. I&#8217;m very unsure what twisted local minima kids-specific post-training for chatbots could end up in, but I wouldn&#8217;t rule any of them out.</p><p>Age limits and content gates exist everywhere and really work nowhere. Making user-level policy restrictive enough to solve this issue turns out to be massively intrusive on privacy &#8211; in ways that led to <a href="https://www.cnbc.com/2025/08/12/why-the-uk-age-verification-law-has-led-to-backlash.html">backlash</a> even in Britain. British policymakers have quickly found out that trying to enforce online safety laws leads down a path of <a href="https://www.bbc.com/news/articles/cn438z3ejxyo">going up against</a> free internet access, VPNs, and so on &#8211; opening up the once-popular attempt to protect children to all kinds of criticism.</p><p>Softer rules don&#8217;t do much to prevent the harms we ought to be concerned about. <strong>What &#8216;child safety&#8217; rules do effectively, however, is shift blame. </strong>The usual circumvention methods can all be painted as illicit, and sometimes as parental complicity or negligence &#8211; making it much easier for AI developers to offload blame and liability. No matter how ineffective a child safety provision is, once it&#8217;s passed, an AI developer will be able to respond to any allegation of unsafe usage by suggesting that the system worked &#8211; it was just the child victim, their parents, their school, or someone else who did something wrong. 
This is one major reason why we will see strong support for narrow child safety interventions in the coming months: it&#8217;s an attempt to deflect the political momentum from AI developers&#8217; product priorities.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!8S37!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png" width="440" height="110" alt="" loading="lazy"></figure></div><h1><strong>A Promising Policy Window</strong></h1><p>If you&#8217;re in favour of finding sensible policy solutions on AI, <strong>the current debate makes for a promising moment to get some important things right.</strong> Part of that is because of the factional setup: Yes, there are already some substantial differences in the positions of different factions on AI. 
But because AI policy is still in its political infancy, it hasn&#8217;t yet calcified along party lines the way other issues have. Voices are still emerging, policymakers are still building their AI policy brands, parties are still uncertain about their headline ideas on AI. A lot is still in flux &#8211; which leaves room for sensible ideas to prevail. Any clear-eyed observer agrees that ultimately, we will have to make federal laws on frontier AI. The current moment might be one of the better times to lay the foundations for this.</p><p>The more important reason for using the child safety window for broader policy is that <strong>the underlying issue is bigger and more important than it first appears. </strong>Solving for specific constituencies is a very good way to lose track of the broader challenges of AI policy. <strong>The child safety issue asks a broader question</strong>: of how to shape interactions between intelligent, persuasive and highly capable chatbots and users who quickly end up at their mercy. Age gates or not, we were always going to have to answer this question. Now, I&#8217;m far from convinced that current AI systems are particularly good at implanting psychoses and delusions, but that&#8217;s not necessary to motivate the challenge. The more capable these systems become, and the more context they draw on &#8211; from conversation histories to email inboxes &#8211; the more central they become to user decision-making, and the greater their influence over everyday choices.</p><p>That&#8217;s easy to ignore under normal circumstances, because most people think themselves above the risk of manipulative undercurrents. But recognising the risk anyway isn&#8217;t illiberal. 
It just requires taking AI capabilities seriously: for all my belief in human autonomy, I also think that <strong>billions of dollars of computational resources and research ingenuity poured into making a persuasive machine make for a very persuasive machine.</strong> If that machine is miscalibrated on what to persuade its users of, I think that&#8217;s a grave threat.</p><p>Particularly salient instances of this effect &#8211; like the cases of teenage suicides &#8211; make this clear. But making these instances go away doesn&#8217;t make the broader effect go away as much as it hides it from political and public attention. <strong>Whack-a-mole solutions gradually strip political salience from important issues. </strong>But aligning the persuasive capabilities of advanced AI systems with the preservation of human autonomy is too important for that. The issue deserves wholesale political discussion, and if child safety gets us there, I&#8217;ll be happy to take it.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!8S37!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb29a09c1-e62a-4bdf-abc6-276e62243979_1000x250.png" width="440" height="110" alt="" loading="lazy"></figure></div><h1><strong>Channeling the Momentum</strong></h1><p><strong>If narrow solutions don&#8217;t suffice, the child safety issue poses a broader question:</strong> How do we regulate frontier AI systems and their relations with vulnerable, suggestible, impressionable users? This is really hard. But it&#8217;s not going to get any easier any time soon, either. So I think we should treat the unusually high salience and the resulting political momentum from the child safety issue as an occasion to have the actual debate: Can we set regulatory incentives to reduce externalities and preserve autonomy in the face of more advanced AI systems? I think we have the makings of a decent conversation around this question, and it&#8217;s worth having. It is one feature of this window that I don&#8217;t quite know what I&#8217;d want the ultimate policy that comes of it to be &#8211; but I can offer some desiderata.</p><p>The overarching goal is to <strong>give developers a vested interest in getting things right on a deep level. </strong>I&#8217;ve commented on this <a href="https://writing.antonleicht.me/p/a-moving-target">elsewhere</a> in greater detail, but will reiterate that I think we have to avoid two specific failure modes in particular:</p><p><strong>The first is to avoid locking in paradigm-specific regulation too early. 
</strong>It&#8217;s very easy to look at the current chatbot paradigm and come up with overly burdensome, overly narrow regulatory requirements that look silly, onerous, and ineffective in a few years. The story of past AI regulation proposals supports that worry: time and time again, we&#8217;ve seen proposed policies that would have been invalidated by technical developments. Doing the same thing here would be easy &#8211; for instance by assuming that chatbots with discrete apps and websites will remain the standard mode of interaction, or that text will remain the medium of choice.</p><p><strong>The second is to avoid shallow metrics for success</strong>. Shallow, easily fudged metrics and evaluation targets can be trained for, deceptively reached, and ultimately circumvented in non-standard usage patterns. Systems must remain robust under adversarial conditions, given all the circumvention avenues laid out above. The current evaluation ecosystem is doing strong work on this, but it is still not remotely large or well-equipped enough to get the nuances right if evaluations were actually required to be conducted quickly and thoroughly as a matter of regulatory compliance. A headline bill that sets high standards for everything but doesn&#8217;t fundamentally consider how to verify that its criteria have been met will look good and fail.</p><p>In the face of these challenges, I find myself favouring an incentive- and market-driven approach that determines safety desiderata and leaves their implementation to regulatory markets, with regulatory organisations constrained and licensed by legislators and government agencies. That&#8217;s mostly for two reasons: because such markets incentivise finding ways to avoid the failure modes, and because they don&#8217;t require as much certainty at the point of passing regulation. 
I think stringent private governance, at least for purposes of near-term harms rather than long-term risks, is promising for reasons well <a href="https://arxiv.org/abs/2504.11501">articulated</a> <a href="https://arxiv.org/abs/2001.00078">elsewhere</a>. It fits the unique challenge of the moment well: it&#8217;s too early to spell out top-down regulation, but we still face a rare opportunity to act.</p><p>But more generally, my point is not that I know best what specific policy this window should be used for &#8211; my point is that we will need a specific kind of policy sooner or later, and might get the best version of it if we figure it out today. Researchers, advocates and policymakers should take that opportunity seriously and put forward their own suggestions beyond narrow fixes. </p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!8kI6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb87828a-2c6c-450c-9fd2-80afbf0ca40d_1000x250.png" width="1000" height="250" alt="" loading="lazy"></figure></div><h1><strong>Outlook</strong></h1><p><strong>The child safety issue supercharges the broader conversation around frontier AI regulation with political momentum. </strong>Developers and policymakers alike might think it in their interest to separate out the child safety issue, find narrow solutions, and call it a day. But that would be a mistake. In today&#8217;s still-nascent AI policy debate, we have a chance to get some of the bigger questions right. Doing so requires that vocal policymakers aren&#8217;t satisfied with shallow assurances; that rightfully regulation-skeptical voices articulate their favoured approach to frontier AI regulation instead of mere opposition to the concept; and that zealous pro-regulation advocates don&#8217;t overshoot the limitations of regulating an emerging technology. It&#8217;s not a perfect policy window, but it&#8217;s too good to waste on narrow solutions. 
<strong>This is a rare opportunity to get something right in frontier AI.</strong></p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[Making American AI Export Promotion Work]]></title><description><![CDATA[How to sell the American stack to an uninformed world]]></description><link>https://writing.antonleicht.me/p/making-ai-export-promotion-work</link><guid isPermaLink="false">https://writing.antonleicht.me/p/making-ai-export-promotion-work</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 24 Sep 2025 13:38:29 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/d9c31c83-86dd-4882-ad65-0e0a7ba58a31_1521x1054.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>The Trump administration has set out to make the world run on American artificial intelligence &#8212; and has started drawing up the policy details to deliver on that vision. The conversation around this <strong>export promotion is perhaps the most underrated issue in international AI policy today. </strong>It offers the ultimate win-win perspective: entrenching and securing a durable US-led global AI ecosystem, while cutting the rest of the world in on the promise of advanced AI. 
More so than any other international piece of AI policy, something good might actually happen here:</p><ul><li><p>For countries at risk of being left in the dust, export promotion can deliver frontier capabilities while providing a direly needed forcing function to find a strategic and economic niche.</p></li><li><p>For the US, <strong>getting export promotion right today means undercutting China&#8217;s global influence</strong> and AI stack effectiveness before they mature &#8211; capturing the world while China is still ramping up production.</p></li><li><p>For a world fearful of volatile conflict between the US and China, a successful export promotion approach could strike mutually beneficial deals to entrench the US stack worldwide &#8211; <strong>deciding the AI race by flywheel effects and spheres of influence instead of military conflict.</strong></p></li></ul><p>Policy action is already underway. Full-stack exports feature prominently in the<a href="https://www.whitehouse.gov/wp-content/uploads/2025/07/Americas-AI-Action-Plan.pdf"> Action Plan</a> and in statements by senior White House staffers alike. Pursuant to a recent <a href="https://www.whitehouse.gov/presidential-actions/2025/07/promoting-the-export-of-the-american-ai-technology-stack/">Executive Order</a>, details of the export promotion scheme are being developed, and a program is to be established by October 21st &#8212; so now is the time to get it right.</p><p>But in scoping out this program, difficult challenges must be met. The relation between government-promoted exports and extant private projects remains unclear. Most critically, <strong>export promotion is caught between an underdefined supply and an uninformed demand.</strong> The resulting gap can derail export promotion: exports sought by importers who are mistaken about their own needs do not create path dependencies; exports aimed at what importers will need three years from now might not be in demand today. 
Getting it right will require avoiding diplomatic backfire risks, as well as building strong promotion vehicles around the right export stack.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" width="440" height="110" alt="" fetchpriority="high"></figure></div><h1><strong>The Promise of Promotion</strong></h1><p>What makes me so optimistic about export promotion 
as a tool? I think that <strong>current efforts at diffusing AI to most countries are often led by importing countries that don&#8217;t get what&#8217;s going on</strong>. That leads most current import strategies to be in service of absurd goals, formulated in a dire lack of situational awareness: the likes of<a href="https://bmds.bund.de/aktuelles/reden/detail/faz-ki-konferenz"> training frontier models</a> on 100,000 H100 chips, or reaching inference sovereignty in a country with record-high electricity prices &#8212; with no clear payoffs to the US. Many importer-led schemes, as delivered by the current market, serve neither importers&#8217; nor exporters&#8217; interests.</p><p><em>Related post: <a href="https://writing.antonleicht.me/p/datacenter-delusions">Datacenter Delusions</a></em></p><p>The export promotion story looks different. 
Ideally, <strong>the US should be actively looking to strike deals to facilitate the export of consortium-based &#8216;full stacks&#8217; of AI technology</strong>: US firms build and run a datacenter that serves US-built AI systems in a foreign country. The US government has a range of tools available to foster these deals:</p><ul><li><p>Financial, such as loans, guarantees, and insurance;</p></li><li><p>Coordinative, by convening and aligning a consortium; and</p></li><li><p>Diplomatic, by conducting negotiations on behalf of the export plans or tying them into broader foreign policy initiatives.</p></li></ul><p>The US is motivated to do so on three grounds: creating a lasting global revenue base for its AI industry; undercutting future Chinese export ambitions; and setting up a &#8216;<a href="https://x.com/sriramk/status/1961072926561550366">flywheel effect</a>&#8217;, in which broad deployment of US systems feeds quality improvements back to American hardware and software.</p><p>That makes it likelier for these deals to happen &#8211; but it also makes them likelier to be good deals. That is because the <strong>US government and private sector have a vested interest in delivering a </strong><em><strong>useful</strong></em><strong> solution. </strong>The goal of export promotion is not just to make a quick buck selling a couple of chips, but to create enduring demand and tech-ecosystem integration. If the US were to sell a useless stack, countries might still turn around and develop proprietary solutions, or go for Chinese imports later instead.</p><p>So<strong> to win on exports, America needs to export lasting path dependency </strong>on the US stack. It&#8217;s an urgent matter: there are just a couple of years left to entrench the American stack. A Chinese export platform, heavily subsidised by its government, is a threat to US market share &#8211; but it&#8217;s still a few years out, mostly due to Chinese chip manufacturing constraints. 
<strong>It&#8217;s important to get exports right before Chinese exports arrive</strong>. Export promotion is an attempt to capture that moment, and so aligns incentives that are otherwise far apart: the US wants to seize the window of opportunity to provide countries with capabilities they need, so they don&#8217;t get them somewhere else.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!QElY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8bc90a13-37ba-4c8d-bd6a-432896ad7909_1024x647.png" width="1024" height="647" alt="" loading="lazy"><figcaption class="image-caption">It&#8217;ll take <a href="https://semianalysis.com/2025/04/16/huawei-ai-cloudmatrix-384-chinas-answer-to-nvidia-gb200-nvl72/">some time</a> until China builds an export stack, but they&#8217;ll get there. For now, <a href="https://epoch.ai/gradient-updates/why-china-isnt-about-to-leap-ahead-of-the-west-on-compute">chip constraints</a> mean there is no viable Chinese export platform.</figcaption></figure></div><h4><em><strong>What&#8217;s Not To Like?</strong></em></h4><p>Skeptics of export promotion will tell you two things. First, that we can get this future without explicit promotion &#8211; <strong>foreign governments will just naturally import. I think that&#8217;s mistaken. </strong>An export promotion framework means far more US buy-in, making deals happen that otherwise wouldn&#8217;t &#8211; because the US supports the deals, encourages importers, and is incentivized to get them to understand AI and where it is going. 
That&#8217;s aligned with American interest, because the US should get in on favourable deals before the market gets around to them, creating early path dependencies and widespread adoption of its stacks. The US currently faces a rare <a href="https://www.thefai.org/posts/the-closing-window-to-win-part-i-american-ai-leadership-requires-a-global-strategy-for-full">closing window to win</a> &#8211; and is thus incentivised to move fast on diffusion.</p><p>Second, critics will tell you that export promotion is at best a niche cause: most <strong>AI exporting might not happen through nation-led deals, but through private businesses selling to private businesses</strong> &#8211; as in the cases of American hyperscalers building datacenter capacity in Europe in the past. I think this is true by volume: I expect most local computing capacity to be created wherever hyperscalers believe they should build out their inference and training capacity. And I think governments should generally not be in the business of elevating run-of-the-mill hyperscaler projects to a matter of national ambition. </p><p>But it&#8217;s not true by quality. <strong>Export promotion done well can facilitate the proliferation of some of the most important capabilities.</strong> Hyperscaler expansion alone does not make for an enduring shift in AI access. Especially in countries with a small footprint in terms of economic demand for AI systems beyond consumer apps and API access, non-promoted market forces alone seem unlikely to service durable import deals. 
And many important functions will not be provided by importers&#8217; domestic private sectors alone, especially as they relate to public use of AI systems: as a government or national security resource, or as an instrument for public research or education.<a href="https://openai.com/index/introducing-stargate-uk/"> Local servicing of specialist public-sector-focused use cases, not unlike Stargate UK</a>, could be one blueprint for a good export &#8211; just that you&#8217;d need to promote it to get it done anywhere but in the very AI-aware UK. Private incentives get you unstable access in incidentally profitable areas. Durable diffusion is not a default &#8211; it might instead be the outcome of export promotion.</p><p>Still, <strong>detractors have a point: Promoted exports will have to go where the market currently does not</strong> &#8211; otherwise you wouldn&#8217;t have to promote them in the first place. But promotion does not imply a deal would otherwise be bad &#8211; my sense is that much promotion will aim to hasten deals that would ideally happen much earlier, but are paralysed by lack of awareness or capacity. It is mostly about making good deals happen sooner. But there&#8217;s a lesson in the objections for the US government regardless: if these deals are supposed to go beyond signing off on natural market moves, there need to be substantial and actually helpful promotion efforts.</p><p>Export promotion as a framework of international AI diffusion is very much worth being excited about. But <strong>a minimal implementation of the Executive Order is not set up to deliver on this optimistic perspective</strong>. The awareness gap persists: Buyers don&#8217;t want what they should want, and sellers don&#8217;t know what to sell. 
Let&#8217;s start by looking at the buyers.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!n5qz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Buyer Ignorance</strong></h1><p>By and large, <strong>governments of likely importing countries don&#8217;t have a strong understanding of what they should buy</strong> &#8211; or that they should buy at all. 
This is the core issue that has made a successful exporting approach without promotion so difficult, but it remains a problem even if the US government plays a more active role.</p><p>Very broadly, I suspect there are two ways in which buyers are getting this wrong.</p><p>Importing <strong>countries that know AI is somewhat important often want to achieve &#8216;sovereignty&#8217;. </strong>That makes them interested in imports, but only on their terms: They might want to import GPUs, but want to be able to control them, no US datacenter providers cut in. This is the story behind initiatives like the EU gigafactories. Policymakers in these countries have looked upon the UAE deal and grown concerned &#8211; they see the involvement of US consortia as restrictive to the way they can use the GPUs they buy, surmising they&#8217;re not getting &#8216;real&#8217; sovereignty by the deal.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!V87d!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!V87d!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 424w, https://substackcdn.com/image/fetch/$s_!V87d!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 848w, https://substackcdn.com/image/fetch/$s_!V87d!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 
1272w, https://substackcdn.com/image/fetch/$s_!V87d!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!V87d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png" width="1158" height="401" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c85a4583-98dd-4b9a-9363-790a37015086_1158x401.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:401,&quot;width&quot;:1158,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!V87d!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 424w, https://substackcdn.com/image/fetch/$s_!V87d!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 848w, https://substackcdn.com/image/fetch/$s_!V87d!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 1272w, 
https://substackcdn.com/image/fetch/$s_!V87d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc85a4583-98dd-4b9a-9363-790a37015086_1158x401.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Epoch&#8217;s <a href="https://epoch.ai/data/gpu-clusters">supercomputer database</a> reveals both the breadth and lack of depth of sovereignty ambitions.</figcaption></figure></div><p>They&#8217;re somewhat right about this, but are missing the fact that even sovereign control of incidental heaps of chips doesn&#8217;t do much. 
Between replacement cycles and continued dependence on models, you can&#8217;t actually buy sovereignty. Still, this <strong>sovereignty ambition might make countries less likely to engage with the US&#8217; preferred export models</strong> and build unhelpful projects instead. This is not just a problem for importers, but for the US as well, because you don&#8217;t get these sovereignty-minded countries onto the full stack early enough &#8211; which has them participate less in the overall US ecosystem, and makes them more susceptible to Chinese exports once their original ambitions fail. This is a notion that still dominates the domestic conversation in some of the most potentially-profitable and strategically important importing countries &#8211; South Korea, Germany, and France, to name a few.</p><p>Many <strong>countries also don&#8217;t have a strong notion of their demand just yet.</strong> Some political environments are keen to dismiss AI as hype, others have illusions about the viability of some locally developed stack or paradigm, others still assume they can muddle through on general API access without national solutions. Substantively, they often haven&#8217;t figured out strategic and economic needs: will their economies require large amounts of low-latency inference? A lot of fine-tuning capacity? Ability to process privileged data locally? Large-scale proprietary training infrastructure for narrow AI?</p><p>Depending on the answers to these questions, they&#8217;ll require different imports. You might give them expensive, low-latency compute and create a market for cheap, slow mass inference instead; you might give them low-security commercial capacity where they&#8217;d need high-security capacity for government use or vice versa; and so on. Give them the wrong stack, and future demand not met by US supply might be an opportunity for China&#8217;s exports. 
Even worse: High-profile <em><strong>failures</strong></em><strong> of the US stack to meet importers&#8217; demand give China salient benchmarks for its own export ambitions.</strong></p><p>To make matters more difficult, strategic confusion is frequently paired with <strong>an apprehension toward making deals with the US</strong> in particular. The adverse consequences of this position might not become visible all that quickly &#8211; and the apprehension itself might even be reinvigorated by transitory market corrections. I know that many governments register high-level interest in importing right now, so you might doubt that concern. But as import deals get closer to completion, I believe these political hurdles and demand mismatches will become more and more important &#8211; and I wouldn&#8217;t be surprised if many initially enthusiastic negotiations broke down over time.</p><p>As a result of these misapprehensions, <strong>buyers of both kinds might organically realise their demand for importing a US stack too late. </strong>This is a problem not just for them. The viability of an export-forward US strategy increases the earlier the exports happen. Both the self-reinforcing flywheel effect that hinges on widespread adoption and the idea of preempting Chinese exports, which are not happening just yet, should have you place a premium on getting in early. 
Mismatched demand is a major challenge for export promotion.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!n5qz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Sellers Beware</strong></h1><p>These buyer preferences inform the design of the export stack at a time when many design elements are up in the air anyway. 
The US government and the private firms providing the AI stack haven&#8217;t quite figured out two things: what exactly to sell, and how to promote it.</p><h4><em>Promotion Avenues</em></h4><p>Starting with the latter, <strong>the US government must make sure to aim promotion at the right kind of deals.</strong> For an export promotion scheme to add surplus value, it needs to enable deals that otherwise wouldn&#8217;t happen. But that must not mean subsidising deals that otherwise lack economic viability. Instead, export promotion should be about overcoming inertia and lack of institutional awareness: Make good deals happen earlier by getting importers to understand they&#8217;re mutually beneficial. But even overcoming inertia requires a substantive push. It&#8217;s not yet clear how the US might deliver it.</p><p>The first and most obvious angle is through coordination and negotiation. But in many critical areas of implementation,<strong> the US government is reportedly<a href="https://www.politico.com/news/2025/07/17/cyber-tech-state-ai-00460679"> already</a><a href="https://www.reuters.com/world/us/us-government-turmoil-stalls-thousands-export-approvals-sources-say-2025-08-01/"> understaffed</a>.</strong> Agencies responsible for negotiating satisfactory bilateral deals are unlikely to have the manpower to assemble bespoke consortia and convince buyers who are somewhat unaware of the deals&#8217; attractiveness. Financial support is an obvious alternative. But it&#8217;s as yet <strong>unclear how much financial support the US government will be able to provide</strong>. Making exported full stacks affordable enough for the deals to become an obvious buy, as China has often done with its exports of critical infrastructure, is very expensive. Key US vehicles like EXIM and DFC are already spread thin, and additional appropriations are very difficult to secure. 
If there is enough political support to aim extant export promotion resources at AI exports, the financial contribution might be substantial enough &#8211; if.</p><p>Of course, neither scaling up staffing nor subsidies is beyond theoretical reach &#8211; but both face a difficult political economy in an administration that has made early and loud commitments to cutting back on <a href="https://www.whitehouse.gov/presidential-actions/2025/01/hiring-freeze/">overstaffed departments</a> and <a href="https://www.whitehouse.gov/presidential-actions/2025/01/reevaluating-and-realigning-united-states-foreign-aid/">international spending</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!50Cg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!50Cg!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 424w, https://substackcdn.com/image/fetch/$s_!50Cg!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 848w, https://substackcdn.com/image/fetch/$s_!50Cg!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 1272w, https://substackcdn.com/image/fetch/$s_!50Cg!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 1456w" 
sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!50Cg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png" width="1020" height="660" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:660,&quot;width&quot;:1020,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Redefining America's Interests? Trump's FY2026 Budget Proposes Sweeping  Cuts to US Foreign Aid | Center For Global Development&quot;,&quot;title&quot;:&quot;Redefining America's Interests? Trump's FY2026 Budget Proposes Sweeping  Cuts to US Foreign Aid | Center For Global Development&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Redefining America's Interests? Trump's FY2026 Budget Proposes Sweeping  Cuts to US Foreign Aid | Center For Global Development" title="Redefining America's Interests? 
Trump's FY2026 Budget Proposes Sweeping  Cuts to US Foreign Aid | Center For Global Development" srcset="https://substackcdn.com/image/fetch/$s_!50Cg!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 424w, https://substackcdn.com/image/fetch/$s_!50Cg!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 848w, https://substackcdn.com/image/fetch/$s_!50Cg!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 1272w, https://substackcdn.com/image/fetch/$s_!50Cg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81bb84bb-360f-4db3-a7ce-55f0d9f1ed6d_1020x660.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://www.dfc.gov/sites/default/files/media/documents/FY26%20Congressional%20Budget%20Justification.pdf">DFC</a>&#8217;s funding (as well as <a href="https://img.exim.gov/s3fs-public/documents/exim-fy-2026-app.pdf?VersionId=RZkoU4Sn_VieeuIL5_eZuDYen1Ufxftt">EXIM</a>&#8217;s) is not usually high enough to move the needle on expensive infrastructure projects. Comparable <a href="https://greenfdc.org/wp-content/uploads/2025/02/Nedopil-2025_China-Belt-and-Road-Initiative-BRI-Investment-Report-2024-1.pdf">Chinese funding</a> is often greater by an order of magnitude. Focusing something like the proposed $3b revolving fund on AI export promotion might help.</figcaption></figure></div><p>If the US does not manage to consolidate political support for putting in the work and money on bilateral negotiations and export subsidies, <strong>I&#8217;m concerned the government might resort to broader diplomatic pressure as its main lever for export promotion</strong>. It could tie its asks for import deals to other questions of foreign policy, such as broader trade and tariff policy or military cooperation, for instance on Ukraine. That sounds cheap and attractive in the short term, but<strong> could easily backfire</strong>: As laid out above, the sovereignty ambition is already a major hurdle for export promotion. If the promotion program gives importers the impression of detrimental dependencies and losing leverage from the get-go, bridging the supply-demand gap becomes much harder. 
The US government and exporters alike can prevent that: building enough negotiation capacity and subsidy options on the US side; and easing negotiations by developing a clear picture of domestic demand on the importer side.</p><h4><em>Shape of the Stack</em></h4><p>On what exactly the government should sell, the underlying executive order specifies that the stack should include:</p><blockquote><p>(A) AI-optimized computer hardware (e.g., chips, servers, and accelerators), data center storage, cloud services, and networking, as well as a description of whether and to what extent such items are manufactured in the United States;</p><p>(B) data pipelines and labeling systems;</p><p>(C) AI models and systems;</p><p>(D) measures to ensure the security and cybersecurity of AI models and systems; and</p><p>(E) AI applications for specific use cases (e.g., software engineering, education, healthcare, agriculture, or transportation)</p></blockquote><p>The reasons for the full-stack focus are clear from the government&#8217;s overall approach: Market share and flywheel effects alike are most pronounced when the stack includes all layers. And there&#8217;s some attractiveness in exporting turnkey solutions: it reduces the expertise required on the buyers&#8217; side. Instead of having to formulate demand for GPUs, bandwidth, and data, <strong>importers can simply name a capability they&#8217;d like to import, and the US can deliver a full-stack solution</strong> enabling that capability. The upshot is a simpler shopping experience, easier to explain and justify &#8211; anyone who has ever worked in national procurement will agree that is an underrated merit.</p><p>Still, that&#8217;s quite the stack to be exporting all at once, and <strong>leaves several open questions</strong>. It&#8217;s unclear, for example, why you need to export data pipelines and labeling systems if you already deploy finished AI applications to the datacenter you have built. 
Or why you&#8217;d want your full-stack export to be specific to applications, rather than being open-ended for general-purpose systems which can interface with importers&#8217; proprietary structures.</p><p>It&#8217;s similarly unclear how to square the approach with various AI developers&#8217; &#8216;AI for countries&#8217; efforts. Is brokering an OpenAI Stargate deal, or even a full Google stack including in-house silicon, within the scope of export promotion? It might be, but what&#8217;s the government&#8217;s value add, then? It couldn&#8217;t be coordination as gestured at in the EO, because Stargates themselves have mostly been coordinated and integrated packages &#8211; so is subsidising AI developers&#8217; existing platforms through diplomacy or financing in scope? Perhaps export promotion is supposed to work only where Stargate-like projects do not, such as in countries too small to warrant major developers&#8217; attention &#8211; but to focus a purported strategic leverage framework only on the smallest markets seems puzzling as well. There is much left to clarify here.</p><p><strong>Many of these questions reduce to a core tension between three desiderata</strong> that sellers will have to resolve. An ideal export stack, from the ideas laid out in the executive order and the notion of maximising global market share, would have to satisfy the following:</p><ol><li><p><strong>Full-stack exports. </strong>Be integrated or cross-cutting enough to count as &#8216;full stack&#8217; in the sense of the flywheel effect described by senior White House officials.</p></li><li><p><strong>Make big sales today. </strong>Satisfy <em>current real demand</em> from importing countries, which are often irrationally focused on fake sovereignty or training capacity due to deep-seated misconceptions about their own position in the AI race.</p></li><li><p><strong>Secure loyal customers.</strong> Respond to <em>actual, future demand</em> of the importers. 
The US wants the exported solution to remain attractive even once buyers improve their strategic understanding, so that the export captures enduring market share, leads to further exports, and can&#8217;t be undercut by a Chinese competitor that offers a better fit for future needs later on.</p></li></ol><p><strong>These ideas are sometimes in tension.</strong> The two most striking problems are the mismatches between 1 and 2 and between 2 and 3: Insofar as it exists, current real demand for AI from countries aims mostly at fake sovereignty, which importers often believe to be different from full-stack imports. Simply selling them what they want right now, however, is <em>also</em> not a good solution: If you satisfy a demand that will inevitably change soon, you don&#8217;t win the enduring market share you&#8217;re looking for. <strong>The US edge is that it gets to sell today, but the market for useful full-stack exports today is still small. What to do?</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!n5qz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" width="440" height="110" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Toward a Solution</strong></h1><p>To square that circle, the US government might have no choice but to compromise a bit: It needs to focus its exporting efforts on <strong>a stack that gets in on the current level of demand, but continues to capture the market at future levels of awareness. </strong>A central design goal should hence be closing the awareness gap.</p><h4><em><strong>Export Design</strong></em></h4><ol><li><p><strong>Reconsider including the application layer.</strong> This is where some of the most significant mismatches between supply and demand occur. Many countries lack a clear understanding of the applications they need at this moment, including how to integrate AI into their education systems, what military applications, if any, they&#8217;re interested in, and so on. Any deal that prescribes an application layer in detail, rather than supporting one, risks being unresponsive to future preferences. The flexibility provided by hosting general-purpose systems enhances the sales pitch and increases the likelihood of lock-in. 
Concerns about diminishing the flywheel effect should not stop you: as long as the entire rest of the stack is US-built, any application layer on top will likely be linked to the US ecosystem and contribute to the synergies the administration cares about.</p></li><li><p><strong>Provide importers with some notion of sovereignty</strong> that enables meaningful freedom within the US ecosystem, but no breakout to Chinese solutions. Imported sovereignty is primarily a problem for US strategy insofar as it creates the future option to break away from the US stack. The tempting solution is to include contractual obligations not to switch to Chinese solutions, as in the UAE deal. But this is exactly the kind of move that risks diplomatic backfiring: If the promotion framework looks too much like it&#8217;s supposed to create path dependencies, sovereignty-minded buyers might be out. It&#8217;s an important mechanism, especially for untrustworthy buyers, but it should not be overused with allies that could take offense and grow cautious.<br><br>So any approach that frees up capacity for importers to command is very helpful, though I&#8217;m less certain about how to get this right. Perhaps importers can be allowed to repurpose parts of the infrastructure stack after a couple of years of exclusive use, once it&#8217;s no longer state of the art. You could also consider allowing importers to use the hardware stack to run open-source models instead. If the US is optimistic in its commitment to servicing top-tier open-source models, and has reason to believe it can run them particularly efficiently so that it&#8217;ll be more attractive for importers to run US rather than Chinese open-source models, it should seriously consider opening the stack up at the AI system layer with some delay. 
I believe that would contribute greatly to a perceived sense of sovereignty, which might be the most important part of the diplomatic puzzle to get right.</p></li><li><p><strong>Spend some money. </strong>Equipping your vehicle of choice to outright fund promotion would be the most obvious win. That would happen perhaps most obviously through the revolving fund at DFC, and maybe by further scaling up EXIM&#8217;s appropriations &#8212; though there is little precedent for applying the latter extensively to software. But more importantly, the political economy of outright subsidies is difficult, and appropriations are hard to come by. Beyond direct subsidies, you might provide extensive financial derisking, which aligns well with the AI private sector&#8217;s general interest in diversifying away from a small number of markets. Enabling consortia to provide cheaper import deals not only reduces the threshold for locking in deals now; it will also be important for competing with any potential future Chinese export product. Taking on the financial cost via favourable insurance is a little more politically feasible right now. And if the administration feels it has good reason to believe in its model of how AI will go, the risk should be manageable.<br><br>Realistically, this might be a two-step process, where more extensive appropriations could be secured only for the 2027 fiscal year. But starting out with underpowered funding is risky even if that&#8217;s the plan: without some keystone projects to show for it, Congress might not take too kindly to the idea of expanding funding for export promotion in the future. If the promotion framework starts out with too little funding, spread too thin over too many projects, it might never get off the ground. 
To the extent that the two-step process is unavoidable, it might be well-advised to stick to very few projects in the first year to gather solid proof of concept for budgetary expansions down the line.</p></li></ol><h4><em><strong>Demand Shaping</strong></em></h4><p><strong>A second leg of a successful US strategy would be to contribute to shaping importers&#8217; demand. </strong>That&#8217;s a fine line to walk. On one hand, the US has a wealth of private and government insight into the trajectory and capabilities of AI, the sharing of which can be useful. In the absence of sufficient capacity at the State Department, in particular following recent cuts, the capacity to spread this awareness needs to be created elsewhere &#8211; led perhaps by OSTP, but tapping into the broader American AI ecosystem: A large-scale effort to have trustworthy leaders in technology and technology policy interface with sympathetic elements in importing countries to create some awareness of the upsides of AI imports seems within scope for successful export promotion. This would be even more effective if the US could create a snowball effect by recruiting representatives from early importing countries to win over the rest of their respective regions.</p><p>On the other hand, <strong>it will be very difficult for the US to credibly communicate anything to importers.</strong> The first reason is that the objective of export promotion is so clear: as soon as the US has officially made it its policy to boost the export of chips and models, it has a business interest in getting these deals through, which will cast some doubt on any opinionated campaign. The second is that, based on recent foreign policy, the US is already facing some suspicion of self-interested negotiation in more general terms. How to get around this? 
I suspect it differs regionally.</p><p>Especially where political leaders are not sympathetic to the US&#8217; foreign policy, there might be value in <strong>running the diplomatic offensive via the private sector</strong>: in most less-clued-in middle powers, the private sector is much more anxious and ready to get AI right, and can serve as a more trustworthy internal advocate. Keeping the AI export conversation far enough away from the remainder of the trade policy conversation would presumably be very helpful in a similar vein. The administration&#8217;s close ties to business leaders are an untapped resource for getting export promotion right.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Iet9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Iet9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 424w, https://substackcdn.com/image/fetch/$s_!Iet9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 848w, https://substackcdn.com/image/fetch/$s_!Iet9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 1272w, https://substackcdn.com/image/fetch/$s_!Iet9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 1456w" 
sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Iet9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png" width="1456" height="765" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:765,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Iet9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 424w, https://substackcdn.com/image/fetch/$s_!Iet9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 848w, https://substackcdn.com/image/fetch/$s_!Iet9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 1272w, https://substackcdn.com/image/fetch/$s_!Iet9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5efdee1a-edee-4be2-b866-361e7b65d3c5_1456x765.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft 
pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><a href="https://hai.stanford.edu/ai-index/2025-ai-index-report/economy">Adoption data</a> is notoriously tricky to read. But very generally, situational awareness is more evenly distributed in the private sector than in governments. That can be leveraged.</figcaption></figure></div><h4><em><strong>Importers Step Up?</strong></em></h4><p>Third, obviously, <strong>importers should do a lot more to get all this right.</strong> I focus on the US in most of this piece because I think the fundamental issue is importers&#8217; lack of serious thinking on the matter. In an ideal world, they&#8217;d pick up the slack and proactively approach the US with well thought-out ideas for what they want to import within the scope of the EO. 
But alas &#8211; they are not doing so. Still, here are some minimal ideas that might resonate with prospective importers:</p><ul><li><p><strong>Be fast:</strong> as long as the US is still out for early keystone successes, terms are likely to be somewhat better. Relatedly, getting in on the earliest instances of the promotion plan, before a huge SOP has been developed, probably gives you slightly more leverage over the specific contents of the deal.</p></li><li><p><strong>Be specific: </strong>The more developed your ideas about what a full-stack import should look like, the more likely you are to be prioritised in negotiations. Given limited negotiation resources on the US government&#8217;s side, I&#8217;d expect &#8216;easy-going&#8217; importers to receive some privileged treatment. One way to be an easy-going importer without compromising on a lot of the substance is to be able to reach a deal quickly &#8211; which is only ever going to happen if you don&#8217;t have to figure out what you even want in the process.</p></li><li><p><strong>Don&#8217;t chase the sovereignty spectre. </strong>Most countries in the world <a href="https://writing.antonleicht.me/p/datacenter-delusions">will not reach</a> a meaningful extent of AI sovereignty. The perhaps more relevant measure of your sovereignty as it relates to AI is your ability to retain economic and strategic leverage from <em>other</em> fields to ensure that you remain an attractive export partner for AI. This is the forcing function I mentioned at the top: Once countries realise they&#8217;ll be net importers of AI, that&#8217;ll force them to consider how to structure their economies and strategic alliances around that fact. 
This has been a long time coming &#8211; realising it early helps get the importing part right, too.</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!n5qz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png" width="440" height="110" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!n5qz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!n5qz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f3fd518-6a9a-4f1b-871b-5f6ec7a13ca5_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Outlook</strong></h1><p>I really do believe getting US exports and their promotion right is both a crucial part of US strategy and one of the very few tractable policy projects to meaningfully improve international AI outcomes. 
In a rare moment, there is concrete policy action on the table, and a measure of political support behind the directionally correct trend. But for many reasons, it&#8217;s an uphill battle. <strong>On both sides of the import-export relationship, those in the know have to make sure demand and supply don&#8217;t drift too far apart.</strong> We should really get this right.</p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[A Strategic Case for H20 Chip Exports]]></title><description><![CDATA[Retracing the argument for exporting inference capacity]]></description><link>https://writing.antonleicht.me/p/the-strategic-case-for-h20-chip-exports</link><guid isPermaLink="false">https://writing.antonleicht.me/p/the-strategic-case-for-h20-chip-exports</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Wed, 17 Sep 2025 13:38:42 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!jrAe!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>These days, most reasonable observers seem to agree: Nvidia&#8217;s H20 chip &#8211; powerful computing hardware used to run frontier AI models &#8211; should not be exported to China. Yet the <strong>Trump administration has cleared it for export</strong>. Why? The common answer is that the administration is supposedly deeply captured by Nvidia&#8217;s business interests. Confusing public interventions by Nvidia and puzzling administration statements have added fuel to that fire. 
The result is a very unsatisfying policy debate that has led to export proponents simply calling their adversaries doomers, and an increasingly incredulous opposition baffled at the administration&#8217;s choices. </p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>But this framing misses important nuance. <strong>There </strong><em><strong>is</strong></em><strong> a strategic logic under which a serious U.S. government might choose to export inference chips</strong> &#8212; not (only) to enrich Nvidia, but to preserve a fragile equilibrium. US policy has so far targeted frontier AI capabilities, but not broader economic growth and development in China. <strong>Banning inference chip exports visibly breaks with established doctrine, </strong>shifting focus toward limiting deployment in China. That risks provoking China into erratic action. But if you think that America is currently winning the &#8216;AI race&#8217;, you might prefer not to rock the boat. </p><p>This piece <strong>reconstructs the strongest version of that strategic case</strong>, along with its potential shortcomings. That&#8217;s not to say I necessarily think the H20 should, on the whole, be exported. But there&#8217;s value in retracing what I consider the best argument in favour of the exports: it can provide a better understanding of, and more helpful engagement with, the administration&#8217;s choices. 
And perhaps recover some nuance &#8211; if only to prepare for the next debate over <a href="https://www.cnbc.com/2025/07/16/nvidia-ceo-wants-to-sell-advanced-chips-to-china-after-h20-ban-lifted.html?__source=iosappshare%7Ccom.apple.UIKit.activity.PostToTwitter">inference chip exports.</a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jrAe!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jrAe!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 424w, https://substackcdn.com/image/fetch/$s_!jrAe!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 848w, https://substackcdn.com/image/fetch/$s_!jrAe!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 1272w, https://substackcdn.com/image/fetch/$s_!jrAe!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!jrAe!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png" width="800" height="454" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:454,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:215126,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/173739094?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jrAe!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 424w, https://substackcdn.com/image/fetch/$s_!jrAe!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 848w, https://substackcdn.com/image/fetch/$s_!jrAe!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 1272w, https://substackcdn.com/image/fetch/$s_!jrAe!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F713b1a06-201b-4d1f-ae1d-16ff2eeab7a2_800x454.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h1><strong>A Doctrinal Shift</strong></h1><p>Today&#8217;s China-focused export controls, downstream of the initial <a href="https://www.csis.org/analysis/where-chips-fall-us-export-controls-under-biden-administration-2022-2024">framework</a> of 2022&#8217;s CHIPS act, have a narrow aim: Disrupt Chinese ability to pursue frontier AI capabilities. Banning the export of the H20, which is not primarily read as a capability-increasing chip, departs from this underlying doctrine. Therefore, the argument goes, export bans risk upsetting the strategic equilibrium and with it our current, favourable trajectory.</p><h4><em>Inference is (Still) not Training</em></h4><p><strong>The H20, by itself, does not enable the development of frontier models</strong>. It is a (remarkably capable, top-tier) chip for servicing inference, i.e. 
for deploying and improving already-pretrained models. Of course, the <a href="https://ifp.org/the-h20-problem/">lines blur</a> in today&#8217;s technical paradigm, which scales model capability with available inference compute and allows large RL-driven improvement, but puts less emphasis on pre-training alone. It&#8217;s inaccurate to say that the H20 <em>only </em>allows running extant models; it does help improve and customise models to pursue specific tasks, and can be used to boost their overall performance. But still, the fundamental strategic distinction, one that informed the development of the H20 to begin with, remains: If you export only H20s to an adversary, they don&#8217;t have the silicon to build a competitive frontier model. In that important sense, it remains a chip for deployment, not development. </p><p>And beyond the technical facts, the H20 is still a chip designed, branded and sold as focused on inference &#8212; which matters for strategic perception. As a result, <strong>banning H20 exports is a visible departure from past US doctrine</strong>. The CHIPS Act and subsequent communications implied a clear American policy: To <strong>prevent China from building its own frontier AI, but not from overall growth and progress. </strong>Inference chip bans instead send a different message: Chip export controls will no longer be leveraged exclusively against frontier development, but will also cover economically valuable AI deployment. Maybe that is the right message to send, maybe not &#8211; but it&#8217;s different in an important way.</p><p>By all <a href="https://www.chinatalk.media/p/china-responds-to-chip-export-controls">accounts</a>, the CCP has been outraged and hit hard by the original export controls. 
Still, by now, China&#8217;s AI ecosystem has accepted the CHIPS Act doctrine and its narrow focus on national-security-relevant frontier development capability: Chinese strategy does not count on importing US-designed training chips, or on importing Western semiconductor manufacturing equipment. China is instead moving toward sovereignty over its own frontier AI stack. One major contribution to this equilibrium has been a credible understanding that <strong>China would not be seriously hindered from deploying AI</strong> to fuel its own economy. In fact, even &#8216;hawkish&#8217; top Biden officials have sought to <a href="https://bidenwhitehouse.archives.gov/briefing-room/speeches-remarks/2022/09/16/remarks-by-national-security-advisor-jake-sullivan-at-the-special-competitive-studies-project-global-emerging-technologies-summit/">avoid sending that message</a>; it was not the policy of the US to target overall Chinese economic growth. Perhaps partly in consequence, <a href="https://carnegieendowment.org/research/2025/07/chinas-ai-policy-in-the-deepseek-era?lang=en">China&#8217;s AI strategy</a> as per the recent AI+ plans aims strongly at the deployment of AI systems throughout its economy &#8211; a strategy that requires few frontier training chips, but runs on the wide availability of inference capacity. 
The current <strong>equilibrium in international AI policy depends on that mutual understanding underpinned by the original scope of 2022&#8217;s export controls.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dUpu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dUpu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 424w, https://substackcdn.com/image/fetch/$s_!dUpu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 848w, https://substackcdn.com/image/fetch/$s_!dUpu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 1272w, https://substackcdn.com/image/fetch/$s_!dUpu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dUpu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png" width="1456" height="436" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:436,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:678343,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/173739094?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dUpu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 424w, https://substackcdn.com/image/fetch/$s_!dUpu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 848w, https://substackcdn.com/image/fetch/$s_!dUpu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 1272w, https://substackcdn.com/image/fetch/$s_!dUpu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F609417cc-b9ea-431b-b8ae-4329cc660683_2172x651.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20"></svg></button></div></div></div></a></figure></div><div class="pullquote"><p>&#8220;<strong>What we're focused on is only the most sensitive technology that could pose a threat to our security. We're not focused on cutting off trade, or for that matter containing or holding back China.</strong>&#8221; &#8212; <a href="https://2021-2025.state.gov/secretary-antony-j-blinken-with-steve-inskeep-of-npr/#:~:text=SECRETARY%20BLINKEN%3A%20And%20so%2C%20again,containing%20or%20holding%20back%20China.">Antony Blinken</a>, then US Secretary of State, in Beijing, 2024.</p></div><h4><em>Two Paths Only </em></h4><p>As AI grows increasingly important, and for as long as the US retains its lead, the doctrinal choices bifurcate more and more clearly: <strong>Either you stay narrow and target frontier development; or you go broad, implying a doctrine of targeting Chinese economic growth. 
</strong>That&#8217;s because AI deployment becomes increasingly difficult to divorce from general economic considerations: more and more economic sectors will depend on it. This is particularly true given the current Chinese approach to AI, which identifies diffusion and deployment as the central objective. Restricting inference threatens this overall approach to economic policy very directly. Put differently:<strong> If the Chinese plan is to use AI in every business, hitting deployment capacity necessarily means hitting the Chinese economy at large.</strong></p><p>As a result,<strong> the logic that justifies banning the H20 signals a departure towards a broader doctrine of export controls.</strong> As a Chinese onlooker, you&#8217;d be justified in wondering: If inference capacity is targeted now, what other elements of the deployment strategy might be next? For starters, there are other non-frontier-training chips that could prove important to Chinese AI deployment efforts, such as RTX Pro chips used in advanced manufacturing. And similar criteria seem likely to apply just as well to other resources: API access or electrical infrastructure, for instance, also enable the deployment of AI throughout the Chinese economy. Now, I&#8217;m not saying that proponents of an H20 ban are likely, or obligated, to push for these further export controls as well. 
I <em>am</em> saying that a decision to not export the H20 communicates a willingness to diverge from established doctrine, in <strong>a step toward more sweeping controls that target the Chinese economy.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!EQnP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!EQnP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png" width="326" height="81.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:326,&quot;bytes&quot;:25136,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/173739094?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!EQnP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Rocking the Boat</strong></h1><p>For a moment, entertain the notion that China might understand inference export restrictions as a doctrinal shift. 
China hawks might be tempted to dismiss that, arguing that China is already operating adversarially. But the argument does not require a dovish view. Instead, <strong>you might like the way things are going in the US-China race right now, and not want to rock the boat.</strong> Sticking to established doctrine would be a path to maintaining a favourable strategic dynamic. And there<em> are</em> some reasons to think the AI race is going well right now &#8211; both from a US and global perspective:</p><h4><em>America is Winning</em></h4><p>First, you might think <strong>the race is going well for the US</strong>, which still holds a sizeable lead in frontier model development and frontier chip design, as well as a commanding starting position in the race for global diffusion. Even datacenter infrastructure and open source competitiveness, two often-invoked US weaknesses, have fared fairly well lately. Especially if you follow the administration&#8217;s logic and believe in the <a href="https://x.com/sriramk/status/1961072926561550366">flywheel effect</a> of globally proliferating your superior AI stack, <strong>you would think things are generally going very well. </strong></p><p>You might not share that optimism, for instance because you think that today&#8217;s favourable position is due to greater access to frontier chips, and exports threaten that. Perhaps you&#8217;re right &#8212; for now, I simply want to say: it&#8217;s worth noting that parts of the administration do feel optimistic about the state of affairs, even if you don&#8217;t agree. It matters for understanding the logic behind their policies. </p><p>More narrowly, you might also think that <strong>continued Chinese use of Nvidia inference is actually good news for the US</strong> &#8211; because it delays a tightly integrated hardware-software stack in China at least for some time. 
On the administration&#8217;s logic and by many technical accounts, tight-knit integration between chip design and software on the one hand and model development and deployment on the other quickly leads to accelerating capabilities. Setting up a fully Chinese-built stack would hence be strategically valuable in the mid-to-long term. But as long as US-built inference chips remain available, <a href="https://www.ft.com/content/eb984646-6320-4bfe-a78d-a1da2274b092">Chinese companies&#8217; short-term interests will push them to use those chips</a> over Huawei chips: right now, Nvidia inference chips are still clearly better. <strong>This sets back Chinese efforts at software-hardware integration</strong> for as long as they keep using Nvidia inference instead. </p><p>Importantly, none of the above has any substantial bearing on <a href="https://semianalysis.com/2025/09/08/huawei-ascend-production-ramp/#">Chinese capacity a couple of years down the road</a>. Experts largely agree that sooner or later, Chinese semiconductor production will ramp up, so China won&#8217;t be strapped for inference supply anyway. Export decisions largely don&#8217;t affect SMIC&#8217;s trajectory toward servicing domestic demand &#8211; ramping up semiconductor manufacturing is a core strategic objective, and <strong>Chinese buildout will ultimately happen no matter how much you export. 
</strong>So, we&#8217;re mostly talking about a few transient years of Nvidia-fueled Chinese inference either way.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1-Dc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1-Dc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 424w, https://substackcdn.com/image/fetch/$s_!1-Dc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 848w, https://substackcdn.com/image/fetch/$s_!1-Dc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 1272w, https://substackcdn.com/image/fetch/$s_!1-Dc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1-Dc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png" width="1456" height="893" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:893,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!1-Dc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 424w, https://substackcdn.com/image/fetch/$s_!1-Dc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 848w, https://substackcdn.com/image/fetch/$s_!1-Dc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 1272w, https://substackcdn.com/image/fetch/$s_!1-Dc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2b7b2f05-0221-4f51-8abe-ff4cc2c9f6f3_1536x942.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d=""></path></g></svg></button></div></div></div></a><figcaption class="image-caption">On account of multiple bottlenecks, production of Huawei&#8217;s Ascend chips faces an upper limit that&#8217;s already being maxed out. (<a href="https://semianalysis.com/2025/09/08/huawei-ascend-production-ramp/#">Semianalysis</a>)</figcaption></figure></div><p>The chip ecosystem tradeoff is therefore this: Do you provide more inference capacity to China <em>now</em>, setting back meaningful hardware-software integration by some time and thereby reducing the potency of Chinese-built chips once their production comes fully online? I think depending on how valuable you consider current diffusion, you can come to different conclusions on that question. I tend to think that current deployment is really not all that valuable, and <strong>delaying Chinese software flywheel effects by supporting Nvidia-run inference could be a win in isolation</strong>. </p><p>Why is China doing this, then? My best guess is that they&#8217;re making a mistake, driven by their AI champions&#8217; short-term interests. 
It&#8217;s a mistake the CCP is <a href="https://www.reuters.com/world/china/china-cautions-tech-firms-over-nvidia-h20-ai-chip-purchases-sources-say-2025-08-12/">trying to fix</a> by encouraging use of domestic semiconductors, but for now, they&#8217;re still making it. The export proponent&#8217;s credo is simple: <em>let them</em>. </p><h4><em>The Best Version of the Race</em></h4><p>Second, <strong>you might also think the AI race is going well for everyone.</strong> I&#8217;m partial to this view: the race seems mostly economic and diffusion-focused, with a limited extent of securitisation and state ownership, and no likely military culmination in the near future. We seem to be set for a great power competition that produces AI systems available at market rates, not security clearances; and it seems likely that we&#8217;ll be able to diffuse the benefits somewhat effectively through long-established mechanisms of technological progress. That was not inevitable.</p><p>In the current setting, we&#8217;re spared many of the worst versions of the AI race that had analysts worried. We see relatively little securitised, secluded development, lab-internal-only deployment, hasty integration into military structures, rampant sabotage, or supply chain disruption. We&#8217;re in a strange situation where <em><strong>both</strong></em><strong> participating great powers seem to believe that the economic version of the race is best for them</strong>. Perhaps one of them is mistaken; perhaps it&#8217;s actually positive-sum. Either way, I get why you&#8217;d think this is <strong>an equilibrium worth maintaining.</strong></p><h4><em>China Can Flip the Gameboard</em></h4><p><strong>All of this can quickly change once you change doctrine visibly enough to prompt China. </strong>Current Chinese strategy, formulated after the CHIPS Act controls, would suffer from a broad US export control regime. 
Deployment-focused plans would not survive contact with an aggressive US effort to restrict AI-diffusion-driven economic growth, beginning with the H20 and plausibly threatening to include further important goods. A lasting shift of US doctrine to that effect would ring alarm bells.</p><p><strong>As soon as the CCP is faced with the realistic prospect of an aggressive, deployment-focused export control scheme, it&#8217;ll be prompted to react.</strong> How? Arguably, the CCP has <a href="https://semianalysis.com/2025/08/12/scaling-the-memory-wall-the-rise-and-roadmap-of-hbm/#">no magic way</a> to redouble its efforts in the economic AI competition: It is already directing its strategic resources at increasing AI capacity and fostering deployment as much as it can. So if it had to react to a threat to the deployment strategy, China might feel inclined to shake things up.</p><p>I don&#8217;t know how that would look exactly. I definitely think it would make all the disruptive ways an AI race could play out more likely again. Plenty of frequently-discussed drastic policy actions from China could suddenly be acutely on the table. On AI specifically, perhaps coordinated sabotage of US efforts, leading to a more securitised setting; or a state-run AI development effort prompting a similar state-run American competitor. On broader policy terms, perhaps a hastening of plans on Taiwan, prompted by fear of asymmetrically increased military AI adoption in the US; or an extension of controls on rare earths or legacy semiconductors. The predictive details are unclear, at least to me. 
I&#8217;ll just say that it seems <strong>unlikely that China would do nothing, and that there are plenty of escalatory levers it has not pulled yet.</strong> Any way in which China could attempt to flip the gameboard threatens the ways in which the AI race is going well right now.</p><p>If you think that the current state of the race amounts to slowly, effectively choking off the Chinese AI ecosystem and its global influence, you&#8217;d do quite a lot to reduce any source of volatility. If exporting inference capacity is what it takes to reduce the risk of radical action, then you might think that&#8217;s a price worth paying. </p><p>It&#8217;s a tricky balance even today: Recent back-and-forths around export controls have already endangered the strategy, leading to Chinese government warnings against use of the H20 and an outright <a href="https://www.ft.com/content/12adf92d-3e34-428a-8d61-c9169511915c">ban of the RTX Pro 6000D</a>. A strong reiteration of the American commitment to inference exports might still remedy the fallout &#8212; perhaps. 
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!EQnP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!EQnP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png" width="326" height="81.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:326,&quot;bytes&quot;:25136,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/173739094?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!EQnP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>A Better Doctrine?</strong></h1><p>Is there a way to frame the H20 decisions in a way that doesn&#8217;t give the impression of departing from the frontier-focused doctrine? 
Some suggestions have been made to that effect.</p><h4><em>The Total Compute View</em></h4><p>First, <strong>perhaps pursuing frontier capabilities should be understood in a broader sense: </strong>Exporting the H20 frees up existing capacity in China for other uses &#8211; for example, smuggled training chips that currently have to service inference could be put toward training instead. The broadest version of this view has given rise to a &#8216;<a href="https://www.rand.org/pubs/commentary/2025/05/chinas-ai-models-are-closing-the-gap-but-americas-real.html#:~:text=China%20will%20likely%20match%20U,overreacting%20to%20predictable%20Chinese%20advancements">total compute</a>&#8217; model, in which the key objective of export controls is to keep China under a certain threshold of compute. Could that be a way to retain a narrow compute-focused doctrine without giving the impression of targeting overall growth? It might be, if AI were a fairly niche phenomenon. But, as established by many policy documents and executive orders (all of them read in China, no doubt), the US government believes AI is a core economic driver of the future, and compute is the substrate on which it runs.</p><p>At the same time, China has repeatedly and publicly laid out that it seeks to use a large amount of compute for general integration. That makes it hard to sell a total compute view as a narrow technical stance. Policy that limits the substrate of economic growth is likely to be read as limiting China&#8217;s overall economic productivity. Again &#8211; perhaps that is the right call, especially in light of China&#8217;s diffusion-focused strategy; perhaps you otherwise lose the race on deployment, no matter how good your models are. But <strong>arguments in favour of switching strategies seem to undersell the magnitude of the shift</strong>. 
A total compute strategy aimed at diffusion and deployment is <em>not remotely</em> the same thing as the past frontier-focused strategy. It would be a big step with meaningful strategic implications.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pi1c!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pi1c!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 424w, https://substackcdn.com/image/fetch/$s_!pi1c!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 848w, https://substackcdn.com/image/fetch/$s_!pi1c!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 1272w, https://substackcdn.com/image/fetch/$s_!pi1c!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pi1c!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png" width="1047" height="805" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:805,&quot;width&quot;:1047,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!pi1c!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 424w, https://substackcdn.com/image/fetch/$s_!pi1c!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 848w, https://substackcdn.com/image/fetch/$s_!pi1c!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 1272w, https://substackcdn.com/image/fetch/$s_!pi1c!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F14952ea6-1720-4de3-8369-301b7ebec593_1047x805.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://www.rand.org/pubs/commentary/2025/05/chinas-ai-models-are-closing-the-gap-but-americas-real.html#:~:text=China%20will%20likely%20match%20U,overreacting%20to%20predictable%20Chinese%20advancements">This piece</a> from RAND lays out the total compute view.</figcaption></figure></div><h4><em>Renting Out Inference</em></h4><p>Second, you might argue that not exporting the H20 does not mean leaving Chinese inference capacity strapped &#8211; because <strong>you&#8217;ll just <a href="https://www.rand.org/pubs/commentary/2025/08/america-should-rent-not-sell-ai-chips-to-china.html">rent out H20 capacity instead</a>.</strong> I really like this suggestion, and think it makes for an elegant response to much of the current policy noise. But I still think that, from the Chinese perspective, it makes for an unsatisfying consolation. 
If you do believe in this outsized importance of compute for running your economy, you don&#8217;t want your inference to be serviced from <a href="https://writing.antonleicht.me/p/datacenter-delusions">datacenters abroad</a>, liable to be switched off at a moment&#8217;s notice. </p><p>In some sense, the &#8216;renting H20&#8217; proposal is a good illustration of why it&#8217;s valuable to retrace the strongest version of the pro-export argument: Renting is a very good solution if the issue is &#8216;how do we ban exports while allowing Nvidia to still make a lot of money&#8217;, following the cronyism view. It&#8217;s somewhat less effective at solving the problem if it&#8217;s &#8216;how do we not rock the boat that&#8217;s taking us across the AI race finish line&#8217;, because it does pose a challenge to current Chinese AI strategy.</p><h1>The Chinese Exports Problem</h1><p>That said,<strong> I do think there is a weighty argument against H20 exports on the administration&#8217;s own logic: they enable a Chinese export product. </strong>One of the greatest threats to the market-share-focused, &#8216;flywheel&#8217;-inducing AI foreign policy strategy is a competing export product. If China develops an exportable stack itself, it can compete with the US, undercut prices, threaten global software lock-in and more &#8211; thereby threatening the US negotiation position for export promotion as well as the value of getting these exports right.</p><p>But <strong>China&#8217;s export stack will have to be Chinese-built compute</strong>; otherwise it&#8217;s no viable competitor, is not under full Chinese control, and doesn&#8217;t cash in on the same flywheel merits as the US product. Yet it seems unlikely that China would prioritise using its home-built compute for exports if it were still struggling to meet inference demand at home. </p><p>This domestic demand will be serviced by a mix of domestic and Nvidia chips. 
So for every H20 exported today, you free up a comparable Huawei chip for export in the future. From the mercantilist, export-focused US perspective, exporting the H20 still hurts: not because it&#8217;s particularly useful to China or harmful to US interests, but because H20s free up Chinese chip production to build an export product. One of the big reasons why the US would be favoured in the current version of the AI race is that it&#8217;s first to the global export party: the US stack can proliferate while China struggles to meet domestic demand. Relaxing this constraint on China by <strong>exporting H20s threatens to undercut US advantage in the export race. </strong>If that happened, you might suddenly be less optimistic about whether the US leads the race &#8211; and might have to reconsider your H20 strategy as well.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!EQnP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!EQnP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 848w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1272w, 
https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png" width="326" height="81.5" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1000,&quot;resizeWidth&quot;:326,&quot;bytes&quot;:25136,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://writing.antonleicht.me/i/173739094?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!EQnP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 424w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 848w, 
https://substackcdn.com/image/fetch/$s_!EQnP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1272w, https://substackcdn.com/image/fetch/$s_!EQnP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9f9cf9b9-d428-4074-938d-b2b40973afc6_1000x250.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h1><strong>Outlook</strong></h1><p>All this is a <strong>very specific chain of arguments that requires accepting a host of contentious premises</strong>: around the current shape and trajectory of the race, about America&#8217;s odds in that race, and about the likely development of Chinese AI ambitions and strategies. But by many public accounts, <strong>the current administration does accept these premises</strong>. If you agree, I think it&#8217;s not too difficult to follow them to the conclusion of exporting inference chips &#8212; especially if you factor in the revenue gains that surely play a role as well.</p><p>That said, I&#8217;m still <strong>not quite sure if any of that means exporting the H20 is a good idea.</strong> For one, I do think enabling Chinese exports is seriously at odds with the administration&#8217;s agenda. And beyond that, trading off the above arguments against the many good points made in favour of restricting exports is very difficult, and certainly beyond the scope of this post.</p><p>But<strong> I don&#8217;t mean to persuade you one way or the other</strong>, only to invite you to consider an alternative explanation of the administration&#8217;s current decisionmaking. 
For detractors, that might mean focusing their spirited objections elsewhere: on the export argument, on the tradeoffs between giving China more inference now or a more integrated stack later, or on explaining why China&#8217;s reaction to a strategic change would not induce volatility. </p><p>A more doctrine-forward conversation around this question might ultimately also be in the interest of export supporters. At least, I&#8217;m not sure how tenable it is for the administration to continue offering very few substantive comments on the export strategy, and let the void be filled by Nvidia&#8217;s PR antics instead. I remain convinced that <strong>somewhere within this discussion on the H20, there&#8217;s a real debate to be had.</strong></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[AI, Jobs, and the Rest of the World]]></title><description><![CDATA[No one but America has the leverage to get AI labor policy right]]></description><link>https://writing.antonleicht.me/p/ai-jobs-and-the-rest-of-the-world</link><guid isPermaLink="false">https://writing.antonleicht.me/p/ai-jobs-and-the-rest-of-the-world</guid><dc:creator><![CDATA[Anton Leicht]]></dc:creator><pubDate>Tue, 09 Sep 2025 12:54:33 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!SEXO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Isma&#8217;il Pasha was sure he had cracked the code of policy arbitrage. 
In 1863, he endeavoured to modernize Egypt&#8217;s failing economy and military through massive Europeanisation: schools and railways, boulevards and line infantry. Without a European-scale tax base, the plan soon collapsed: Egypt was placed under the humiliating external supervision of its creditors, and Isma&#8217;il was deposed.</em></p><div><hr></div><p>Serious people in America are thinking more and more about reasonable research to assess AI labor market impacts, and reasonable policy to shape them. This is very good news for the 350 million citizens of the US, and I feel tentatively optimistic about their AI and jobs prospects. However, <strong>most of this research and all of the policy ideas work only in America</strong> &#8212; and could often make things worse for other countries. International policymakers&#8217; default of observing US trends and decisions on AI and modeling their own approaches in response is set up to fail. They face the kind of rough awakening that befell poor Isma&#8217;il. </p><p>That is because basically <strong>any proposal on AI &amp; jobs that hinges on either regulation or redistribution works entirely differently in the rest of the world than in the US: </strong>Only the US retains an enduring fiscal base in an AI age; only the US commands substantial regulatory leverage over AI development. 
As a result, I think one of the most likely futures holds a US economy with high employment of AI-augmented workers, while many other global labor markets are caught off guard and severely disrupted by US-run AI agents.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://writing.antonleicht.me/subscribe?"><span>Subscribe now</span></a></p><p>This is a particularly big problem because it messes with AI policy&#8217;s implicit arrangement that the US is the policy research lab for the world. Clued-in <strong>policymakers anywhere are looking to the US </strong>for trends, ideas, and canaries in the coalmine of AI and jobs, shaping their responses based on the trends they observe. And among researchers and advocates<strong>, the best and brightest flock to the US</strong>, identifying correctly that the most important jurisdiction offers the largest lever to pull. In exchange, <strong>they figure out what works best to make AI go well anywhere</strong>, report back to the provinces, and the world learns to get all this right. <strong>This all dramatically breaks down in this specific case</strong>: Any policymaker who looks to the US to learn what to do about AI &amp; jobs is doomed to fail. No ideas we might come up with on labor policy translate well to any other jurisdiction.</p><p>Three reasons drive this: Most <a href="http://writing.antonleicht.me/p/a-roadmap-for-ai-middle-powers">middle powers</a> are <em>more exposed to disruption, have less regulatory leverage, and cannot expect an influx of tax revenue from AI progress</em>. If there&#8217;s any way out, it runs through getting exports, imports and strategic leverage right. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!SEXO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!SEXO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 424w, https://substackcdn.com/image/fetch/$s_!SEXO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 848w, https://substackcdn.com/image/fetch/$s_!SEXO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 1272w, https://substackcdn.com/image/fetch/$s_!SEXO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!SEXO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png" width="1456" height="989" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:989,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!SEXO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 424w, https://substackcdn.com/image/fetch/$s_!SEXO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 848w, https://substackcdn.com/image/fetch/$s_!SEXO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 1272w, https://substackcdn.com/image/fetch/$s_!SEXO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c9a9985-a994-4284-b4a2-e602eb13efb9_1600x1087.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><figcaption class="image-caption">&#8216;Poverty and Wealth&#8217;</figcaption></figure></div><h1><strong>Greater Exposure</strong></h1><p><strong>First, I think there are good reasons to believe that disruptive job effects from AI are likely to hit non-US countries earlier and harder</strong>. Two important predictors for this trend are the <em>degree of individual augmentation</em> and <em>economy-wide AI literacy</em>.</p><p>By degree of individual augmentation, I mean to say: <strong>A labor force that has adopted helpful AI technologies to boost its output is less susceptible to outright displacement by AI agents. </strong>The efficiency gaps between augmented workers and agents are much smaller, making the lump investment of changing gears to agents much less attractive. This is a simplified version of the story &#8211; more effective augmented workers reduce a company&#8217;s demand for new hires, for instance. 
But still, I maintain that in a growing economy, it makes a big difference if a new company pays for 10 AI agents or hires 10 AI-augmented employees. I think the odds are good that America is set to lead on AI adoption, and therefore will have a more agent-resistant workforce in due time. This is mostly for two reasons: because the US has by-default access to, and very high awareness of, the tech itself; and because its policy environment is much more on the ball when it comes to enabling AI uptake &#8211; the <a href="https://www.whitehouse.gov/wp-content/uploads/2025/07/Americas-AI-Action-Plan.pdf">AI Action Plan</a> in particular reads like a dream come true for fans of augmentation-boosting policy.</p><p>By economy-wide AI literacy, I mean <strong>how quickly the overall economy adopts AI,</strong> not as a matter of individual users using AI, but through identifying opportunities for new businesses made possible by advanced AI capabilities. Even if AI agents disrupt your existing labor market, there will be ample opportunity for humans to find employment through novel applications of these agents &#8211; at least for a time. As with past technological revolutions, these jobs will be created where entrepreneurs come up with ideas to effectively deploy these new capabilities. I&#8217;m likewise optimistic about America&#8217;s comparative ability here, both for the reasons above and because of America&#8217;s astonishing track record with tech startups over the last two decades. For all the reasons that everyone building a start-up wants to move to America, <strong>I suspect that many jobs created from novel deployment of AI capabilities will be created in the US.</strong></p><p><strong>What about task profiles? </strong>Is the US perhaps more exposed because of its greater share of white-collar workers? I think this is a reasonable claim on the margins, but I&#8217;m unsure it matters a great deal. 
At a high level, the US does not actually have a much larger share of white-collar or information workers than many other major economies, and where it does, it offsets this with higher-skilled white-collar workers, who seem harder to displace and easier to augment. Beyond such high-level distinctions, I don&#8217;t feel we know enough about sector-by-sector displacement to make strong statements about economies&#8217; respective exposure based on their makeup.</p><p>Everywhere in the world, <strong>job profiles will shift and some jobs will be lost as a result of AI deployment</strong>, in what I&#8217;ve described <a href="http://antonleicht.substack.com/p/ai-and-jobs-two-phases-of-automation">before</a> as &#8216;phase one&#8217; of AI job disruption. I&#8217;ve said that the employment and growth effects of that phase would be neutral to positive, but I think this headline trend will conceal stark regional differences. The US is ahead of the field on two important countervailing trends: augmenting workers to compete with agents, and creating jobs through using AI. As a result, the natural trend of AI-driven labor displacement will hit earlier and harder in the least prepared places elsewhere in the world. How might they react? I&#8217;ve written before that, once disruption happens, two responses seem feasible in the abstract. First, you might introduce social and labor policy to cushion the effects of unemployment and redeploy your workforce; and second, you might want to directly address AI systems and their impacts through policy. The US is in principle empowered to do either, but most other countries will struggle with this.</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;ef9c5350-ad68-444a-99b6-efed71cec978&quot;,&quot;caption&quot;:&quot;Four weeks ago, I argued that we&#8217;re unable to propose useful policy on AI-driven job disruption. 
Today, I take a short follow-up look at one specific misunderstanding that has contributed to the lack of effective policy development. The misunderstanding arises from two ostensibly rivaling views of AI automation that feature prominently in the political &#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;AI &amp; Jobs: Two Phases Of Automation&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:113003310,&quot;name&quot;:&quot;Anton Leicht&quot;,&quot;bio&quot;:&quot;I write about the policy, politics, and political economy of advanced AI.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/75422da7-aafa-42ab-8fa6-cf4f0df85cf0_3166x3166.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-07-09T11:07:38.043Z&quot;,&quot;cover_image&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f5c94ec0-f7ca-4e4d-8c8b-77b08219185d_1920x1892.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/ai-and-jobs-two-phases-of-automation&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:167882437,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:27,&quot;comment_count&quot;:5,&quot;publication_id&quot;:null,&quot;publication_name&quot;:&quot;Threading the Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><h1><strong>No Taxable Bases</strong></h1><p><strong>Second, the US will have enough tax revenue for social 
welfare interventions, but many other countries won&#8217;t. </strong>Many policy researchers&#8217; favorite proposals cost a lot of money. Most obviously, any policy of redistribution - whether it&#8217;s money, GPUs, or something else - requires getting that money from somewhere. Enabling widespread retraining and redeployment of your workforce is likewise costly; and smoothing over the broader implications of rural-urban shifts, intermediary unemployment, readjusting education, and whatnot is of course also expensive. Social policy enthusiasts like to point to the New Deal as the prime example of policy-supported economic transformation gone well. Perhaps it was &#8211; but just look at <a href="https://www.ssa.gov/policy/docs/ssb/v30n12/v30n12p3.pdf">how expensive</a> it was:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!inTF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!inTF!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 424w, https://substackcdn.com/image/fetch/$s_!inTF!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 848w, https://substackcdn.com/image/fetch/$s_!inTF!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 1272w, 
https://substackcdn.com/image/fetch/$s_!inTF!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!inTF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png" width="720" height="390" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:390,&quot;width&quot;:720,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!inTF!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 424w, https://substackcdn.com/image/fetch/$s_!inTF!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 848w, https://substackcdn.com/image/fetch/$s_!inTF!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 1272w, 
https://substackcdn.com/image/fetch/$s_!inTF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0094172-9ecd-4cec-9daf-3d00b55a6537_720x390.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p><strong>If you have ambitious plans for labor policy, you&#8217;ll need a lot of money. </strong>If you ask where you&#8217;ll get that money nowadays, a frequent response is: &#8216;AI-driven economic growth will be very high, so taxation revenue will increase and enable expansive social policy&#8217;. 
On a very high level, that&#8217;s true: the total amount of global revenue will dramatically increase. However, two effects make it unlikely that most countries can <em>tax</em> most of that revenue: the revenue moves both within and between economies.</p><h4><em>Inter-Economic Shifts: Revenue moves to the US</em></h4><p><strong>Revenue moves between economies, mostly to the US, because that is where most of the growth is likely to manifest.</strong> You&#8217;d expect an &#8216;AI revolution&#8217; to make AI companies rich first and foremost &#8211; and the best AI companies are in America. Some people disagree with that intuition, arguing that the commodification of AI models could stave off revenue agglomeration on the developers&#8217; side. I think this argument is true for general AI models, but likely much less true for the kinds of AI agents we would expect to be required for big job market effects. The path to highly capable agents seems to run through constructing sophisticated, bespoke reinforcement learning environments that make them excel at specific task profiles &#8211; and AI developers are taking different approaches to training their respective models. I find it hard to believe that the resulting suite of agent products would be as easily commoditised as AI models, which are admittedly all very similar.</p><p>But <strong>even if agents were commoditised, I don&#8217;t think it follows that growth distributes equitably. </strong>Another candidate for capturing most revenue is AI infrastructure: semiconductor companies and cloud computing providers. Given the massive shortages in global compute supply and the fact that these shortages might well persist in the face of inference-hungry AI agents, I expect much AI-driven growth to accumulate around AI infrastructure. 
This again means an outflow to America, where the leading chip company sits, and which favours an export strategy that retains US market share.</p><h4><em>Intra-Economic Shifts: Income to Corporate Tax</em></h4><p>And even the portion of revenue that does not move to the US will not necessarily remain easily taxable. <strong>Within economies, revenue might shift from labor to corporations:</strong> economic value will be provided less through the labor of individual workers, where it is taxed through income taxes, and more through AI agents &#8211; software products whose revenue is ultimately taxed at corporate rates. However, <strong>corporate taxes are generally much lower than income taxes, so the shift hurts overall revenue. Net growth can still lead to a net loss in tax revenue, because a smaller share of it is captured by taxes. </strong>This is hard to fix because corporate taxes are also much more difficult to adjust &#8211; even small changes can have big effects on economies, stock markets, investments, and future economic activity. To make matters worse, a corporate tax rate hike would also hit any business that <em>doesn&#8217;t</em> use AI well, likely creating further economic disruption. Could you circumvent that with tech-specific taxes? 
No, for reasons spelled out further below.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!AZ0R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AZ0R!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 424w, https://substackcdn.com/image/fetch/$s_!AZ0R!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 848w, https://substackcdn.com/image/fetch/$s_!AZ0R!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 1272w, https://substackcdn.com/image/fetch/$s_!AZ0R!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!AZ0R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png" width="1024" height="851" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:851,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!AZ0R!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 424w, https://substackcdn.com/image/fetch/$s_!AZ0R!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 848w, https://substackcdn.com/image/fetch/$s_!AZ0R!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 1272w, https://substackcdn.com/image/fetch/$s_!AZ0R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F912dfc5c-abe2-47f5-9b02-8f37a2cda5a3_1024x851.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Corporate tax tends to be much lower, but is politically sticky &#8211; which makes it hard to chase revenue.</figcaption></figure></div><p>The result is that the kind of ambitious social policy agenda that could successfully address labor market effects depends on countries benefiting from economic growth driven by AI. This, however, is far from guaranteed, and <strong>I expect many countries to be left at a </strong><em><strong>net loss</strong></em><strong> in tax revenue at a time when they&#8217;d want to expand labor policy</strong> and safety nets. Can regulation save the day? I think not:</p><h1><strong>No Regulatory Leverage</strong></h1><p><strong>Third, only the US has real regulatory leverage over AI developers </strong>and the circumstances of deployment. If your economy is not equipped to deal with AI disruption, and your fiscal setup is not equipped to address the fallout, <strong>you might want to stop what&#8217;s happening through regulation. 
</strong>There are some politically obvious mechanisms here: you can require &#8216;humans in the loop&#8217; in an attempt to save human jobs. You can regulate AI out of sensitive applications to artificially retain those jobs. You can just pass general regulation on AI systems that makes them harder to deploy, harder to use, and more burdensome to switch to &#8211; and thereby save jobs by slowing diffusion. Alternatively, you can use regulation as a pretext to collect fines, thereby creating a revenue stream in lieu of taxation &#8211; a practice some have accused the EU of pursuing with its digital governance.</p><p>But can you, really? If you&#8217;re the US, of course you can &#8211; in fact, you might, which has a lot of intelligent observers worried. But if you&#8217;re anyone else, passing decelerating AI regulation will become increasingly difficult for two reasons.</p><p><strong>The most important reason is that decelerating regulation risks economic and strategic repercussions. </strong>As AI becomes more important, <em>not using AI</em> becomes less of an option. That makes AI different from past economic trends that you could just choose to skip. If you don&#8217;t have a big cut of the global tech industry today, that&#8217;s already somewhat bad &#8211; the growing divergence between European and American GDP is a testament to that fact. Most other parts of the economy were fine, though: the gap in tech did not mean a gap in the productivity of other economic sectors. But AI is different altogether: I suspect AI will become an important part of <em>most</em> economic activity, and so being behind on AI means you&#8217;re behind across most of your economy. 
If you regulate frontier AI out of your economy to stave off labor effects, you might just regulate your economy out of global competitiveness.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!H3r9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!H3r9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 424w, https://substackcdn.com/image/fetch/$s_!H3r9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 848w, https://substackcdn.com/image/fetch/$s_!H3r9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 1272w, https://substackcdn.com/image/fetch/$s_!H3r9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!H3r9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png" width="1456" height="1025" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1025,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!H3r9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 424w, https://substackcdn.com/image/fetch/$s_!H3r9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 848w, https://substackcdn.com/image/fetch/$s_!H3r9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 1272w, https://substackcdn.com/image/fetch/$s_!H3r9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5b00f529-bb30-4af3-b50a-4783bbb5cac4_1466x1032.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">In the past decade, the US has already started leaving the rest of the world behind just because of general tech &#8211; which will be marginal compared to AI.</figcaption></figure></div><p>A related effect is that unilateral <strong>decelerating regulation exacerbates labor market risks down the road</strong>. If you stop the diffusion of AI systems throughout your economy now, your workforce will be <em>less</em> augmented and less efficient in the long run. That makes it more susceptible to being outcompeted by autonomous agents or by augmented workforces later on &#8211; and you can&#8217;t forcibly keep your economy from eventually gravitating toward efficient global optima in the long run.</p><p>The second, slightly more contingent effect is <strong>US pressure not to regulate</strong>. The current administration, in particular, has made it very clear that it will not tolerate extensive foreign regulation of its AI developers. 
It&#8217;s willing to defend this perspective through leverage in other areas, for example, in trade policy or with regard to military and intelligence support. And I suspect it will press the point just as ferociously by leveraging AI access, when the time comes. Currently, the US is poised to dominate the global intelligence flow &#8211; with US-built models hosted on US-built data centers run by US companies. Continued access to frontier capabilities will depend on remaining in America&#8217;s good graces, and regulating its AI developers is a surefire way to endanger that in the current political climate.</p><p>This might shift once administrations turn toward a more internationalist view &#8211; but who&#8217;s to say when that will be? And even a future Democratic administration might be reluctant to relax US pressure against regulation: in the face of geopolitical competition, actively taking a step to allow decelerating external regulation won&#8217;t seem like a wise policy or effective politics. Countries the world over are still holding out some hope of returning to a pre-2016 US relationship. On AI (and most other things), I think it&#8217;s not happening.</p><p>The upshot is that <strong>you can&#8217;t regulate your way out of AI disruption unless you&#8217;re willing to accept economic disadvantages and risk strategic retaliation. </strong>At this point in the argument, inevitably someone will tell me &#8216;that&#8217;s why Europe has to get serious this time around&#8217;. 
On strategic terms, I agree &#8211; on practical terms, please check in on how that&#8217;s going.</p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;964dea6a-6c5f-44c1-a662-e018487940c0&quot;,&quot;caption&quot;:&quot;When the Goths crossed the Danube in 376 and devastated the eastern provinces, Roman leadership was not very concerned. The western capital&#8217;s walls held, and the Goths were far away. Only when the inflow of fealty, recruits and goods from the ruined provinces dried up did leadership realise how much the safety of the empire had depended on its periphery&#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;sm&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;The Awareness Gap&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:113003310,&quot;name&quot;:&quot;Anton Leicht&quot;,&quot;bio&quot;:&quot;I write about the policy, politics, and political economy of advanced AI.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/75422da7-aafa-42ab-8fa6-cf4f0df85cf0_3166x3166.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2025-06-17T12:51:23.442Z&quot;,&quot;cover_image&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f792d88e-2b86-40aa-8a2b-649a02401463_629x924.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://writing.antonleicht.me/p/the-awareness-gap&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:166145880,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:16,&quot;comment_count&quot;:3,&quot;publication_id&quot;:null,&quot;publication_name&quot;:&quot;Threading the 
Needle&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!4SKU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd7dff89d-5b66-4169-8ef8-eb2ec9ed94e8_1024x1024.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><h1><strong>The Role of US Policy</strong></h1><p>Instead of placing hope in unserious actors getting serious about AI, you might hope that the US government does something to its AI developers to help the middle powers &#8211; either on purpose or by accident.</p><p><strong>The US is very unlikely to choose to help middle powers</strong> as the effects of widespread AI deployment rip through their labor markets. Some policy research suggests US companies should be taxed to fund not only welfare spending in the US, but also abroad &#8211; for reasons that should be obvious from the administration&#8217;s general policies, I don&#8217;t see that happening anytime soon. The US won&#8217;t be of much help with the revenue problem, and for good economic reason: If US companies build a product that outcompetes foreign workers, these foreign workers are not America&#8217;s responsibility any more than displaced auto workers in Detroit were the world&#8217;s responsibility.</p><p>I also don&#8217;t see the US intervening in AI development for the sake of others, i.e., because AI is destabilising foreign labor markets. Quite the opposite: If the rest of the world is hit first, the US might be particularly disincentivised to do <em>anything</em> about AI labor disruption. 
Quite quickly, the story could become that AI is helping effectively &#8216;reshore&#8217; jobs: White-collar jobs that have been lost to the US economy are returning &#8211; not to be carried out by US workers, but at least by US AI agents, promising tax revenue and a sense of retributive justice.</p><p>On policy, <strong>if things get bad enough in America, could there be a positive spillover from US interventions that also helps the rest of the world?</strong> The US will be interested in solving its own labor market problems, and would probably intervene if disruption were to mount. As argued above, AI might just go fairly well for the US job market specifically. But even if it doesn&#8217;t, it&#8217;s far from certain that US-specific policy solutions would actually help the rest of the world. The only scenario in which that might be true is if the US policy response is <em>slowing down</em> AI development &#8211; not diffusion &#8211; to reduce job market effects. But that seems unlikely for geopolitical reasons: The administration is still committed to seeing through a technological competition with China, and hitting development with decelerating policy interventions is a surefire way to lose that. They&#8217;ll try other solutions first, whether that&#8217;s welfare, retraining, or slowing domestic diffusion &#8211; none of which help anyone outside the US.</p><p>There is one silver lining: The US is aiming to execute an ambitious export promotion strategy that could close the widening deployment gaps that have motivated my argument. 
More on the promise and pitfalls of this proposed plan will feature in this publication soon.</p><p>But for now, in general terms, the lesson remains: Not only can you not rely on US policy to fix your problem; <strong>from the administration&#8217;s perspective, exchanging foreign jobs for US agent revenue might often be a feature, not a bug.</strong></p><h1><strong>Facing the Music</strong></h1><p>What&#8217;s the key takeaway from all this? First, <strong>I&#8217;d really like for the AI policy ecosystem to take this question much, </strong><em><strong>much</strong></em><strong> more seriously.</strong> My sense is that no one is on the ball on the specific issue of job disruption in non-US labor markets. That&#8217;s true for many areas of AI policy as they relate to middle powers &#8211; but it&#8217;s a much bigger issue here, because the solutions spill over so much less. I regret leaving you with such a non-answer, but I think the challenge is profound enough to warrant an unsatisfying post; solving it will require quite a bit more firepower than this publication can provide. I do have some starting thoughts:</p><p>My general suspicion is that <strong>this conversation collapses to a geostrategic question</strong>: getting this right might be much less about finding a silver bullet labor policy, and much more about retaining leverage and participation in frontier AI. That&#8217;s a broader and trickier policy ask. It&#8217;s currently not addressed with enough urgency by most middle powers &#8211; but I think framing the politically salient labor issue in these strategic terms could provide another reason to change that. Policymakers famously care about jobs quite a lot, and seem generally more willing to believe that job disruption from AI is likely than they are to believe some of the other grand theories of big AI impacts. 
Pointing out to them that their economies are exposed to AI job disruption should lend some further urgency to the underlying strategic questions and motivate a more decisive response.</p><p>What would that look like if it were effective? As I&#8217;ve <a href="https://writing.antonleicht.me/p/a-roadmap-for-ai-middle-powers">outlined</a> a couple of months ago (and desperately need to comprehensively update), it has a lot to do with <strong>finding points of leverage within an AI-driven world</strong>; either up or down the supply chain. If momentary high leverage can be spun into an <em>enduring</em> share in AI-driven growth, this could plausibly solve the problem. For instance, by selling valuable data in exchange for an enduring share in AI companies&#8217; profits, you might be able to pay for your policy plans. Alternatively, providing necessary manufacturing contributions to an American-led alliance might pay well enough and require a sufficient workforce to keep up with the disruption. Solutions here will inevitably vary between countries to an extent that makes a sweeping response unfeasible.</p><p>The policy conversation on AI and labor is currently mostly happening in the US. I think it&#8217;s important for middle powers to realise that this conversation won&#8217;t help them much; and for US-focused policy researchers to realise that many of their proposals are leaving some billions of exposed workers unaccounted for. <strong>If the rest of the world follows the US on the AI labor conversation, it will be led astray.</strong></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://writing.antonleicht.me/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thank you for reading! 
Please consider subscribing.</p></div></div></div><p></p>]]></content:encoded></item></channel></rss>