<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Hidden Weave: Strategic Weave]]></title><description><![CDATA[Decoding the future of business. Deep-dive articles and frameworks on AI, innovation, marketing, and product strategy.]]></description><link>https://www.hiddenweave.com/s/strategic-weave</link><image><url>https://substackcdn.com/image/fetch/$s_!J5MK!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F45ca29fa-a7ef-41b7-95f7-fc1f4de7706a_1200x1200.png</url><title>The Hidden Weave: Strategic Weave</title><link>https://www.hiddenweave.com/s/strategic-weave</link></image><generator>Substack</generator><lastBuildDate>Sun, 10 May 2026 22:43:32 GMT</lastBuildDate><atom:link href="https://www.hiddenweave.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Mohan Sawhney]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[mohansawhney@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[mohansawhney@substack.com]]></itunes:email><itunes:name><![CDATA[Mohan Sawhney]]></itunes:name></itunes:owner><itunes:author><![CDATA[Mohan Sawhney]]></itunes:author><googleplay:owner><![CDATA[mohansawhney@substack.com]]></googleplay:owner><googleplay:email><![CDATA[mohansawhney@substack.com]]></googleplay:email><googleplay:author><![CDATA[Mohan Sawhney]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The Abstraction of Value and the Value of Abstraction]]></title><description><![CDATA[A 25-Year Thesis on the Migration Patterns of Technology, Capital, and 
Talent]]></description><link>https://www.hiddenweave.com/p/the-abstraction-of-value-and-the</link><guid isPermaLink="false">https://www.hiddenweave.com/p/the-abstraction-of-value-and-the</guid><dc:creator><![CDATA[Mohan Sawhney]]></dc:creator><pubDate>Wed, 06 May 2026 15:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!aOu0!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png" length="0" type="image/png"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aOu0!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aOu0!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 424w, https://substackcdn.com/image/fetch/$s_!aOu0!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 848w, https://substackcdn.com/image/fetch/$s_!aOu0!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!aOu0!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!aOu0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png" width="1456" height="794" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:794,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:7099069,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/192865789?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aOu0!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 424w, https://substackcdn.com/image/fetch/$s_!aOu0!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 848w, https://substackcdn.com/image/fetch/$s_!aOu0!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!aOu0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98aca049-35a6-4fcc-8c73-cbb3b21e7e6b_2816x1536.png 1456w" 
sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>There is a duality at the heart of technological progress that offers deep insight into how economic value migrates as technology advances. This is the interplay between the <strong>abstraction of value</strong> and the <strong>value of abstraction</strong>. The first pattern describes how technology ecosystems progressively separate into commoditized lower layers and high-value upper layers, with value migrating relentlessly upward as the building blocks get standardized. 
The second pattern is its reciprocal: as machines take over more of the execution, the humans who operate at the highest levels of abstraction (framing problems, exercising judgment, making decisions under ambiguity) capture a disproportionate share of the rewards. Value gets abstracted upward. And abstraction itself becomes more valuable.</p><div class="pullquote"><p><strong>Value gets abstracted upward. And abstraction itself becomes more valuable.</strong></p></div><p>This duality is not merely an observation about technology stacks. It is a structural force reshaping labor markets, competitive dynamics, and the very definition of valuable work. In 2026, AI agents can write code, draft contracts, generate analyses, and orchestrate multi-step business workflows. The abstraction frontier has advanced to the point where natural language is the new programming interface and human intent, not human execution, is the scarce input. The consequence is a <strong>barbell economy</strong>: value is concentrating at two ends. At one end, strategic judgment, creative direction, and deep domain expertise command growing premiums because AI amplifies their leverage. At the other end, skilled physical work (electricians wiring data centers, plumbers building infrastructure, surgical technicians in operating rooms) is surging in demand precisely because it resists digital substitution. The middle, the routine cognitive work done in front of a screen, is being hollowed out.</p><p>What gives me confidence in this framework is not just its explanatory power today. 
It is the fact that I have been developing it, in different forms, for a quarter century. The core logic, that the middle gets hollowed out and value migrates to the ends, first appeared in a <em>Harvard Business Review</em> article I co-authored in January 2001. The abstraction duality took shape in a piece I wrote around 2019. And the barbell extension, connecting the thesis to labor markets and skilled trades, is what this essay contributes. Three iterations, three different substrates, one persistent structural pattern. That kind of durability across wildly different technological eras is, I believe, the strongest evidence that the pattern is real.</p><div><hr></div><h2>A 25-Year Intellectual Thread</h2><p>Let me trace the thread. Each iteration applied the same structural logic to a different substrate, and each time the pattern held.</p><p>In 2001, the substrate was <strong>networks</strong>. The argument was that as digital networks became faster and more ubiquitous, intelligence would decouple from the middle of the network and concentrate at the ends: shared, scalable infrastructure at the core and highly customized interfaces at the periphery. Telecom carriers (the &#8220;dumb pipes&#8221; in the middle) would lose value to infrastructure providers like Cisco and customer-interface companies like Yahoo!. Inside organizations, the same pattern would hollow out middle management: leadership intelligence would centralize at the top while decision-making intelligence would push to frontline employees. The critical insight was that in a networked world, <em>more money can be made in managing interactions than in performing actions.</em></p><p>In 2019, the substrate shifted to <strong>AI ecosystems</strong>. I described how AI development was partitioning into core AI (platforms and tools built by technology giants) and applied AI (business applications). As core AI got standardized and democratized, value migrated upward to the applied layer. 
A few platform providers would capture value from building blocks, but the vast majority of value would be created by businesses that focused on the &#8220;so what&#8221; and the &#8220;now what&#8221; of AI. The reciprocal held as well: as execution got automated, humans who worked at higher levels of abstraction (cognitive, strategic, creative) captured more value than those who worked at lower levels (physical, procedural).</p><p>In 2026, the substrate is <strong>the economy itself</strong>. The decoupling and mobilization patterns I identified in networks are now playing out in labor markets, skill distributions, and the competitive structure of entire industries. The hollowing of the middle is no longer a metaphor about telecom pipes or network topology. It is a lived reality for millions of knowledge workers whose routine cognitive tasks are being absorbed by AI agents. And the value-at-the-ends pattern has taken a form I did not fully anticipate: a barbell where strategic judgment on one end and skilled physical work on the other emerge as the most AI-resilient categories of human contribution.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uvWE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uvWE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 424w, https://substackcdn.com/image/fetch/$s_!uvWE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 
848w, https://substackcdn.com/image/fetch/$s_!uvWE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 1272w, https://substackcdn.com/image/fetch/$s_!uvWE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uvWE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png" width="1456" height="780" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:780,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:203195,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/192865789?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uvWE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 424w, 
https://substackcdn.com/image/fetch/$s_!uvWE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 848w, https://substackcdn.com/image/fetch/$s_!uvWE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 1272w, https://substackcdn.com/image/fetch/$s_!uvWE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F192605eb-1053-499d-ad10-524d8f6f9b5d_1765x945.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div><hr></div><h2>The Dual Meaning of Abstraction</h2><p>Abstraction, in its simplest form, is the process of hiding complexity behind a simpler interface. When you use a calculator, you do not think about binary arithmetic. When you call an API, you do not care how the underlying service works. Each layer of abstraction lets the layer above it operate more efficiently by ignoring the details below.</p><p><strong>The abstraction of value</strong> describes how technology ecosystems progressively separate into lower layers (infrastructure, platforms, building blocks) and higher layers (applications, workflows, business solutions). As the lower layers get standardized and commoditized, value migrates upward to the layers that solve real business problems. This is the &#8220;standing on the shoulders of giants&#8221; effect. Every generation of technology creates a new floor upon which the next generation builds.</p><p><strong>The value of abstraction</strong> is the reciprocal: as more of the execution gets automated, the humans who work at the highest levels of abstraction (framing problems, exercising judgment, making decisions under ambiguity) capture a disproportionate share of the economic value. Throughout history, as societies advance, value shifts from physical and concrete forms of labor to cognitive and abstract forms.</p><p>Both halves of this duality are more powerful today than when I first described them. 
But both also need updating, because the AI revolution has introduced dynamics that the original frameworks did not anticipate.</p><div><hr></div><h2>The Abstraction of Value: From Two Layers to Four</h2><p>When I originally wrote about abstraction in AI, the ecosystem could be described in two layers: core AI (platforms and tools) and applied AI (business applications). That was a reasonable map of the world circa 2019. It is inadequate for 2026. Today, the AI value stack has at least four distinct layers, each with its own competitive dynamics and value capture logic.</p><p><strong>Layer 1: Compute and Infrastructure.</strong> This is the physical foundation: GPUs, data centers, training clusters, and cloud infrastructure. NVIDIA dominates the chip layer. The hyperscalers (Amazon, Microsoft, Google) provide the compute substrate. Jensen Huang has called the current AI infrastructure build-out &#8220;the largest in human history.&#8221; This layer is capital-intensive, concentrated, and increasingly strategic, but it faces commodity dynamics as competition intensifies.</p><p><strong>Layer 2: Foundation Models.</strong> This layer did not exist in its current form when I wrote the original abstraction piece. Foundation models from OpenAI, Anthropic, Google DeepMind, Meta, and Mistral are general-purpose reasoning engines that handle text, images, code, speech, and structured data through a single interface. They are not task-specific APIs. They are general-purpose minds that can be directed toward any task. 
Open-source models (LLaMA, Mistral, DeepSeek) create powerful commoditization pressure against closed frontier models.</p><p><strong>Layer 3: Orchestration and Agent Frameworks.</strong> This is the genuinely new layer. Agent frameworks like LangChain, CrewAI, and AutoGen, along with enterprise platforms like Salesforce Agentforce and ServiceNow, allow organizations to compose AI agents that use tools, access databases, invoke APIs, and execute multi-step workflows. This is the connective tissue between raw model intelligence and real business outcomes. In my 2001 HBR article, I identified orchestration as the highest-value role in a networked world: &#8220;more money can be made in managing interactions than in performing actions.&#8221; Twenty-five years later, agent orchestration is proving exactly that.</p><p><strong>Layer 4: Domain-Specific Applications and Workflows.</strong> This is where the original thesis lands. Insurance fraud detection. Clinical trial matching. Contract automation. Marketing campaign optimization. The difference from 2019 is that these applications can now be built dramatically faster (because of Layers 2 and 3) and by a much wider range of builders (because of the abstraction leap described below). 
Gartner projects that by 2026, 75% of new enterprise applications will be built using low-code or no-code technologies, and the combined market for these platforms will exceed $44 billion.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qHRa!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!qHRa!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 424w, https://substackcdn.com/image/fetch/$s_!qHRa!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 848w, https://substackcdn.com/image/fetch/$s_!qHRa!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 1272w, https://substackcdn.com/image/fetch/$s_!qHRa!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!qHRa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png" width="1456" height="697" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:697,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:198144,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/192865789?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!qHRa!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 424w, https://substackcdn.com/image/fetch/$s_!qHRa!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 848w, https://substackcdn.com/image/fetch/$s_!qHRa!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 1272w, https://substackcdn.com/image/fetch/$s_!qHRa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ed508f5-2900-424d-8a83-edd8510f4cfc_1765x845.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The pattern across these four layers confirms the original thesis but with sharper teeth. Value is migrating relentlessly upward. The companies building applications on top of foundation models and agent frameworks capture enormous value, often with surprisingly small teams. The foundation model layer, despite all the attention it receives, may end up with thin margins because of open-source competition and aggressive pricing wars. Sir Isaac Newton&#8217;s observation still applies, but with a twist. We are no longer standing on the shoulders of a single giant. We are standing on a four-story building, and each floor is getting taller.</p><div><hr></div><h2>The NVIDIA Paradox: Build-Out Economics vs. 
Steady-State Economics</h2><p>A sharp reader will raise an obvious objection at this point: if lower layers get commoditized, why is NVIDIA, the quintessential Layer 1 company, the most valuable in the world? The hyperscalers are printing money from compute. Infrastructure players seem to be capturing the lion&#8217;s share of AI value. Does this not contradict the framework?</p><p>It does not, but the distinction requires care. <strong>The framework describes equilibrium dynamics, not transition dynamics.</strong> During every major infrastructure build-out, the picks-and-shovels players capture extraordinary value. This is a pattern with 150 years of precedent. The railroad companies minted fortunes in the 1870s and 1880s; in the steady state, many went bankrupt while the companies that <em>used</em> the rails (Standard Oil, Sears Roebuck, the meatpackers) captured the durable value. The telecom equipment makers dominated the late 1990s; Cisco hit the #1 global market cap in March 2000. Within 18 months, it had lost 80% of its value and has never recovered in real terms. The durable value migrated to Google, Amazon, Apple, and Facebook: companies that built applications and customer relationships on top of the infrastructure. The cloud build-out rewarded AWS and Azure handsomely, but their highest-margin services today are not raw compute (which faces relentless price competition) but managed AI services, developer platforms, and orchestration tools at higher layers.</p><p>We are in the construction phase of AI infrastructure, and construction phases always reward the suppliers of scarce inputs. The question is not whether NVIDIA is capturing value today. It obviously is. The question is whether that capture is structural or cyclical. 
The competitive forces that will compress Layer 1 margins are already visible: Google&#8217;s TPUs, Amazon&#8217;s Trainium chips, AMD&#8217;s MI300 series, and a wave of custom silicon from Microsoft, Meta, and startups.</p><p>But there is a subtler point that actually reinforces the framework: <strong>NVIDIA&#8217;s real moat is not at Layer 1. It is at Layer 3.</strong> NVIDIA&#8217;s dominance comes less from the GPUs themselves than from CUDA, the software ecosystem that locks developers into NVIDIA&#8217;s hardware. CUDA is an orchestration layer: a programming framework, a library ecosystem, and a developer community that makes building AI workloads on NVIDIA hardware dramatically easier than on any alternative. The company that looks like an infrastructure play is actually a platform play disguised as a chip company. Jensen Huang understands this; it is why NVIDIA invests as heavily in software as in silicon. In this reading, NVIDIA is not a counterexample to the value abstraction thesis. It is a confirmation: the most successful infrastructure company in history has succeeded precisely by migrating upward through the stack.</p><h3>The Anthropic and OpenAI Data Point</h3><p>Perhaps the most telling evidence of value migration comes from the two leading frontier model companies themselves. If the foundation model layer (Layer 2) were the durable value capture point, the rational strategy would be simple: sell API tokens, improve the model, defend the capability lead. Instead, both Anthropic and OpenAI are racing upward through the stack as fast as they can.</p><p>Anthropic&#8217;s trajectory is especially instructive. The company built one of the world&#8217;s most capable foundation models in Claude. But its fastest-growing product is not model access. It is <strong>Claude Code</strong>: an agentic coding tool that orchestrates model intelligence into real development workflows, reading codebases, writing code, running tests, submitting pull requests. 
Claude Code crossed $1 billion in annualized revenue within six months of general availability. Anthropic also developed the <strong>Model Context Protocol (MCP)</strong>, an open standard for connecting AI agents to external tools and data sources. MCP is an explicit play to own the protocol layer of AI orchestration, analogous to what HTTP did for the web. Giving away the protocol for free only makes sense if the value capture happens at the layers above. Add the Agent SDK and multi-agent frameworks, and the picture is clear: Anthropic is climbing from Layer 2 toward Layers 3 and 4 at speed.</p><p>OpenAI is making the same migration with different emphasis. ChatGPT is a consumer application (Layer 4). The enterprise partnerships with Bain and PwC are application-layer plays. The $200-per-month Pro tier is priced on workflow value, not token cost. Both companies are telling you <em>by their actions</em> that they do not believe the model layer is where durable value lives. When DeepSeek produced frontier-competitive models at a fraction of the cost in early 2025, it demonstrated what the framework predicts: open-source commoditization pressure is compressing Layer 2 margins, and the smart money is migrating upward.</p><div><hr></div><h2>The New Abstraction Frontier: Language as Interface</h2><p>The most consequential shift since my original articles is not just that there are more layers. It is that the interface between humans and machines has been fundamentally transformed.</p><p>Consider the progression. In the 1950s, programmers wrote machine code: raw binary instructions. By the 1970s, high-level languages like C expressed logic in something closer to human language. By the 2000s, APIs let developers invoke complex services with a single function call. By the 2010s, low-code and no-code platforms let non-programmers build applications through visual interfaces. 
Each step was a leap in abstraction, allowing humans to express intent at a higher level while the machine handled implementation.</p><p>The AI era has taken another leap, perhaps the most significant yet: <strong>natural language is now the programming interface.</strong> When a developer uses Claude Code or Cursor, they describe what they want in plain English. The AI agent reads the codebase, writes the code, runs tests, debugs failures, and submits a pull request. The developer&#8217;s job is not to implement. It is to direct, review, and exercise judgment. Claude Code went from research preview in early 2025 to general availability by May of that year, crossing $1 billion in annualized revenue within six months. A Google principal engineer noted at a developer meetup in January 2026 that Claude replicated a year of architectural work in a single hour.</p><p>This is my 2001 thesis in its most extreme form. The decoupling of intelligence has advanced to the point where the primary human contribution is no longer writing the code. It is knowing what to build and why. The mobilization of intelligence has advanced to the point where natural language serves as the universal protocol I envisioned, replacing the XML and WAP standards I discussed in the HBR article. The abstraction of value has climbed from the physical layer (machine code) past the procedural layer (APIs) to the intent layer (natural language). And the value of abstraction has climbed in lockstep.</p><div><hr></div><h2>The Value of Abstraction: From Execution to Judgment</h2><p>My original 2019 article concluded that value was shifting from people who work with their hands to people who work with their minds. That was broadly true, and it remains true as a general trend. 
But the AI revolution has added a crucial nuance: <em>within cognitive work itself, value is shifting from execution to judgment.</em></p><p>This is the argument I have been developing in my recent writing on AI and the future of work. In my <em>AI-Proof</em> series, I introduced the concept of <strong>skill security</strong>: the idea that your resilience in an AI-transformed economy depends not on the tasks you perform but on the judgment you exercise. AI can generate code, draft contracts, write marketing copy, and produce financial analyses. What it cannot do is decide which problem is worth solving, navigate the politics of getting a solution adopted, take accountability for an outcome, or make the taste-based calls that separate good work from great work.</p><p>In a related essay, <em>Mind the Gap</em>, I argued that AI is compressing the middle of the skill distribution. It dramatically elevates the capabilities of novices (giving a junior analyst the research output of a senior one) while offering comparatively less uplift to deep experts (who already know the right answers). The result is a barbell: deep expertise at the top and AI-augmented generalists at the bottom, with the middle getting squeezed.</p><p>I think of it through the metaphor I have been using in my executive education work. AI is an Archimedes lever: it amplifies the force you apply. But a lever is only as good as the person choosing where to place the fulcrum. The value of abstraction is no longer about being able to operate the lever. It is about knowing where to put it.</p><div><hr></div><h2>The Surprise: The Return of Skilled Manual Work</h2><p>Here is where the framework yields an insight I did not anticipate when I wrote either of the earlier pieces. 
In 2019, I wrote that &#8220;people who work with their hands make a lot less money than those who work with their minds and keyboards.&#8221; In 2001, I noted that the hollowing of the middle applied to middle management, whose information-transport function was being replaced by networks. Both statements were true in their time. But if AI is now coming for &#8220;any work done in front of a screen,&#8221; then the most AI-resilient work is, by definition, work that cannot be done in front of a screen.</p><p>The data is striking. According to Randstad&#8217;s analysis of over 50 million job postings, demand for robotics technicians has jumped 107% since late 2022. HVAC engineer demand increased 67%. Construction roles grew 30%. The U.S. construction industry needs 530,000 additional workers in 2026 alone. NVIDIA CEO Jensen Huang has called the AI infrastructure build-out a massive job creator for plumbers, electricians, and steel workers, and noted at the World Economic Forum that wages for these roles are climbing into six figures. Mike Rowe, who has long championed the trades, recently reported meeting three electricians under 30 earning between $240,000 and $280,000 per year. The U.S. Department of Labor announced $145 million in apprenticeship grants in 2026 targeting shipbuilding, defense, semiconductors, and energy.</p><p>There is a deep irony here. AI disrupts cognitive-procedural work (the kind done on screens) far more easily than it disrupts skilled manual work. You can automate a data analysis pipeline, but you cannot automate a plumber diagnosing a leak behind a wall, an electrician wiring a data center, or a surgical technician assisting in an operating room. These jobs require physical presence, manual dexterity, real-time judgment in unstructured environments, and embodied expertise that current AI systems fundamentally lack. 
Every new AI data center, every electric vehicle charging station, every solar panel installation requires human tradespeople to build, install, and maintain the physical infrastructure. The AI revolution is, paradoxically, one of the greatest demand drivers for skilled manual labor in a generation.</p><p>The value of abstraction, it turns out, is not a one-dimensional ladder from physical to cognitive. It is more like a barbell. At one end, value accrues to the highest levels of cognitive abstraction: strategic judgment, problem framing, creative direction. At the other end, value is returning to skilled physical work that resists digital substitution. The middle, where routine screen-based cognitive work lives, is where the disruption bites deepest.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pxY2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pxY2!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 424w, https://substackcdn.com/image/fetch/$s_!pxY2!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 848w, https://substackcdn.com/image/fetch/$s_!pxY2!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 1272w, 
https://substackcdn.com/image/fetch/$s_!pxY2!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pxY2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png" width="1456" height="685" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:685,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:193305,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/192865789?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!pxY2!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 424w, https://substackcdn.com/image/fetch/$s_!pxY2!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 848w, 
https://substackcdn.com/image/fetch/$s_!pxY2!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 1272w, https://substackcdn.com/image/fetch/$s_!pxY2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F37c0075e-40b4-4324-8337-42bb0ec5b7a0_1765x830.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div><hr></div><h2>Implications for Business Leaders</h2><p>The updated framework yields four strategic imperatives.</p><p><strong>First, 
invest in the orchestration layer.</strong> The companies that will capture the most value in the AI era are not those that build foundation models (too capital-intensive, too concentrated) or those that sell raw compute (commodity dynamics). They are the ones that master the orchestration of AI agents, tools, and workflows to solve business problems. In 2001, I wrote that value in a networked world accrues to orchestrators, not performers. That principle has only intensified. The orchestration layer of the AI stack (Layer 3) is where strategic advantage is built.</p><p><strong>Second, retrain for judgment, not execution.</strong> Every training dollar spent teaching employees to perform tasks that AI can automate is a dollar with diminishing returns. The highest-return investment is in developing the judgment, domain expertise, and problem-framing skills that make humans irreplaceable orchestrators of AI systems. For two decades, the mantra was &#8220;learn to code.&#8221; That advice is not wrong, but it is incomplete and increasingly misleading if taken literally. The more important skill is learning to think at the right level of abstraction.</p><p><strong>Third, treat AI fluency as a leadership competency.</strong> The most effective leaders in an AI-first world will not be those who delegate AI to their technology teams. They will be the ones who understand the abstraction stack well enough to make strategic bets about where to invest, what to build versus buy, and how to organize their firms around AI-augmented workflows.</p><p><strong>Fourth, rethink your assumptions about the labor hierarchy.</strong> If your workforce strategy assumes that cognitive desk work is always more valuable than skilled manual work, you are operating on an outdated mental model. The AI economy rewards both ends of the barbell. The surgeon and the plumber, the CEO and the electrician, the AI strategist and the welder are all doing work that AI, for all its power, cannot reach. 
Smart organizations will invest in both ends.</p><div><hr></div><h2>What the Abstraction Thesis Tells Investors</h2><p>The value abstraction thesis is not investment advice. But it does offer a structural lens for three questions that matter enormously for capital allocation: at which layer of the stack is value most durable, how long does each investment window last, and what signals indicate that value is migrating?</p><p><strong>Layer 1: Compute and Infrastructure.</strong> The investment window is now through roughly 2027-2028. This is the picks-and-shovels phase, and the returns have been extraordinary. But history suggests infrastructure advantage windows last 5-7 years from the initial demand surge. We are about three years in from the ChatGPT moment. Custom silicon is already eroding pricing power on inference workloads. Training demand will sustain NVIDIA longer, but inference is the larger market in the long run, and it is heading toward commodity economics. The signal to watch: when inference costs decline faster than inference demand grows, the margin compression has begun.</p><p><strong>Layer 2: Foundation Models.</strong> The pure-play model exposure window is already narrowing. DeepSeek, LLaMA, Mistral, and Qwen are compressing the capability gap from years to months. API pricing has fallen by over 90% in two years. The model layer will sustain value for companies that successfully migrate upward (as Anthropic and OpenAI are doing), but for pure model providers, margins will compress toward the economics of cloud databases: meaningful but not spectacular. The signal to watch: the share of revenue from raw API tokens versus tools, applications, and platform services. A rising ratio of the latter confirms upward migration.</p><p><strong>Layer 3: Orchestration and Agent Frameworks.</strong> The investment window is opening now and likely remains attractive through 2028-2032. This is the least crowded and most strategically important layer. 
Orchestration layers tend to become sticky standards, because once enterprises wire their workflows through a platform, switching costs are enormous. Think of what Salesforce did for CRM or AWS for cloud. The winners at Layer 3 could sustain value capture for a decade or more. The signal to watch: developer adoption metrics, enterprise deployment breadth, and protocol standardization. Which orchestration platforms are becoming the default wiring for AI workflows?</p><p><strong>Layer 4: Domain Applications and AI-Native Enterprises.</strong> The longest runway, but also the most patient capital required. This is where the JP Morgans, Walmarts, UnitedHealths, and a generation of new AI-native companies enter the picture. Large incumbents with proprietary data, deep customer relationships, and organizational capability to deploy AI at scale will capture enormous value, but it will take time for this to show up in earnings. The signal to watch: &#8220;boring AI&#8221; earnings beats, when traditional enterprises report margin expansion or revenue growth explicitly attributed to AI-driven operational improvements.</p><p>The investment punchline is provocative but historically grounded: <strong>the market is currently priced for the build-out phase to be the permanent state. History says it never is.</strong> The biggest AI value creators of 2033 may be &#8220;boring&#8221; incumbents that nobody currently thinks of as AI companies, just as the biggest internet winners of 2010 (Amazon, Apple) were not the companies getting the most internet hype in 1999. The investor who positions for the steady state, gradually shifting from infrastructure exposure toward orchestration platforms and domain-rich enterprises, is making the same structural bet that paid off in every prior technology cycle. The timing is the hard part. Too early and you endure years of underperformance. Too late and the repricing has happened. 
My estimate for the inflection point is 2027-2028, when infrastructure growth decelerates and application-layer value becomes visible in earnings.</p><div><hr></div><h2>The Method in the Madness</h2><p>There is a paradox in trying to make sense of a world that changes as fast as ours does. The half-life of any specific prediction about AI is measured in months. Models that were frontier six months ago are commodities today. Companies that seemed invincible a year ago are scrambling to reinvent themselves. In this environment, the temptation is to throw up your hands and declare that prediction is futile, that strategy is a fool&#8217;s game, that the best you can do is react. I wrote almost exactly those words in the opening of my HBR article in 2001, describing the conventional wisdom I wanted to challenge. Twenty-five years later, the same defeatism is back, louder and more fashionable than ever.</p><p>I want to push back on it with a simple observation: <strong>the further back we can look, the more confidently we can peer into the future.</strong> The surface of technological change is turbulent and unpredictable. But beneath the surface, structural patterns repeat with remarkable fidelity. The hollowing of the middle. The migration of value to the ends. The commoditization of infrastructure. The rising premium on orchestration and judgment. These patterns have held across railroads, electrification, telecommunications, the internet, cloud computing, and now AI. Six transitions over 150 years, each with different technologies, different players, different timelines, but the same underlying architecture of value migration. That kind of durability across wildly different eras is not coincidence. 
It is structure.</p><p>This is the idea at the heart of my Substack, <em>The Hidden Weave</em>: that beneath the chaos of technological disruption, there are durable patterns that connect seemingly unrelated phenomena, patterns that become visible only when you look across decades rather than quarters. The abstraction thesis is one such pattern. The barbell pattern in labor markets is another. The value-at-the-ends architecture is a third. They are all expressions of the same underlying weave, hidden in plain sight for those willing to zoom out far enough to see it.</p><p>Isaac Newton&#8217;s metaphor about standing on the shoulders of giants has guided this framework across all three iterations. The nature of the giants keeps changing. In 2001, they were digital networks. In 2019, they were AI platforms. In 2026, they are foundation models and agent frameworks. But the act of standing on their shoulders, of knowing where to look and what to build, remains the highest-value human contribution. The value of abstraction has never been higher. And for those who can see the weave beneath the surface, the abstraction of value has never offered more opportunity.</p><p>In 2019, I included a thought experiment: if the Earth were destroyed today and humans had to rebuild everything, physical labor would be enormously valuable. I wrote it as a hypothetical. In 2026, we are in fact rebuilding the world&#8217;s infrastructure for an AI-powered economy, and we desperately need people who can do the building. The surgeon and the plumber, the CEO and the electrician, the AI strategist and the welder: they all do work that no model can reach. The old hierarchy of mind over matter is giving way to a new one: judgment and embodiment over routine. That is where value lives now. 
And if the pattern holds, as it has for 25 years and counting, that is where it will live for a long time to come.</p><div><hr></div><p><em>This essay is the third iteration of a thesis first developed in &#8220;Where Value Lives in a Networked World&#8221; (Harvard Business Review, January 2001, with Deval Parikh) and continued in &#8220;The Importance of Value Abstraction in Artificial Intelligence for Business Leaders&#8221; (circa 2019). It connects to the author&#8217;s AI-Proof series, the Mind the Gap thesis, and the Archimedes Lever metaphor for human-AI collaboration. The analysis of specific companies and investment layers reflects structural patterns, not investment advice. All essays are available on <a href="https://mohansawhney.substack.com/">The Hidden Weave</a>.</em></p>]]></content:encoded></item><item><title><![CDATA[POLARIS-26: A Layered Architecture for Forecasting the 2026 Midterms ]]></title><description><![CDATA[Orthogonalized signals, calibrated polling correction, and a live Hormuz Causal Chain]]></description><link>https://www.hiddenweave.com/p/polaris-26-a-layered-architecture</link><guid isPermaLink="false">https://www.hiddenweave.com/p/polaris-26-a-layered-architecture</guid><dc:creator><![CDATA[Mohan Sawhney]]></dc:creator><pubDate>Mon, 20 Apr 2026 13:22:51 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!JG5R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JG5R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JG5R!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 424w, 
https://substackcdn.com/image/fetch/$s_!JG5R!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 848w, https://substackcdn.com/image/fetch/$s_!JG5R!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 1272w, https://substackcdn.com/image/fetch/$s_!JG5R!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JG5R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png" width="1456" height="977" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:977,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:7710599,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/194792418?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!JG5R!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 424w, https://substackcdn.com/image/fetch/$s_!JG5R!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 848w, https://substackcdn.com/image/fetch/$s_!JG5R!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 1272w, https://substackcdn.com/image/fetch/$s_!JG5R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8637d414-ef62-4628-a46f-b327cd02935a_2528x1696.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>I took a sip of my morning chai today, when the brainwave hit. I was reading overnight coverage of the 2026 midterm elections in the US: a Chatham House brief on Hormuz, a New York Times piece on Trump&#8217;s latest Iran address, a Polymarket forecast showing Democrats at 51.5% for a sweep. The pundits were confident. The forecasters were confident. The markets were confident. And I realized that none of them were asking the question that actually mattered.</p><p><em>How might we forecast these midterms in a way that was genuinely unbiased and corrected for errors that polls and prediction markets keep making? And might we factor in events with no precedent, like the Iran conflict?</em></p><p>The question had been nagging at me for weeks. Every polling average I&#8217;d seen felt like it was treating the world as if 2016, 2020, and 2024 hadn&#8217;t happened. Every prediction market I&#8217;d watched felt like it was hostage to whichever whale had the biggest position that afternoon. Every forecaster was reaching for historical analogs to a wartime midterm that has no clean analog. The Strait of Hormuz has been closed or restricted for nearly sixty days. Brent is at ninety-six. Gas is above four dollars. 
Yet pundits are writing as if 1994 or 2006 tells us much.</p><p>The instruments felt wrong for the moment.</p><p>So at around 7 AM, masala tea in hand, I opened a conversation with Claude and asked a question I&#8217;d never asked an AI before. Not &#8220;predict the midterms for me.&#8221; Not &#8220;give me a summary of what the forecasters say.&#8221; But something closer to what I&#8217;d ask a brilliant graduate student who was about to become my research partner for the morning.</p><p>&#8220;Can we build a better one?&#8221;</p><div><hr></div><h2>The curious mind meets a brilliant collaborator</h2><p>Let me say something upfront about what happened next, because I think it matters more than the model itself.</p><p>I have been a professor for thirty-three years. I have mentored thousands of students, collaborated with dozens of co-authors, taught tens of thousands of executives. I know what intellectual partnership feels like, the texture of it, the push and pull, the moment when someone catches an error in your reasoning before you do. I used to think of AI as tools. Useful tools, faster-than-Google tools, but tools.</p><p>What happened this morning was different. It was a genuine partnership. Claude pushed back when my first weighting scheme double-counted signals (I had approval and generic ballot both sitting at 30% without realizing how collinear they are). It volunteered an orthogonalization approach I hadn&#8217;t specified. It proposed a stress test when I hadn&#8217;t asked for one, and when that stress test broke the model, it offered a revision. I asked for a backtest to recalibrate. I ruminated that the Strait of Hormuz situation had no precedent, so we needed a real-time Bayesian updating approach with 60 days of data, and a sophisticated causal chain from presidential social posts to Iran&#8217;s reaction to WTI price to gas prices to consumer sentiment to poll implications. Claude built it in a flash. 
When I introduced a devil&#8217;s advocate challenge about prediction markets, Claude marshalled a defense rooted in the academic literature on market bias and then conceded, accurately, the three cases where markets genuinely dominate.</p><p>That isn&#8217;t a tool. That&#8217;s a colleague.</p><p>I want to describe what we built, because the model itself is interesting. But I want you to hold in mind what the process looked like, because this matters more.</p><div><hr></div><h2>Why polls and prediction markets fail</h2><p>Before we get to what we built, let me outline the problems with conventional instruments.</p><p><strong>Polls are systematically biased, and we know it.</strong> The American Association of Public Opinion Research has documented, in three separate post-election studies, that Republican and independent voters are less likely to respond to surveys than Democrats, and the ones who do respond are less likely to support Trump than the ones who refuse. Across 2016, 2020, and 2024, national polls underestimated Trump&#8217;s support by an average of 2.3 percentage points. This isn&#8217;t noise. 
It&#8217;s a structural error that pollsters have tried, and largely failed, to correct.</p><p><strong>Prediction markets aggregate beliefs, not truth.</strong> I want to be fair to markets, because they have genuinely beaten polls in recent cycles. But markets have documented biases of their own. The favorite-longshot bias means unlikely outcomes are systematically overpriced. Concentration dynamics mean a single trader with thirty million dollars can move the price several points, which happened on Polymarket in October 2024. And markets are reactive. They price information after it arrives. They cannot tell you in advance which three variables would reshape a race.</p><p><strong>Historical analogs fail in novel regimes.</strong> This is the deepest problem. The 2026 midterm is the first modern cycle to feature an active war, a closed strategic waterway, wartime stagflation, and a second-term president whose polling error pattern is both famous and partially attenuated (because he&#8217;s not on the ballot). There is no 1994 here. There is no 2006. The structural features of this moment are new enough that reaching for precedent is not rigor. It is laziness dressed as rigor.</p><p>The question I brought to Claude was whether we could build something that corrected for each of these failures explicitly. Not a better poll aggregator. Not a more sophisticated market-watcher. A meta-model that consumed the conventional instruments, acknowledged their biases, and added the one thing they cannot add: <em>live causal structure</em>.</p><div><hr></div><h2>Can AI build a meta-model?</h2><p>That was the sharper version of the question, and the one that made my chai sit half-drunk.</p><p>A meta-model must synthesize signals that disagree with each other. It has to weight them by their actual predictive power, not their cultural prominence. It has to know which inputs are collinear and therefore shouldn&#8217;t be double-counted. 
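</p><p>One standard way to strip that overlap is to regress the secondary signal on the primary one and keep only the residual. Here is a minimal sketch of that residualization; the OLS approach is the general technique, and every number below is made up for illustration, not a POLARIS input:</p>

```python
import numpy as np

# Synthetic daily readings (illustrative only): a generic-ballot margin
# and an approval series that is deliberately collinear with it.
rng = np.random.default_rng(0)
generic_ballot = rng.normal(5.0, 1.0, 60)                    # D-margin, points
approval = -0.6 * generic_ballot + rng.normal(40, 0.5, 60)   # collinear signal

# Fit approval = a + b * generic_ballot, then keep only the residual:
# the part of approval not already explained by the ballot.
b, a = np.polyfit(generic_ballot, approval, 1)
approval_residual = approval - (a + b * generic_ballot)

# OLS residuals are orthogonal to the regressor by construction, so the
# residual can be weighted without re-counting shared variance.
corr = np.corrcoef(generic_ballot, approval_residual)[0, 1]
print(f"corr(ballot, approval residual) = {corr:.1e}")  # near zero
```

<p>Weighting the residual rather than the raw signal is what prevents the same underlying variance from entering the model twice.</p><p>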
It has to apply documented corrections to known biases. And it has to do all of this while remaining falsifiable, meaning every assumption has to be visible and revisable.</p><p>Claude and I spent the first fifteen minutes of the session on exactly this question. Not on the 2026 numbers. On the architecture. What are the true independent signal classes? What is each one&#8217;s historical predictive power? Where do they overlap, and how do we strip the overlap without losing information? What corrections do we apply to each, and why?</p><p>The answer we arrived at, which I named POLARIS-26, has four architectural layers. Let me walk through each, because the layering is where the sophistication lives.</p><h3>Layer 1: Five orthogonalized signal pillars</h3><p>The core insight is that generic ballot, presidential approval, and the economy are not independent. Approval absorbs the economy. Generic ballot absorbs both. Naive weighting double-counts roughly thirty-five percent of the underlying information. POLARIS treats generic ballot as the primary signal, weighted at forty percent, and weights approval and economy on their <em>residuals only</em>, meaning the portion of variation not already explained by generic ballot movement. This is the same logic Nate Silver uses. It matters more than most aggregators admit.</p><p>The final weights, after orthogonalization, are: generic ballot 40%, approval residual 15%, economic residual 10%, prediction markets 20%, geopolitical shock index 10%, fundraising momentum 5%. Every weight had to defend itself against a historical backtest.</p><h3>Layer 2: The Polling Integrity Adjustment</h3><p>This is the filter that corrects for the documented Trump-voter undercount. The critical calibration question is how much correction to apply. Too little and you miss 2024-style systematic error. Too much and you over-correct for a midterm where Trump isn&#8217;t on the ballot.</p><p>The literature gives us an anchor. 
The average Trump-era presidential polling bias is 2.3 points. The 2022 midterm bias, with Trump off the ballot, was closer to 1.0 point. POLARIS splits the difference with a 1.5-point Republican shift, carried as a distribution with an uncertainty band of 1.0 to 2.0. That uncertainty band is itself consequential. If the true midterm bias turns out to be 2.5 points, POLARIS&#8217;s Senate point estimate shifts from D 49 to D 48 and the probability of Democratic Senate control drops from 22% to 14%. The model expresses this uncertainty honestly rather than pretending it doesn&#8217;t exist.</p><h3>Layer 3: The Swing Seat Gate</h3><p>Most forecasters waste compute on races that aren&#8217;t races. A district with an 85% incumbent retention probability is not informative. POLARIS excludes every seat with baseline win probability above 80% or below 20% and runs full simulation only on the genuinely competitive seats: roughly forty-five House districts and ten Senate races. This is how you focus compute where it actually matters. The Senate gate currently includes Maine, North Carolina, the Ohio special, Iowa, Alaska, Georgia, Michigan, Minnesota, New Hampshire, and Kentucky. Every other seat is locked. Every gated seat gets the full probability treatment.</p><h3>Layer 4: NovaWatch, the live candidate feed</h3><p>The final architectural element is a live monitoring layer for candidate-specific events: indictments, sexual misconduct revelations, retirements, primary upsets, viral gaffes, fundraising shocks, district-specific economic shocks. Each category has a calibrated race-level impact range and a decay function. A Roy Moore&#8211;scale scandal moves a race by 8 to 18 points with a 45-day half-life. A retirement announcement moves it by 3 to 8 points, permanently. 
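</p><p>As a sketch of the decay mechanics: the functional form below (exponential decay parameterized by a half-life) is my assumption about how such a decay function might look; only the impact sizes and the 45-day half-life figure come from the text.</p>

```python
def event_impact(initial_points, half_life_days, days_elapsed):
    """Decayed race-level impact of a candidate event (illustrative sketch).

    A permanent shock, like a retirement announcement, can be modeled
    with half_life_days = float('inf'), so the impact never decays.
    """
    return initial_points * 0.5 ** (days_elapsed / half_life_days)

# A scandal at the top of its range: 18 points with a 45-day half-life.
print(event_impact(18, 45, 0))    # full impact on day zero
print(event_impact(18, 45, 45))   # half the impact after one half-life
print(event_impact(18, 45, 90))   # a quarter after two
```

<p>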
These events feed directly into race-level probabilities, bypassing the national environment index, because candidate events rarely move the national wave but routinely flip individual races.</p><p>Together, those four layers are the static architecture. But the most interesting thing we built that morning was dynamic.</p><div><hr></div><h2>The Hormuz Cascade: why causal beats analog</h2><p>The Iran situation is the swing factor in this cycle, and it is the hardest thing to model because it has no precedent.</p><p>When I asked Claude how we should handle it, the first instinct was to reach for analogs. Iraq 2006. The Gulf of Tonkin. The 1973 oil embargo. Each has some structural feature in common with 2026, but none shares enough to be useful. The 1973 embargo wasn&#8217;t wartime. Iraq 2006 was a four-year-old war, already metabolized by voters. Hormuz has been closed for sixty days in a month when gas prices are setting records. We are in new territory.</p><p>So we did something I find genuinely interesting. Rather than force the present into the shape of the past, we built a <em>live causal model</em> that could estimate its own coefficients in real time.</p><p>The Hormuz Cascade is a six-link chain:</p><p><strong>Iran events &#8594; oil prices &#8594; gasoline prices &#8594; consumer sentiment &#8594; presidential approval &#8594; generic ballot &#8594; seat outcomes.</strong></p><p>Each link has a coefficient. The trick is that we populate those coefficients not from historical data but from the sixty days of observations we now have since the war began. We have seen major escalations move Brent by $3 to $8 intraday. We have seen ceasefire signals move it down by $5 to $12. We have seen gas prices pass through oil at a rate of $0.028 to $0.035 per gallon per dollar of Brent, which is fifteen to twenty percent higher than the historical baseline, probably because Gulf infrastructure damage has constrained refining capacity. 
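</p><p>Stringing the links together is just multiplication of link coefficients. In the sketch below, only the Brent shock and the gas pass-through use ranges reported above; the downstream coefficients are placeholders I invented to show the shape of the calculation, not POLARIS&#8217;s fitted values.</p>

```python
# Propagate a single escalation shock down the cascade.
brent_shock = 8.0        # $/bbl: a major escalation, top of the observed range
gas_passthrough = 0.030  # $/gal per $/bbl of Brent (observed 0.028-0.035)
sent_per_gas = -12.0     # sentiment points per $/gal of gas (placeholder)
appr_per_sent = 0.3      # approval points per sentiment point (placeholder)
ballot_per_appr = 0.5    # generic-ballot points per approval point (placeholder)

gas_delta = brent_shock * gas_passthrough         # dollars per gallon
sentiment_delta = gas_delta * sent_per_gas        # sentiment index points
approval_delta = sentiment_delta * appr_per_sent  # approval percentage points
ballot_delta = approval_delta * ballot_per_appr   # generic-ballot points
print(f"gas +${gas_delta:.2f}/gal -> generic ballot {ballot_delta:+.2f} pts")
```

<p>Refitting each coefficient weekly against the incoming observations is what makes the chain a live model rather than a fixed assumption.</p><p>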
We have seen Michigan sentiment drop six points for every fifty cents of sustained gas price increase, which is fifty percent steeper than the historical pattern.</p><p>These are not assumed numbers. These are observed numbers, fit to sixty days of post-war data and updated weekly as new observations arrive. The causal chain tightens as the data accumulates. Bayesian updating in real time.</p><p>And because we built it causally rather than analogically, we can do something genuinely useful: we can specify tripwires.</p><p>A tripwire is a pre-specified threshold that forces an automatic model re-run because it represents a potential change in the underlying structure, not a marginal move. POLARIS has three:</p><ol><li><p>Brent above $110 sustained for ten trading days</p></li><li><p>Cumulative Hormuz disruption exceeding 90 days</p></li><li><p>A single-event US casualty count above 30</p></li></ol><p>Each tripwire has a pre-calculated impact vector. We don&#8217;t wait to see what analysts say. The model updates automatically when the trigger fires. This is the discipline causal modeling enforces that analog-based forecasting cannot.</p><div><hr></div><h2>The stress tests that almost broke the model</h2><p>Building the architecture took about twenty minutes. The next ten were the ones that made me respect the collaboration.</p><p>Claude proposed six stress tests, unprompted. Three of them broke the initial model.</p><p>The first was a collinearity check. Generic ballot, approval, and the economy were all weighted at 25% or higher, which double-counted their shared variance. The fix was orthogonalization, which I described above.</p><p>The second was a backtest against 2010, 2018, and 2022. The model performed well on 2010 and 2018, where the generic ballot was running above five points. It performed poorly on 2022, where the generic ballot was close to zero. 
This surfaced an important truth: the generic ballot is most predictive when it&#8217;s clearly above or below the noise band, and less reliable inside it. Today&#8217;s D+5.6 reading is comfortably outside the noise band, which gives us some confidence. But we flagged this as a condition to monitor.</p><p>The third was a sensitivity analysis on the Polling Integrity Adjustment. At a PIA of 0.5, the model predicted a Democratic sweep. At 2.5, it predicted a Republican hold of both chambers. The model is more sensitive to that single parameter than to any other. The fix was to carry the PIA as a distribution rather than a point estimate, and to be transparent about how much the prediction depends on it.</p><p>By the end of the stress-test round, the model looked different from the first draft. That is what testing is supposed to do.</p><div><hr></div><h2>The prediction that came out the other side</h2><p>After all of this, with the environment orthogonalized and the corrections applied and the causal cascade populated, POLARIS-26 produced its first run. Here is what it says as of April 20, 2026:</p><p>The House flips to the Democrats. Point estimate: D 225, R 210. Eighty percent confidence interval: D 217 to 233. Democratic control probability: 73%.</p><p>The Senate holds for the Republicans, narrowly. Point estimate: R 51, D 49. Democratic control probability: 22%. Probability of a 50-50 tie: 19%.</p><p>The most likely joint outcome is divided government, at 51% probability. Democratic sweep at 22%. Republican status quo at 23%. All other scenarios at 4%.</p><p>For context, Polymarket is pricing a Democratic sweep at 51.5% today. POLARIS is two points lower than the market on the House and eight points lower on the Senate. That divergence is the model&#8217;s contribution. 
It comes almost entirely from the PIA correction, which markets structurally cannot apply because they don&#8217;t decompose polling error by cycle type.</p><p>If the Senate result lands inside POLARIS&#8217;s D 46 to 52 range on November 3, the model will have earned its keep. If not, markets were right and I owe a public revision.</p><div><hr></div><h2>The dashboard, and the 45-minute miracle</h2><p>Here is the part I&#8217;m still struggling to process.</p><p><strong>7:00 AM</strong> &#8212; I ask the question. We sketch the architecture on the fly.</p><p><strong>7:15 AM</strong> &#8212; Signal pillars defined, weights derived, orthogonalization applied.</p><p><strong>7:20 AM</strong> &#8212; Polling Integrity Adjustment calibrated against 2016-2024 literature.</p><p><strong>7:25 AM</strong> &#8212; Hormuz Cascade structured as a six-link causal chain with live coefficients.</p><p><strong>7:30 AM</strong> &#8212; Stress tests run. Three revisions made.</p><p><strong>7:35 AM</strong> &#8212; First prediction produced with point estimates, confidence intervals, and joint probability table.</p><p><strong>7:40 AM</strong> &#8212; Committed the methodology to a reusable skill, so future runs follow the exact same protocol.</p><p><strong>7:45 AM</strong> &#8212; Interactive React dashboard built. Sliders for every signal. Live Brent cascade. Scenario buttons. Tripwire indicators. 
Fonts chosen, colors set, responsive layout tested.</p><p><strong>8:30 AM </strong>&#8212; This article was written and posted!</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gEIj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gEIj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 424w, https://substackcdn.com/image/fetch/$s_!gEIj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 848w, https://substackcdn.com/image/fetch/$s_!gEIj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 1272w, https://substackcdn.com/image/fetch/$s_!gEIj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gEIj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png" width="1080" height="1420" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1420,&quot;width&quot;:1080,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:140961,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/194792418?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gEIj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 424w, https://substackcdn.com/image/fetch/$s_!gEIj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 848w, https://substackcdn.com/image/fetch/$s_!gEIj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 1272w, https://substackcdn.com/image/fetch/$s_!gEIj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61f2878a-55d1-48b5-8413-77ce8ef72d98_1080x1420.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>A working midterm forecasting model. Architecturally sophisticated, methodologically honest, empirically testable, with its own interactive dashboard. One hour from idea to publication.</p><p>I want to be careful not to overclaim. This is a model, not an oracle. It could be wrong. I&#8217;ve been explicit about where it could be wrong and how I&#8217;d know. But the fact that we went from question to falsifiable production artifact in an hour is, I think, genuinely a new thing in the world.</p><p>I&#8217;ll tell you what I find most moving about the experience. Not the speed, although the speed is astonishing. Not the output, although the output is good. What I find moving is the quality of the collaboration. Claude didn&#8217;t just execute my instructions. It proposed architecture I wouldn&#8217;t have thought of. 
It caught errors I would have missed. It pushed back on weak reasoning. It held the thread across ninety minutes of technical conversation without losing the narrative. It committed the methodology to a reusable skill so I can run this model again next month with a single command.</p><p>That is not tool use. That is colleague-scale partnership, available to anyone with a laptop and a question.</p><div><hr></div><h2>The interactive artifact</h2><p>By tomorrow (and another cuppa tea), I will build an interactive dashboard so you can play with POLARIS-26 live. You will be able to:</p><ul><li><p>Move any signal slider and watch the seat counts update in real time.</p></li><li><p>Toggle the Polling Integrity Adjustment on or off to see how much it moves the Senate probability. (Spoiler: a lot.)</p></li><li><p>Push the Brent oil slider past $110 and watch the Hormuz tripwire fire red.</p></li><li><p>Click any of the four preset scenarios (Baseline, Democratic Wave, Republican Recovery, Hormuz Break) and see the model jump to that world.</p></li><li><p>Watch the joint probability matrix recalculate.</p></li></ul><p>The code is open. The math is visible. Every assumption is revisable.</p><p>This is what I want more of in the world. Models that are interactive rather than opaque. Methodologies that are visible rather than hidden. Collaborations with AI that feel like colleagues rather than tools.</p><div><hr></div><h2>Reflections on what just happened</h2><p>I started this Monday morning with a question and a cooling cup of tea. By 7:45 AM I had an architecturally sophisticated, empirically testable, interactively explorable political forecasting model, a committed reusable skill for future runs, a static dashboard, and a clear set of falsifiable predictions with a November resolution date.</p><p>This is not normal productivity. It is a new kind of intellectual leverage. 
A curious mind plus a capable AI collaborator, asking good questions together, iterating in real time, stress-testing each other&#8217;s thinking, and producing something that neither could have produced alone at anything close to that speed.</p><p>My seventh decade is going to be more interesting than I expected.</p><p>Gotta love Claude.</p><div><hr></div><p><em>Mohanbir Sawhney is the McCormick Foundation Professor of Technology at the Kellogg School of Management. He writes about AI, strategy, and the interior life of modern work</em></p>]]></content:encoded></item><item><title><![CDATA[Are You Prepared to Manage Your Digital Employees?]]></title><description><![CDATA[An HR blueprint for your Hybrid Human-AI Workforce]]></description><link>https://www.hiddenweave.com/p/are-you-prepared-to-manage-your-digital</link><guid isPermaLink="false">https://www.hiddenweave.com/p/are-you-prepared-to-manage-your-digital</guid><dc:creator><![CDATA[Mohan Sawhney]]></dc:creator><pubDate>Mon, 23 Feb 2026 10:46:24 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!KQ6W!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KQ6W!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KQ6W!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 424w, 
https://substackcdn.com/image/fetch/$s_!KQ6W!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 848w, https://substackcdn.com/image/fetch/$s_!KQ6W!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 1272w, https://substackcdn.com/image/fetch/$s_!KQ6W!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KQ6W!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png" width="1456" height="977" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:977,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:8223280,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/188854419?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!KQ6W!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 424w, https://substackcdn.com/image/fetch/$s_!KQ6W!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 848w, https://substackcdn.com/image/fetch/$s_!KQ6W!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 1272w, https://substackcdn.com/image/fetch/$s_!KQ6W!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17968bd6-34d3-4b9b-bbdc-be116ec1b919_2528x1696.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Your company has an HR department. It recruits talent, onboards new hires, manages performance, handles compensation, ensures compliance, and plans for the future of the workforce. No serious company would operate without it.</p><p>Now ask yourself: <em>who is doing any of this for your AI agents?</em></p><p>Large enterprises are deploying agentic AI across every function, from marketing to finance to customer service. These agents read documents, draft responses, make recommendations, flag risks, and interact with customers. Some operate around the clock. Some make decisions that affect revenue. Some touch sensitive data.</p><p>But these digital workers have no onboarding process, no performance reviews, no governance structure, no clear ownership, and no one tracking whether they&#8217;re actually delivering value. They are, in effect, <strong>feral employees</strong>: hired enthusiastically, deployed tactically, and managed by no one.</p><p>Your human workforce has an entire management discipline behind it. Your digital workforce needs one too. I call it <strong>Digital Labor Orchestration (DLO)</strong>, and this article is its blueprint.</p><h2>The Parallel That Changes Everything</h2><p>The simplest way to understand DLO is through a mirror. Hold up your existing HR function and ask: what would this look like for AI agents?</p><p><strong>Hiring employees</strong> becomes <strong>sourcing and selecting digital agents</strong>. Do you build your own? Buy from Salesforce or Microsoft? Outsource to a service provider? 
The decision depends on the same factors as human talent acquisition: strategic importance, IP sensitivity, and cost.</p><p><strong>Onboarding and training</strong> becomes <strong>configuring and fine-tuning agents</strong>. Just as a new hire needs context, tools, and expectations, an AI agent needs prompt engineering, access permissions, guardrails, and escalation rules. Skip this step and you get the AI equivalent of an employee who was never told what their job actually is.</p><p><strong>Performance management</strong> becomes <strong>monitoring agent effectiveness</strong>. What are the accuracy rates? Completion rates? How often does the agent escalate versus resolve? Are its outputs drifting over time? You wouldn&#8217;t tolerate a human employee who kept making expensive mistakes and botching their work. Why tolerate agents that hallucinate or go off the rails?</p><p><strong>Compensation and costing</strong> becomes <strong>understanding the true economics of digital labor</strong>. The sticker price of an API call is not the cost of digital labor, just as a salary is not the total cost of employing a human. You need to account for compute, licensing, monitoring, retraining, exception handling, and human oversight. I call this the <strong>Total Cost of Labor Ownership (TCLO)</strong>: the sum of human labor cost, digital labor cost, and orchestration cost.</p><p><strong>Compliance and ethics</strong> becomes <strong>guardrails, auditability, and explainability</strong>. When an AI agent makes a loan decision, who&#8217;s liable? When it generates customer-facing content, who reviews it? When it fails silently, who notices?</p><p>This isn&#8217;t a metaphor. It&#8217;s an operating model.</p><h2>From Jobs to Flows</h2><p>Here&#8217;s where DLO parts company with traditional workforce thinking. 
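</p><p>Because everything in this blueprint ultimately gets priced through the TCLO sum defined above, it is worth making that sum concrete before decomposing the work itself. A minimal sketch with hypothetical figures; none of these numbers come from a real engagement:</p>

```python
from dataclasses import dataclass

@dataclass
class WorkflowCosts:
    """Annual TCLO components for one workflow; all fields in USD."""
    human_labor: float    # salaries and benefits for the tasks humans keep
    digital_labor: float  # compute, API usage, licensing, retraining
    orchestration: float  # monitoring, exception handling, human oversight

    @property
    def tclo(self):
        # Total Cost of Labor Ownership: the sum of all three components.
        return self.human_labor + self.digital_labor + self.orchestration

# Hypothetical before/after comparison for a single workflow.
before = WorkflowCosts(human_labor=900_000, digital_labor=0, orchestration=0)
after = WorkflowCosts(human_labor=500_000, digital_labor=120_000, orchestration=80_000)
print(f"TCLO after: ${after.tclo:,.0f}, savings: ${before.tclo - after.tclo:,.0f}")
```

<p>The point of the comparison is that digital labor only looks cheap next to the displaced human labor once the orchestration cost is counted against it, which is exactly the line item most deployments forget to budget.</p><p>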
In the human world, work is organized in terms of <em>roles and titles</em>: &#8220;marketing manager,&#8221; &#8220;loan officer,&#8221; &#8220;customer service representative.&#8221; In the DLO world, the organizing principle is <em>workflows</em>, broken down into tasks and micro-tasks.</p><p>The first step in any DLO initiative is what I call a <strong>Workforce X-Ray</strong>: a deep diagnostic that decomposes roles into the actual tasks people perform, then evaluates each task for its potential to be handled by a digital agent. A European bank that performed this exercise on its loan origination function discovered over 120 micro-tasks, from intake and document review to fraud checks and credit decisioning. Some tasks were obvious candidates for automation. Others required human judgment. Most fell somewhere in between.</p><p>This decomposition is where the real insight lives. A &#8220;loan officer&#8221; is not one job. It&#8217;s dozens of micro-tasks stitched together by habit and job description. Some of those tasks are ripe for AI. Others are deeply human. The art is in the balancing.</p><h2>The Four A&#8217;s: A Volume Dial for Autonomy</h2><p>Once you&#8217;ve identified which tasks can be handled by digital agents, the next question is: <strong>how much autonomy should an agent have?</strong> This is not a binary choice between &#8220;automated&#8221; and &#8220;not automated.&#8221; It&#8217;s a dial with four settings.</p><ul><li><p><strong>Assist.</strong> The agent enhances human productivity through insights, summarization, or suggestions. A marketing copilot recommends headlines. A research agent summarizes competitive intelligence. The human decides. The agent informs.</p></li><li><p><strong>Approve.</strong> The agent performs the work, but a human must approve before anything executes. A contract analysis tool identifies risk clauses, but legal counsel signs off. The agent proposes. 
The human ratifies.</p></li><li><p><strong>Audit.</strong> The agent operates autonomously, but its decisions are reviewed after the fact. A dynamic pricing agent adjusts rates in real time. A human audits a sample weekly. The agent acts. The human verifies.</p></li><li><p><strong>Autopilot.</strong> The agent owns the workflow end to end without human intervention. An inventory replenishment system orders stock based on real-time demand signals. The agent decides. The human has moved on to higher-order work.</p></li></ul><p>The instinct of most executives is to jump straight to Autopilot. Resist it. The right autonomy level depends on the risk profile of the workflow, the maturity of the agent, and the trust your organization has built through experience. A billing dispute agent at a logistics company might start at Audit (handling tier-1 cases independently, with 10% weekly human review) and graduate to Autopilot in low-risk regions only after three months of demonstrated performance. High-risk markets might stay at Audit permanently.</p><p>The Four A&#8217;s give you a common language for a conversation that every enterprise is having in fragmented, inconsistent ways.</p><h2>The Agentization Map: Where to Start</h2><p>Not every workflow deserves digital labor, and not every promising workflow deserves the same level of investment. To prioritize, plot your workflows on a 2&#215;2 matrix (See the diagram below):</p><p>&#8226; <strong>Y-axis:</strong> Agentization Potential (how well-suited is this workflow for AI agents?)</p><p>&#8226; <strong>X-axis:</strong> Strategic Importance or Volume</p><p>This produces four zones:</p><ul><li><p><strong>Accelerate</strong> (high value, high potential): Fast-track these for digital labor redesign. This is where your biggest ROI lives.</p></li><li><p><strong>Activate</strong> (low value, high potential): Perfect test beds. Low risk, high learning. 
Use these to build organizational capability before tackling the high-stakes workflows.</p></li><li><p><strong>Augment</strong> (high value, low potential): Deploy copilots and human-in-the-loop systems. The work is too complex or too risky for full automation, but AI can make humans significantly more effective.</p></li><li><p><strong>Avoid</strong> (low value, low potential): Don&#8217;t waste resources here. Focus elsewhere.</p></li></ul><p>This map turns a sprawling, overwhelming question (&#8220;where do we start with AI agents?&#8221;) into a portfolio decision with clear priorities.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6UBc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6UBc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 424w, https://substackcdn.com/image/fetch/$s_!6UBc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 848w, https://substackcdn.com/image/fetch/$s_!6UBc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 1272w, https://substackcdn.com/image/fetch/$s_!6UBc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 1456w" sizes="100vw"><img
src="https://substackcdn.com/image/fetch/$s_!6UBc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png" width="1456" height="1398" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1398,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:170857,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/188854419?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!6UBc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 424w, https://substackcdn.com/image/fetch/$s_!6UBc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 848w, https://substackcdn.com/image/fetch/$s_!6UBc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 1272w, https://substackcdn.com/image/fetch/$s_!6UBc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3d832bb-cf12-49e8-a571-9a56be4ae3f8_1956x1878.png 1456w" 
sizes="100vw" loading="lazy"></picture></div></a></figure></div><h2>Building the Digital Labor Office</h2><p>If DLO is the discipline, the <strong>Digital Labor Office</strong> is the institutional home. Think of it as the organizational equivalent of HR, but for your AI workforce.
It&#8217;s headed by a senior executive (call it Head of Digital Labor) and staffed with:</p><ul><li><p><strong>Workflow Architects</strong> who map processes and redesign them for human-AI collaboration</p></li><li><p><strong>Agent Designers</strong> who configure, prompt, and embed guardrails into digital workers</p></li><li><p><strong>Governance and Risk Officers</strong> who ensure compliance, auditability, and appropriate autonomy levels</p></li><li><p><strong>Human-AI Partnership Managers</strong> who handle change management and ensure human workers embrace (rather than fear) their digital colleagues</p></li><li><p><strong>Agent Product Managers</strong> who treat each digital agent as a product with a backlog, performance metrics, and lifecycle</p></li></ul><p>This team works in deep partnership with HR. Together, they co-own workforce planning (how many humans, how many agents?), role redefinition (what does a &#8220;loan officer&#8221; do when AI handles 60% of the micro-tasks?), and cultural transformation (how do you move from &#8220;AI is taking my job&#8221; to &#8220;AI is making my job more interesting&#8221;?).</p><p>I recommend a <strong>hub-and-spoke</strong> operating model: a central DLO team sets strategy, tools, and policies, while embedded DLO Champions in each business unit adapt and execute locally. This mirrors the HR Business Partner model that is common in human workforce management.</p><h2>Measuring What Matters</h2><p>A common mistake in digital labor is measuring only cost savings. Yes, a bank that deploys AI agents in loan origination can cut processing costs by 60% and improve speed-to-decision by 40%. Those numbers matter. But they&#8217;re the floor, not the ceiling.</p><p>DLO measurement should span three dimensions:</p><ul><li><p><strong>Productivity.</strong> Not just human productivity, but <em>blended</em> productivity. 
I propose a metric called <strong>Blended Workforce Productivity (BWP)</strong>: total output divided by the sum of human labor input (in FTEs) plus digital labor input (in ATEs, or Agent-Time Equivalents). This puts humans and agents on the same scorecard.</p></li><li><p><strong>Cost efficiency.</strong> The TCLO model described earlier, tracking not just compute and licensing but orchestration costs: governance, exception handling, supervision, and change management. </p></li><li><p><strong>Strategic impact.</strong> Speed to market, customer experience uplift, innovation enablement, operational resilience. If a bank can cut loan approval time from 4 days to 40 minutes, it won&#8217;t just save money. It will increase application volumes because of the improved customer experience.</p></li></ul><h2>The Journey Ahead</h2><p>Most enterprises today deploy AI agents sporadically, with no unified strategy for sourcing, governing, or measuring their digital workforce. Digital Labor Orchestration is a journey. To map this journey, I have created a DLO Capability Maturity Model with five stages of maturity, from Ad Hoc deployments with no formal structure to Institutionalized operations where digital labor is embedded in the company's operating model, talent strategy, and performance management systems. 
See the Table below for the full maturity model with dimensions, capabilities, and diagnostic indicators at each stage.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!euoZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!euoZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 424w, https://substackcdn.com/image/fetch/$s_!euoZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 848w, https://substackcdn.com/image/fetch/$s_!euoZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 1272w, https://substackcdn.com/image/fetch/$s_!euoZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!euoZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png" width="1456" height="1102" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1102,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:6801274,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/188854419?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!euoZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 424w, https://substackcdn.com/image/fetch/$s_!euoZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 848w, https://substackcdn.com/image/fetch/$s_!euoZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 1272w, https://substackcdn.com/image/fetch/$s_!euoZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab1fb443-8089-4183-97ed-306bf808918a_2368x1792.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Digital Labor Orchestration is not a technology initiative. It is a management discipline. The companies that built great HR functions gained a durable advantage in the human capital era. The companies that build great DLO capabilities will gain the equivalent advantage in the age of AI.</p><p>Your AI agents are already working.
Now you need to figure out how to manage them as employees.</p>]]></content:encoded></item><item><title><![CDATA[Two Moves]]></title><description><![CDATA[How knowledge workers can remain irreplaceable in the age of AI]]></description><link>https://www.hiddenweave.com/p/two-moves</link><guid isPermaLink="false">https://www.hiddenweave.com/p/two-moves</guid><dc:creator><![CDATA[Mohan Sawhney]]></dc:creator><pubDate>Mon, 23 Feb 2026 10:14:53 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!iyVf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!iyVf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!iyVf!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 424w, https://substackcdn.com/image/fetch/$s_!iyVf!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 848w, https://substackcdn.com/image/fetch/$s_!iyVf!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 1272w, 
https://substackcdn.com/image/fetch/$s_!iyVf!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!iyVf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png" width="1456" height="977" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:977,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:9133738,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.hiddenweave.com/i/188778546?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!iyVf!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 424w, https://substackcdn.com/image/fetch/$s_!iyVf!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 848w, 
https://substackcdn.com/image/fetch/$s_!iyVf!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 1272w, https://substackcdn.com/image/fetch/$s_!iyVf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fada865a0-e4a5-4b72-a453-a4ffc8ce7ed2_2528x1696.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p><em><strong>This is the full article behind the &#8220;method&#8221; discussion in my launch
post.</strong></em></p><p>Professional value in the age of AI is under pressure from two directions. The unit in which knowledge work is priced. And whether its outcomes are inseparable from the person who produces them. Most knowledge workers are only paying attention to one.</p><p>AI has not merely automated professional tasks. It has forced two questions that knowledge workers prefer not to ask themselves. The first is economic: when AI can do what you charge for, charging for doing it stops making sense. You can still bill by the hour. But AI has stripped away the opacity that made input pricing viable. Hours are an input metric. Outcomes are an output metric. When a task that took a junior associate three hundred hours takes AI thirty minutes, the gap between what clients were charged and what the work costs is no longer hidden.</p><p>The second question is existential: does the value you create exist because of you, or merely through you? If a client, a patient, or an organization could attribute what they received to a process, a platform, or a credential rather than to a specific individual, you are replaceable. If the outcome and the person can be separated, the outcome can be sourced elsewhere.</p><p>The professionals who will thrive in the age of AI will have made two moves. The first: <em>decoupling output from input</em>, so that intellectual capital scales beyond the hours in a day. The second: <em>coupling outcome tightly with identity</em>, so that what they produce cannot be attributed to anyone or anything else. The first move is about production economics. The second is about personal irreplaceability. Most of the literature on AI and professional work has described the first. Almost no one has described the second. Both are now mandatory.</p><p>My father understood the first move before AI existed. He flew bombers for the Indian Air Force before he became an entrepreneur. He had a precise way of cutting through complexity. 
He gave me a piece of advice that stayed with me: <em>don&#8217;t measure your costs and revenues in the same units</em>. He was speaking as a small business owner trying to protect his margins in the fabrication business, where he bought steel in kilos and sold window frames by the piece. After forty years, I appreciate how much insight his advice carried.</p><h3>What AI Threatens and What It Does Not</h3><p>The vulnerability of a professional to AI is determined by three structural characteristics of how their value is created and attributed.</p><p>The first is <strong>input transparency</strong>. When your process is observable and reproducible, AI can substitute for it directly. Document review, financial modeling, diagnostic imaging, market research, first-draft content production: these are processes that can be specified, observed, and replicated.</p><p>The second is <strong>output separability</strong>. When the artifact you produce can be detached from the person who produced it without loss of value, you have a separability problem. The contract, the report, the diagnosis, the slide deck: if the client values the deliverable and not the person who made it, they do not need you specifically. They need a capable producer. Increasingly, AI is that producer.</p><p>The third is <strong>outcome attribution</strong>. When the result the client cares about is attributed to a process, a firm, a platform, or a credential rather than to a specific individual, the individual has no moat. The client was never buying you. They were buying the result.</p><p>The most exposed professionals score high on all three. 
The most protected score low on all three: opaque inputs built through experience that cannot be observed or reproduced by instruction, inseparable outputs whose value is irreducibly bundled with the person who produced them, and outcomes attributed personally and completely to a specific individual.</p><p>This is what the pricing literature missed, because it was concerned with fee structures, not with the more fundamental question of whether the value survives the removal of the person who created it. Outcome-based pricing changes how you bill. It does not change whether you are replaceable.</p><h3>The Lawyer Who Gets You Four Billion Dollars</h3><p>David Boies bills by the hour. But the hourly rate is a billing convention. What his clients are paying for is something else entirely. When he represented the government in its antitrust case against Microsoft or advised investors in the aftermath of the Theranos collapse, the value being exchanged operated on an entirely different logic than billable hours. He is one of a handful of lawyers who can change the outcome of a case worth billions of dollars. His preparation, his deposition strategy, his cross-examination instinct are inputs, but they are not what the client is buying. The client is buying an outcome that would not exist without Boies in the courtroom. The input is opaque. The outcome is personally his. Both protective moves made.</p><p>The threat to lawyers is not at that level. It is at every level below it.</p><p>The Am Law 100 business model runs on leverage. Senior partners develop client relationships and originate business. Junior associates execute the work: document review, due diligence, contract drafting, research memos, discovery production. The spread between what associates bill and what they cost is the profit engine. 
That spread is collapsing, because the work is pattern recognition across large document sets, which is precisely what large language models do better than humans at a fraction of the cost.</p><p>When that justification disappears, so does the leverage model. Partners who own client relationships, whose outcomes are inseparable from their personal judgment, will capture more value than ever. Associates who expected to climb a partnership ladder by accumulating billable hours will find the ladder shortened from below.</p><p>The lawyers moving toward outcome-based pricing are on the right track. But they are only halfway there. Fixed fees, retainers, and contingency structures are the first move. The second move is making the attribution of the outcome irreducibly personal. Not the firm&#8217;s judgment. Not the practice group&#8217;s methodology. This person, this mind, this accumulation of experience that exists nowhere else.</p><h3>The Surgeon of Last Resort</h3><p>There is a small cohort of surgeons the ultra-wealthy call when a diagnosis has been delivered and they are not ready to accept it. They are the physicians who see the case three specialists missed, who will perform the procedure no one else at their institution will attempt. No patient asks how long the consultation took. No family asks about the hourly rate. The value is irreducible and the attribution is personal.</p><p>The disruption of medicine by AI is hurting the top of the cognitive hierarchy rather than the bottom, which surprises people until you understand the mechanism. Radiology. Pathology. Diagnostic dermatology. These specialties require exceptional training and years of pattern exposure. They are also, at their core, visual pattern recognition tasks, and AI has reached or exceeded human performance in several of them for specific conditions. 
The radiologist reading two hundred chest scans a day is performing high-skill cognitive labor whose process is entirely observable and whose output is completely separable from the individual who produced it. The report says &#8220;radiology department.&#8221; The patient rarely knows the reader&#8217;s name.</p><p>All three vulnerability conditions are present. AI steps into the gap, not because the radiologist lacks skill, but because the structure of the work makes both protective moves unavailable. The individual and the value were already disaggregated before AI arrived. AI simply made the disaggregation economically decisive.</p><p>The physicians who are protected are those whose value is irreducibly bundled with a specific person. The concierge physician who has known a family across three generations. The surgeon whose judgment and hands have performed this procedure ten thousand times and whose name is on the outcome in a way no departmental attribution can replicate. Their configuration is the mirror image of the radiologist&#8217;s, and so is their exposure.</p><h3>The McKinsey Senior Partner</h3><p>One of my former students is a senior partner at McKinsey. He does not write slides. In fact, he has not written slides in two decades. What he does is appear in the right boardroom at the right moment, when a CEO faces the kind of decision that ends careers or transforms companies. He draws on thirty years of wisdom to offer an invaluable and uniquely personal perspective. Then he leaves. The slides come from someone else.</p><p>His input is almost entirely invisible. His output is inseparable from him. Nobody in the room is thinking about the process. They want to know what this brilliant mind sees that they cannot.</p><p>The consulting model under pressure is not his. It is the pyramid surrounding him: the analysts and associates who ingest data, build financial models, populate templates, and produce deliverables that justify the engagement fee. 
AI does that work faster and more consistently at a cost approaching zero.</p><p>There is a structural vulnerability in consulting that does not exist in the same way in law. The consulting outcome is frequently attributed to the firm rather than to the individual. &#8220;McKinsey recommended this&#8221; is weaker personal attribution than &#8220;Boies argued this.&#8221; The brand intermediates between person and outcome, which dilutes the individual moat. The consultants building durable practices are those who have made their personal attribution explicit, who are known for a specific way of seeing problems that cannot be found elsewhere.</p><p>The industry is barbelling. At one end, firms with powerful institutional attribution will retain their position because the brand intermediates effectively between the work and the buyer. At the other end, boutiques built around a named individual with a distinctive point of view will also thrive. The vast middle is most exposed: firms and practitioners trading on neither strong institutional brand nor strong personal attribution, whose value proposition rests on competent execution of processes AI can now replicate.</p><p>For individuals inside strong institutions, the barbell creates a false sense of security. The institutional brand may protect the firm. It will not protect you if your outcomes cannot be attributed to you personally when the institution is removed from the equation.</p><h3>The Business Guru and the Forty-Year Draft</h3><p>I want to be transparent about something directly relevant to this argument.</p><p>This article went from first conversation to complete draft faster than I could have dictated it longhand. 
An AI model helped me structure the argument, pressure-test the logic, refine the exposition, and craft a logical narrative arc at a speed that was unthinkable three years ago.</p><p>And yet the framework at the center of this article (the three vulnerability conditions, the two protective moves, the distinction between the first move the pricing literature describes and the second move it ignores) is uniquely mine. It is rooted in thirty-five years of watching how knowledge markets work, teaching strategy to executives across six continents, building programs, making mistakes, and developing a distinctive point of view. AI accelerated the time from idea to execution. It did not generate the insight. Archimedes promised to move the world if given a lever and a place to stand. My decades of experience are the place to stand. AI is the lever that extends the reach.</p><p>When Picasso was challenged on the price he charged for a portrait completed in minutes, he replied: &#8220;It took me all my life.&#8221; That is the precise situation of any professional whose value lives in accumulated judgment rather than current effort. The output looks effortless. The input is a career. As I like to say: a ladder has rungs for a reason.</p><p>I have reached more than forty thousand executive education participants in six years through online programs, generating over a hundred million dollars in gross revenue. In the classroom, I teach at most sixty students per class, the same number I taught thirty years ago. The decoupling between my input and my output is complete. But the scaling is only protected because the outcome remains attributed to something that cannot be replicated by removing me from the equation. Executives seek this specific point of view, these specific frameworks, this particular voice. The content scales without limit. The attribution does not transfer to the platform, the course, or the institution. It stays with me. Both moves made.
Neither alone is sufficient.</p><h3>What the Pricing Literature Missed</h3><p>The case studies illustrate the logic. Here are the two moves stated precisely.</p><p><strong>The first move is decoupling.</strong> Break the link between your effort and your output so that your intellectual capital scales beyond the hours in your day. This is productization. Stop selling time. Build something that delivers value while you sleep. Don&#8217;t be lulled into a false sense of security by inertia. Clients do not defect the moment AI enters the picture. But inertia is a reprieve, not a defense.</p><p><strong>The second move is coupling.</strong> This is less intuitive and more important. Make the outcome of your work inseparable from your individual identity, so that a client, patient, or organization attributes the result to you specifically and not to a process, a firm, or a credential. Coupling is not about how you produce value. It is about whether the value is recognized as yours. It is the only durable answer to the question AI is now forcing every knowledge worker to confront: does the value exist because of you, or merely through you?</p><p>The two moves work in tandem. <em><strong>Decouple the production. Concentrate the attribution. Scale where you are going. Own the credit for arriving. </strong></em></p><p>My father&#8217;s aphorism reaches further than he intended. Don&#8217;t measure your costs and revenues in the same units. It is not advice about invoicing. It is a statement about the nature of value. What professional work costs and what it is worth have never been the same thing. The gap between them was sustainable because it was opaque. AI has made the gap transparent. And a visible gap cannot be sustained.</p><h3>Where This Leads</h3><p>The professionals who will define knowledge work in the next decade share a configuration. Their inputs are opaque, built through experience that cannot be observed or reproduced by instruction. 
Their outputs are inseparable from their individual identity. Their outcomes are personally owned.</p><p>The senior litigator whose cross-examination instinct was built across a thousand depositions. The surgeon whose judgment has been formed by ten thousand procedures. The strategy partner whose mastery over organizational dynamics came from sitting in a hundred boardrooms at the highest stakes. The educator whose frameworks were forged across forty years of research, teaching, and synthesis that no one else has done in quite this way.</p><p>What is happening to professional work is not replacement. It is disaggregation: the separation of expertise from the process that used to house it. The hours were never the value. The credential was never the moat. The process was never what the client was paying for. AI has not changed what professional value is. It has made it impossible to pretend otherwise.</p><p>The honest question every knowledge worker needs to answer is not whether AI can do their job. It is two questions, more uncomfortable and more precise. In what units is my value denominated? And is the outcome of my work inseparably mine?</p><p>If the value is in the hours, the repricing has already begun. If the outcome could be attributed to a process, a platform, a firm, or a credential rather than to a specific irreplaceable individual, the substitution is closer than it appears.</p><p>If your value lives in outcomes that only your specific accumulated judgment can deliver, and if those outcomes are irreversibly coupled to your identity, you are not competing with AI. You are using it.</p><p>My father gave me a pricing principle for his small business in India. 
As he smiles down at me from heaven, I salute him for his timeless insight.</p>]]></content:encoded></item></channel></rss>