Two of the largest executive surveys published in the past year converge on the same finding, and it isn’t encouraging for companies planning their next AI budget cycle.
McKinsey’s 2025 State of AI survey covered nearly 2,000 executives across 105 countries. PwC’s 29th Annual CEO Survey reached 4,454 CEOs across 95 countries. Released months apart by different firms using different methodologies, they point to the same conclusion: almost everyone is spending more on AI, and most of it is not working.
The instinct when returns fall short is to blame the tools, the vendors, or the timeline. The data points somewhere else entirely.
The 92/56 Disconnect
A parallel McKinsey report, Superagency in the Workplace (January 2025), found that 92% of executives plan to increase their AI spending over the next three years. PwC’s CEO Survey found that 56% of CEOs report zero measurable returns from their current AI investments. Taken together, the numbers describe a market where confidence in AI budgets has completely decoupled from evidence of AI results.
The problem is not technological. The tools work. Language models summarize, classify, and generate at a level that would have been unimaginable three years ago. The disconnect sits between what the tools can do and how organizations have structured work around them.
McKinsey’s data sharpens the same point from a different angle: of 25 organizational attributes tested for their correlation with EBIT impact, fundamental workflow redesign ranked highest. Yet only about 21% of organizations using gen AI have redesigned any workflows, with nearly 80% layering AI on top of existing processes. The NBER research on AI adoption found the same mechanism operating at the individual firm level: the gap between companies that extract returns and those that don’t is management practice, not tool selection.
What the 6% Did Differently
McKinsey’s survey identifies a small group, roughly 6% of respondents, as AI high performers: companies generating measurable EBIT impact from their AI investments. The differentiator was not budget size. It was not the sophistication of the models deployed. High performers were 2.8 times more likely to have redesigned workflows around AI. PwC’s 2026 AI Performance Study surfaced the same concentration in its own data: 20% of companies capture 74% of AI-driven returns, and the cohort pulling ahead shows both a 2.5x investment advantage and the same workflow-redesign disposition.
The finding reframes the entire budget conversation.
For the 94% that have not restructured how work gets done, each additional AI dollar funds the same pattern: more tools competing for attention inside unchanged processes, more pilots without clear owners, and more fragmentation dressed up as digital transformation. The budget does not close the gap between these companies and the high performers. It widens it, because every unintegrated tool adds coordination overhead without contributing to outcomes.
The pattern mirrors what the vendor consolidation analysis already showed at the tool layer: when AI capabilities are fragmented across disconnected platforms, the cost shows up as coordination overhead, not just license fees. Every unintegrated tool consumes management attention that could be going to the workflow redesign itself. The same pattern plays out at the agent layer, where operational knowledge trapped inside vendor-specific contexts becomes recurring coordination cost rather than IP the company owns. The skills versus agents analysis walks through that lower layer.
Three Structural Redirects
The question for any CEO reviewing their AI budget is not whether to spend, but how to redirect spending toward the management infrastructure that actually produces returns.
Audit integration, not adoption. Walk through every AI tool currently in use and ask one question: is this tool embedded in a workflow with a clear owner, or is it a standalone pilot waiting for someone to figure out what to do with it? Standalone pilots are where AI budgets go to disappear quietly. Gartner’s 2024 research estimated that 30% of generative AI projects would be abandoned after the proof-of-concept stage by the end of 2025, largely because they never connected to a workflow that justified continued investment.
Fund process redesign before the next tool. McKinsey’s 2.8x finding is a direct instruction. For every dollar allocated to AI tooling, allocate matching resources to workflow mapping, role clarity, and change management. The tool without the redesign produces a dashboard nobody uses. The redesign without the tool produces a process ready to scale the moment the right capability plugs in.
Replace adoption metrics with outcome metrics. Most companies track AI adoption by counting seats licensed, queries processed, or features activated. None of these predict business impact. The metrics that matter are the ones that connect to decisions: revenue per employee, decision cycle time for strategic commitments, customer retention rates in AI-assisted interactions. If the AI investment is not moving these numbers, the budget is funding activity, not progress.
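The shift from adoption counting to outcome measurement can be made concrete with a toy calculation. Every figure and field name below is hypothetical, invented for illustration rather than drawn from either survey; the point is only that the two metric families answer different questions. Adoption metrics are raw activity counts with no baseline, while outcome metrics compare a business number before and after the investment.

```python
# Toy comparison of adoption metrics vs outcome metrics for an AI rollout.
# All numbers are hypothetical illustrations, not survey data.

# Adoption metrics: activity counts with no before/after comparison.
# They can all go up while business impact stays flat.
adoption = {
    "seats_licensed": 1200,
    "queries_per_month": 48_000,
    "features_activated": 14,
}

# Outcome metrics: the same business number measured before and after
# the AI investment, so the delta is attributable to something.
outcomes_before = {"revenue_per_employee": 310_000, "decision_cycle_days": 42}
outcomes_after = {"revenue_per_employee": 318_000, "decision_cycle_days": 35}

def pct_change(before: float, after: float) -> float:
    """Percentage change from the before value to the after value."""
    return (after - before) / before * 100

for metric in outcomes_before:
    delta = pct_change(outcomes_before[metric], outcomes_after[metric])
    print(f"{metric}: {delta:+.1f}%")
```

With the hypothetical inputs above, revenue per employee moves +2.6% and decision cycle time moves -16.7%: numbers a board can act on, unlike a seat count.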
The Budget as Amplifier
The uncomfortable truth from both surveys is that AI budgets function as amplifiers, not solutions. For companies that have done the management work of redesigning workflows, assigning clear ownership, and connecting AI outputs to business decisions, more budget accelerates returns. For companies that have not, more budget accelerates the confusion.
The 6% pulling ahead are not outspending everyone else. They are out-managing them. And the gap between the two groups grows with every budget cycle that treats new tools as a substitute for the harder, less photogenic work of changing how decisions actually get made.
Stanford’s 2026 AI Index adds the allocation side of this argument: a 3-4x productivity spread across functions, hedged by the caveat that macro-level aggregates hide most of that variation. The four allocation rules in Chapter 4 are the next move for CEOs who want to point the 2026 budget at the spread rather than the average.