
You're Not Using AI Wrong. You're Building Wrong.


Harvard Business Review published what might be the clearest diagnosis of the AI transformation gap to date. Not model quality. Not data readiness. The gap between what AI can do and how organizations are actually designed.

The authors, a Harvard professor and Microsoft’s AI lead, called it “the last mile problem.” Their conclusion: “The challenge for today’s executives is to decide if they are willing to redesign the organization.”

Most aren’t. They’re buying tools, training employees, and measuring adoption rates. Nearly all of the effort goes into getting people to use AI. Almost none goes into redesigning how the work actually flows.

The numbers behind the gap

McKinsey tested 25 organizational attributes to find what separates companies that actually profit from AI from those that don’t. Out of everything they measured, one factor had the biggest effect: workflow redesign. Companies that fundamentally redesigned individual workflows were nearly 3x more likely to see real EBIT impact from AI (55% of high performers vs. 20% of all others).

Not better models. Not bigger budgets. Not more training. The single most predictive factor was whether someone sat down and rethought how the work gets done before plugging AI into it.

Andrew Ng demonstrated this from the technical side. A single prompt to write code produced a 48% success rate. The same model, structured into a multi-step workflow, hit 95%. Nothing changed except the architecture around it.

The pattern is consistent: the tool isn’t the bottleneck. The process around it is.

Where the money actually goes

BCG’s research reveals how organizations allocate their AI investments. Companies that successfully create value from AI follow what BCG calls the 10-20-70 principle: 10% on algorithms, 20% on technology, and 70% on people and processes.

Most organizations do the opposite. They invest heavily in tools and infrastructure, then wonder why adoption stalls. Companies that restructure critical workflows see 30-50% efficiency gains. Those that deploy AI into existing operations without changing anything see 10-20% productivity improvements. The difference between “using AI” and “building for AI” is a 2-3x gap in outcomes.

Gartner’s survey of 700+ CIOs puts the cost in perspective: 72% report their organizations are breaking even or losing money on AI investments. Gartner attributes this to hidden costs and change management challenges, not model quality or data issues. As one Gartner analyst put it: “While not all AI is ready to deliver value, humans are even less ready to capture it.”

The organizational design problem

Deloitte surveyed 3,235 business and IT leaders across 24 countries. The findings split the market into three camps: 37% use AI at surface level with little or no change to underlying business processes. 30% are redesigning key processes around AI. And 34% report using AI to deeply transform products, processes, or business models.

The 37% in the first camp are the “using AI” organizations. They’ve distributed tools. They’ve run training sessions. They’ve measured adoption rates. And they’ve left the underlying workflow untouched.

Even more telling: 84% of organizations have not redesigned jobs around AI capabilities. The technology changed. The org chart didn’t.

HBR calls the current default “process debt.” Automating a broken process just makes it break faster. Deloitte adds a complementary concept: “cultural debt,” the negative consequences organizations accumulate when they scale AI without redesigning leadership behavior, accountability, and workplace norms. 65% of respondents believe their culture needs to change significantly for AI, but only 5% are making great progress on it.

What building for AI actually looks like

The distinction is practical, not philosophical.

Using AI means distributing a tool and hoping for results. Building for AI means redesigning the workflow before the tool touches it. One is a training initiative. The other is a design decision.

Giving every employee a calculator didn’t transform accounting. Redesigning the accounting department did.

Building for AI starts with a question: where does this process break, and how would we design it from scratch if AI were a given? That’s different from asking: where can we add AI to what we already do?

The companies in McKinsey’s high-performer category didn’t just adopt AI faster. They rethought which decisions need human judgment, which tasks can be automated end-to-end, and where the handoffs between human and machine should sit. The redesign came first. The tool came second.

The uncomfortable part

This isn’t a technology project. It touches roles, approval flows, team handoffs, and how decisions get made. That’s why it keeps getting postponed.

Consider what happens when a company adds AI to its customer support workflow. The typical approach: give every agent a chatbot assistant and measure how many tickets they close. The redesign approach: rethink the entire support flow. Which questions should never reach a human? Which ones require judgment that AI can’t provide? Where does a human review an AI draft before it goes out, and where does the AI handle the full cycle? That redesign changes job descriptions, team sizes, escalation paths, and quality metrics. It’s not an AI project anymore. It’s an operations project.

BCG’s 10-20-70 principle says 70% of the effort should go into people and process changes. Most organizations flip the ratio: they pour the bulk of their investment into technology, then wonder why the underfunded people-and-process side keeps creating friction.

The tools are easy. The redesign is the actual work.

And this gap is widening, not shrinking. Models are getting smarter faster than organizations are redesigning. The companies that treat AI as a design constraint from day one, rethinking end-to-end flows, human-AI handoffs, and what “a job” even means, will pull away structurally. Everyone else will keep celebrating a 10-20% productivity bump while the redesigners compound 30-50% gains and then layer the next generation of models on top of already-optimized systems.

Before your next AI initiative, ask one question: are you adding a tool to an existing process, or redesigning the process for what AI makes possible?


Related: “515 Startups Got the Same AI Tools. The Ones Who Saw a Map Generated 1.9x More Revenue” presents what may be the cleanest experiment to date on why reorganizing production around AI matters more than the tools themselves.

Ron Gold, Founder, A-Eye Level