Your Board Is Asking About AI. What Does Your Report Actually Say?


Boards are asking about AI returns. Not whether you’re using it. They want to know what it’s producing. And most companies don’t have a good answer.

PwC surveyed 4,454 CEOs globally in its 29th Annual CEO Survey. Of those, 56% reported no significant financial benefit from their AI investments. The money went in. The results didn’t come out. That’s not an adoption problem. That’s a measurement problem.

The knowledge gap in the boardroom

Deloitte’s Global Boardroom Program surveyed 695 board members and C-suite executives across 56 countries. The findings tell a clear story. Only 5% of organizations have AI fully incorporated into their business and operating plans. Two-thirds of board members describe their own AI knowledge as “limited to none,” an improvement on the prior year’s 79%, but still a majority.

Board members are approving AI budgets they can’t evaluate. They’re signing off on strategies they don’t fully understand. Not because they lack intelligence or diligence, but because no one has given them a framework for tracking AI progress the way they track revenue, margins, or customer retention.

The pressure from outside

The gap isn’t staying inside the boardroom. PwC’s Global Investor Survey, covering 1,074 investment professionals across 26 countries, found that only 34% of investors believe companies adequately disclose their AI governance practices. Another 42% want more transparency on AI returns.

The people writing the checks are saying: we can’t see what you’re doing.

The pressure works from both directions. Inside the organization, boards lack the knowledge to ask the right questions. Outside, investors are already asking them. The companies caught in between are the ones reporting “we’re using AI” without being able to say what it produced.

What a real AI board report looks like

“We’re using AI” is not a board update. “We reduced processing time by 40% in three departments using automated classification” is. The difference between those two statements is a reporting framework.

A functional AI progress report to a board covers three things:

What you measure. Not adoption metrics like “percentage of employees with access.” Outcome metrics: cost reduction per process, time savings per workflow, error rate changes, revenue influenced by AI-assisted decisions. The metric has to connect AI activity to business impact, not just AI activity to AI activity.

How you attribute value. AI rarely works in isolation. A process that improved 30% might owe half of that to the AI tool and half to the workflow redesign that accompanied it. Boards need honest attribution, not inflated claims. Overstating AI’s contribution erodes trust faster than understating it.

What you commit to reporting each quarter. Consistency matters more than comprehensiveness. A board that receives the same five metrics every quarter can track trajectory. A board that gets a different slide deck each time has no baseline for comparison.

The reporting discipline test

Most companies don’t have this framework yet. The data says so. Deloitte found that 35% of organizations have no formal AI strategy at all, and another 42% are still developing their roadmap. If you don’t have a strategy, you certainly don’t have a reporting structure for it.

The companies that will build board confidence in AI aren’t the ones spending the most. They’re the ones that can show what the spending produced, clearly, consistently, and with honest attribution.

Before your next board meeting, ask one question: if someone asked you to show exactly what your AI investments produced last quarter, how long would it take you to answer?


Related: The AI Agent Governance Gap examines the operational side of the same problem, where 81% of companies are scaling agents without the governance to match.

Ron Gold, Founder, A-Eye Level