How Do I Know If AI Is Actually Working?

TL;DR: Buying AI tools is not adoption. Most companies stall because they confuse access with adoption, usage with impact, and distributed responsibility with ownership. Close these three gaps and you can answer the board when they ask if AI is working.

The Question Nobody Can Answer

The board asks: "Is AI working?" The room goes quiet.

Not because the company hasn't invested. They have. Licences were purchased. Tools were rolled out. Teams were told to start using AI. There was a town hall. Maybe a Slack channel.

The silence comes because nobody can prove it changed anything.

This is not a technology problem. The tools work fine. This is a measurement problem, an ownership problem, and sometimes an honesty problem.

Most companies I work with can tell me what they bought. Very few can tell me what it did.

The Three Gaps

Companies that stall on AI adoption usually hit one of three gaps. Sometimes all three.

The Access vs. Adoption Gap. Licences and logins are not adoption. You gave everyone access. That is procurement, not transformation. True adoption means workflows changed. People do things differently than they did before. And that change persists without someone standing over them.

Most companies stop at access and call it done. It is not done.

The Usage vs. Impact Gap. "The team is getting comfortable with AI" is not a business outcome. Regular usage does not guarantee value. Someone using ChatGPT to rewrite emails every day is usage. It is not impact. Impact means something measurable changed: cycle time dropped, error rates fell, throughput increased. If you cannot point to a number, you have activity, not results.

The Ownership Gap. When responsibility for AI adoption is distributed across departments, nobody tracks outcomes. Product says it is engineering's job. Engineering says it is product's job. Everyone is "exploring AI" and nobody is accountable for whether it worked.

Progress requires a single accountable owner. Not a committee. One person.

What Real Adoption Looks Like

Real adoption is behavioural change that persists without supervision.

It is not someone using Copilot because their manager asked them to try it. It is someone using Copilot because it makes them faster and they would notice if it disappeared.

You can see it in the work. Deployments happen more frequently. Code reviews move faster. Customer support resolves tickets in fewer steps. Documentation exists where it didn't before.

You can also see when it is not there. The tools are available but ignored. People revert to old workflows under pressure. Nobody can explain what changed.

The difference between real adoption and theatre is specificity. "We're using AI" is theatre. "AI-assisted code review reduced our average review cycle from 3 days to 1 day" is adoption.

Boards do not want enthusiasm. They want evidence.

The One Thing That Works

Every company I have seen succeed at AI adoption shares one thing: individual ownership.

Someone is accountable for tracking whether meaningful change occurred. Not for buying tools. Not for running workshops. For proving that something in the business is different because of AI.

That person needs three things.

One. Authority to change workflows. If they can only recommend, nothing will happen.

Two. A clear baseline. You cannot measure change if you do not know where you started.

Three. A reporting cadence. Monthly at minimum. What changed, what didn't, what's next.

The title does not matter. CTO, VP of Engineering, Head of AI, COO. What matters is that when the board asks the question, one person can answer it with data.

The AI Adoption Playbook

I built an open-source framework for this. The AI Adoption Playbook runs inside Claude Code and is built for anyone responsible for AI adoption: founders, CTOs, CAIOs, VPs of Engineering, COOs.

It covers three things.

Diagnostic assessment. Where are you actually stuck? Not where you think you are stuck. The playbook identifies which of the three gaps is blocking you and what to do about it.

90-day implementation plan. Clear owners, clear milestones, clear deliverables. Not a strategy deck. A plan that someone can execute starting tomorrow.

Board narrative coaching. How to present AI adoption progress to stakeholders in a way that is honest, specific, and compelling. Because "we're making good progress" is not a board update.

The playbook is free and open source. Check it out on GitHub.

Related Questions

How do I measure AI ROI?

Stop measuring activity and start measuring outcomes. Licences purchased and logins per week tell you nothing.

Track what changed in the business: cycle time, error rates, throughput, revenue per head. If you cannot connect AI usage to a business metric, you are not measuring ROI. You are measuring spend.
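The arithmetic here is simple enough to sketch. Below is a hypothetical example, not part of the playbook: the metric names and numbers are illustrative, and the only idea it demonstrates is comparing a recorded baseline against current figures so the board sees deltas, not activity.

```python
# Hypothetical sketch: turning a baseline and current figures into an ROI signal.
# Metric names and values are illustrative, not taken from any real company.

baseline = {"review_cycle_days": 3.0, "tickets_per_week": 40, "error_rate": 0.08}
current  = {"review_cycle_days": 1.0, "tickets_per_week": 55, "error_rate": 0.05}

def delta_report(baseline, current):
    """Percentage change for each tracked business metric, baseline vs. now."""
    report = {}
    for metric, before in baseline.items():
        after = current[metric]
        report[metric] = round((after - before) / before * 100, 1)
    return report

print(delta_report(baseline, current))
# → {'review_cycle_days': -66.7, 'tickets_per_week': 37.5, 'error_rate': -37.5}
```

The code is trivial on purpose. The hard part is the first line: without a recorded baseline, there is nothing to subtract from, which is why the baseline comes before the tooling.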

Who should own AI adoption?

One person. Not a committee, not a working group, not "everyone."

Distributed ownership means nobody tracks outcomes. Pick someone with the authority to make changes and the accountability to report results.

Title matters less than mandate. It could be a CTO, a VP of Engineering, or a Head of AI. What matters is that when the board asks "Is AI working?", one person can answer.

What is a realistic timeline for AI adoption?

Ninety days to see signal. Not transformation, but measurable change.

The first month is diagnostic: understanding where you actually are versus where you think you are. The second month is implementation: changing workflows, not just adding tools. The third month is measurement: proving what worked and cutting what didn't.

Most companies skip the diagnostic and jump to buying tools. That is why they stall.

Why is my team using AI but nothing is changing?

Because usage is not the same as adoption. People can log in every day and still work the same way they always have.

Real adoption means workflows changed. Decisions happen faster. Output quality improved. If none of that is true, you have access, not adoption.

The fix is not more tools. It is clearer expectations about what should be different.

Need help with this?

Book a free 20-min call