your team isn't using the AI tools you've bought
Licences sit dormant. People default to old habits. The investment looks wasted. It usually isn't a tools problem — it's a confidence problem, and there are three patterns that fix it.
take the Snapshot to find your blockers
Free. Five minutes. No card needed.
“We rolled out Copilot to forty people. After three months, eight were using it. The Snapshot showed us the issue wasn't training — it was that no one had told the team what AI was allowed to be used FOR.”
three patterns we see again and again
Most failed AI adoptions look the same on the way down. The licences arrive, the lunchtime demo runs, the early adopters do their thing — and then the curve flattens, the dormant accounts pile up, and the procurement decision starts to feel like a mistake.
- No permission. People aren't sure what they're allowed to put into a model — so they don't try. The fix is a 'green list' you can publish in a fortnight.
- No reward. Trying AI is extra work in week one to save time in week three. If nobody is celebrating the week-three saving, the experiment dies. The fix is a weekly five-minute share.
- No anchor. Tools without a use-case stay generic. The fix is to anchor each tool to one concrete weekly task per role and review it for a month.
where your blockers actually are
The People pillar of the Snapshot scores three things specifically: how your team feels about AI, how AI-literate leadership is, and whether anyone has the capacity to learn. Low scores on any of these predict a stalled rollout — and they predict it independently of which tool you bought.
If your scores are high but adoption is still stalling, the answer is usually in the Documents and Strategy pillars instead — and the report tells you which.
a plan for the next four weeks
The 12-week plan is sequenced. For low-people-readiness businesses, the first four weeks are almost always the same shape: publish a green list, run a single use-case sprint per team, and set up the weekly five-minute share. Cheap, repeatable, and it moves dormant licences into daily use.
Week five onwards is where the picks diverge by business. The report shows you that fork too.
the things people ask
- We've already done training. Why isn't it sticking?
- Training without a 'green list' and a weekly cadence almost always plateaus. People remember the demo, but they don't have a specific weekly task to anchor to and they aren't sure what they're allowed to use AI for. The Snapshot flags both gaps.
- Is this Microsoft Copilot specific?
- No. The People pillar predicts adoption regardless of vendor — Copilot, ChatGPT Enterprise, Gemini Workspace, custom internal tools. The blocker patterns are the same.
- What if leadership isn't AI-literate?
- That's the most common low-scoring sub-pillar we see. The fix isn't a CEO training course — it's pairing each leader with a fluent operator for two weeks, plus a small public commitment from leadership about what AI is for. The plan walks through it.
find the real reason adoption stalled
Five minutes. The three blockers, ranked. The three fixes, sequenced. Yours by tomorrow morning.
take the Snapshot to find your blockers