On social media, it feels like every company is already running on AI.
In the Monday meeting, it is different: a handful of pilots, a few rogue power users, a nervous GC, and a spreadsheet that still runs the show.
That gap is not a failure of vision. It is the diffusion pattern of enterprise AI: the way ideas seep, stall, and finally stick inside real organizations. And it is always slower, messier, and more uneven than what the consumer landscape makes you expect.
If you misread that pattern, you misjudge maturity. You assume "everyone is doing it," push for a big-bang rollout, and then wonder why the dashboard looks flat six months later.
Why enterprises always lag the feed
Consumer AI diffuses at the level of the individual: one person downloads an app, tries a prompt, and keeps or deletes it.
Enterprise AI diffuses at the level of roles, workflows, and risk:
- A partner worries about privilege and confidentiality.
- A KM lead thinks in taxonomies and retention policies.
- A line manager thinks in error rates and rework.
- A CIO thinks in vendor lock-in and integration debt.
Every one of those actors slows the curve in a different way. This is why "everyone is using AI" on Twitter translates to "a few motivated people are experimenting at the edges" inside a firm.
If your mental model is "consumer-style viral growth, but at work," you will be permanently disappointed.
A more honest diffusion pattern
When you look closely, enterprise AI maturity tends to follow a repeatable pattern:
1. Hype at the edges. A few curious people automate tasks in their own stack: drafting emails, summarizing transcripts, cleaning up research. None of this is logged. It does not show up in your metrics, but it changes expectations.

2. Shadow workflows. Teams quietly build "illegal" processes: pasting client text into web UIs, screenshotting tools, routing everything through a single power user. Value is real, risk is also real, and governance is basically a shared Google Doc.

3. Official pilots. Someone finally decides, "We should do this properly." You pick a narrow use case, wrap it in policy, and negotiate with vendors. Adoption is still thin, but the work becomes visible, trackable, and arguable.

4. Workflow redesign. The real break comes when you stop asking, "Where can we bolt AI on?" and start asking, "If we assumed a reliable model exists, how should this workflow look?" Hand-offs, approval steps, and metrics all change.

5. Platform thinking. Only at the far end of maturity do you see AI as infrastructure: common services, reusable components, internal tooling, and clear patterns for how new use cases plug in.
Social media mostly shows you stages 4 and 5. Your firm is probably bouncing between 1 and 3.
There is nothing wrong with that. The problem is pretending they are the same.
What diffusion patterns reveal about maturity
Once you see diffusion as a pattern rather than a score, the conversation about maturity changes.
"Advanced" and "behind" stop making sense as labels. A firm that has genuinely redesigned one workflow is in a different place than a firm with twelve pilots that never got past stage 3. The former actually understands something. The latter has expensive confusion.
The more interesting question is not where you rank. It is where the friction lives. Is AI absent because of risk and compliance? That is a governance problem. Because of missing integrations? That is an infrastructure problem. Because nobody has made the case internally? That is a different problem again. Same apparent "lag," completely different roots.
The stage model also exposes a particular kind of self-deception I see often: firms that count stage-2 shadow workflows as evidence of maturity, because the usage feels real and organic. But shadow workflows are not maturity; they are unmanaged diffusion. The value is real, the risk is also real, and nothing has actually changed about how work gets done.
Maturity, as I think about it, is less about how much AI is running inside a firm and more about how clearly the firm can see where it sits in the pattern — and why.
The diffusion pattern of AI in the enterprise is always slower than the social media narrative and the consumer landscape make it feel.
That is not an excuse for slowness. It is just an accurate description of how large organizations actually absorb change. The firms I find most interesting are not the ones racing to close the gap with the feed. They are the ones that have stopped measuring themselves against the feed entirely.