The Training Paradox

Why AI budgets are soaring while training budgets crater

Happy Monday!

AI budgets are up 88% while training budgets are down 8 percentage points.

Think about that for a second. Companies are spending billions on models and infrastructure, then acting shocked when their teams can't use any of it effectively. It's like buying everyone a Ferrari and being surprised they can't drive stick.

Two major reports dropped recently: Wharton's third annual AI adoption study and McKinsey's State of AI. Between them, they surveyed nearly 3,000 organizations. They both tell the same story: we've moved from "this is fascinating" (2023) to "let's try everything" (2024) to "okay seriously, where's the ROI?" (2025).

The technical adoption numbers look great: 82% use AI weekly, 46% use it daily, and usage is mainstream.

But here's the problem: the human capital side is collapsing. Training investment is cratering. Confidence in training as a solution dropped 14 percentage points. Half of employers can't hire the AI talent they need. And 43% of leaders are warning that their employees are losing skills because they're too dependent on AI they don't actually understand.

QUICK TAKES:

  • 46% use AI daily (up 17 percentage points year-over-year)

  • 72% now formally measuring ROI

  • Training budgets DOWN 8 percentage points despite widening skill gaps

  • 43% warn of skill atrophy from AI dependence

  • Executives are convinced it's working (81% see positive ROI); managers are less sure (69%)

TL;DR

Wharton's year-three data shows AI usage is mainstream: 72% now measure ROI and three out of four see positive returns. But training investment is cratering, 43% warn employees are losing skills to AI dependence, and hiring advanced AI talent remains the #1 challenge. McKinsey confirms the picture: 88% use AI, but most haven't embedded it deeply enough for enterprise-level benefits. While everyone focused on models, nobody invested in the people using them.

The Three-Year Pattern

2023: 37% weekly use. Everyone's exploring. Sentiment is "fascinated but cautious."

2024: 72% weekly use. Spending up 130%. Pilots everywhere. Excitement maturing into "wait, does this actually work?"

2025: 82% weekly use. 72% formally measuring ROI. The party's over—prove value or lose budget.

2026 prediction: Scale what delivers returns, kill everything else. Deploy agentic systems. Reallocate budgets toward proven programs.

But that 2026 vision only happens if someone fixes the training crisis. Right now, nobody's even trying.

The Numbers Everyone's Celebrating

Daily AI use jumped 17 percentage points to 46%. Weekly use is at 82%. Nearly three-quarters are formally measuring ROI: tracking profitability, throughput, workforce productivity. And most are seeing positive returns.

Sounds great, right?

Except when you look at what people are actually doing with AI. It's mostly shallow productivity wins: document summarization, data analysis, editing. These aren't transformative use cases. They're "make my existing job slightly faster" cases.

McKinsey's data confirms this. Yeah, 88% of organizations use AI in at least one function, but most haven't embedded it deeply enough for enterprise-level benefits. They're sprinkling AI on top of existing workflows instead of redesigning how work actually gets done.

And the gaps between functions are wild. IT and Procurement are fairly AI-native at this point with high usage and high confidence. Marketing and Operations? Still lagging in the same pattern as 2023. Three years in, and we haven't closed the divide.

The ROI Reality: Who's Lying?

Here's where it gets interesting. 72% are measuring ROI. Three out of four see positive returns. And 62% expect budget increases of 10% or more over the next few years.

But there's a key split: 81% of VP-level executives see positive ROI, while only 69% of mid-level managers do. They work at the same companies. They're looking at the same AI deployments, but they're seeing completely different realities.

VPs see usage dashboards trending up and call it success. Managers see broken workflows, confused teams, and tools that don't actually integrate with how work gets done. Both can be right at the same time.

What's actually happening? Budget reallocation. Companies aren't finding new money for AI; they're cannibalizing legacy IT and HR budgets. The stuff that doesn't deliver measurable returns is getting killed. Fast.

And 30% of AI tech budgets are going to internal R&D. Companies are building custom solutions instead of buying off-the-shelf. Only time will tell who wins and who loses in the age-old “build vs buy” discussion.

Bottom line: FOMO spending is dead. "Show me the metrics or lose your budget" is the new normal.

The Training Crisis That's Getting Worse

Training budgets are down 8 percentage points. Confidence in training as the path to AI fluency is down 14 percentage points. Only a third of employees received any AI training in the past year.

Meanwhile, half of employers say they can't fill AI-related positions. The reality is that companies have given up on training their existing workforce while complaining they can't hire the talent they need. The talent shortage they're experiencing is partially self-inflicted.

At the same time, 43% of leaders warn that employees are losing fundamental skills because they're too dependent on AI tools they don't understand. We're creating a generation of workers who can't do the underlying work anymore because the AI does it for them, but who also can't judge whether the AI's output is any good because they never learned the fundamentals.

Additionally, nearly half of organizations report technical skill gaps. Recruiting AI talent is seen as a primary challenge, and the International Data Corporation (IDC) estimates the skills shortage could cost the global economy $5.5 trillion by 2026.

So what are companies doing about it? Hoping employees figure it out, trying to hire unicorns that don't exist at scale, and relying on vendors to solve what are fundamentally organizational problems.

And yet 48% of employees rank training as the most important factor for AI adoption. Companies with enterprise-wide AI literacy programs report 90% faster decision-making, 81% increased revenue, and 81% better retention.

The ROI of training is obvious. But training budgets are shrinking while AI budgets explode.

The Executive-Manager War

56% of VPs think their organization is adopting AI much faster than its peers, while only 28% of managers think the same. These are people at the same companies with radically different worldviews.

VPs want top-down rollout and centralized control. Managers want employee-led innovation and actual training programs. VPs see only adoption metrics, while managers see their teams struggling with half-baked implementations.

Executive leadership in AI has surged in recent years: 67% of organizations now have C-suite ownership of AI programs and 60% have Chief AI Officers. Strategy moved to the top, but the people actually doing the work aren't fully bought in.

These fundamental disagreements about how to roll AI out are killing momentum and preventing many organizations from seeing the results these programs can ultimately deliver.

What Actually Works

McKinsey found that high performers are 3x more likely to fundamentally redesign workflows instead of just adding AI to existing processes. Laggards add ChatGPT to their email workflow and call it transformation, while leaders rebuild how work happens with AI as a core assumption from the start.

Other patterns that separate winners from losers:

Laggards have tight usage restrictions, slow rollout, unclear guardrails, and low trust.

Leaders have open access, fast deployment, crystal-clear guardrails, and they're using AI to govern AI (62% for fraud detection, 59% for risk management).

The widening gap goes beyond technology and encompasses organizational design. That said, these transformations still need to fit into existing workflows. Redesigning systems matters, but the new systems can't be so radically different that employees have to fundamentally change everything about how they work.

What This Means If You're Building AI

Your buyers are measuring ROI like their jobs depend on it, because they often do. Vague transformation promises are dead. To win, tie your product to productivity, profitability, or throughput, and show tangible metrics.

Your users are undertrained. A third got zero AI training last year. That means your product needs to be so intuitive that untrained users get value immediately, or you need to build training into your go-to-market motion. Documentation alone won't cut it.

I’ve said this several times in the past, but the reality is integration matters more than features. The most-used AI tools are the ones that fit into existing workflows like document editing, data analysis, and summarization. If your product requires learning a whole new tool, you're fighting an uphill battle.

16% of companies still barely use AI. As the gap widens, they'll pay for solutions designed for low AI literacy, heavy guardrails, and minimal change management. On the other hand, many large enterprises are still building, not buying. I don't believe that's a viable long-term strategy for most organizations, but it's still an important factor to consider when building AI tools.

The Bottom Line

Three years into gen AI, usage is mainstream but value is concentrated. The companies winning aren't the ones with the biggest AI budgets. They're often the ones investing in people, redesigning workflows, and measuring what matters.

The companies losing are easy to spot: exploding AI budgets, shrinking training budgets, executives convinced it's working while managers watch it fail, and growing gaps between what the tools can do and whether teams know how to use them.

In motion,
Justin Wright

Food for Thought

If we're spending billions on AI infrastructure while training budgets crater and nearly half of leaders warn of skill atrophy, are we building a workforce that depends on tools it doesn't understand? And what happens when those tools evolve faster than people can adapt?

I am excited to officially announce the launch of my podcast Mostly Humans: An AI and business podcast for everyone!

Episodes can be found below - please like, subscribe, and comment!