In 2016, Harvard Business Review published an article on the failures of leadership training and said this:
“American companies spend enormous amounts of money on employee training and education—$160 billion in the United States and close to $356 billion globally in 2015 alone—but they are not getting a good return on their investment. For the most part, the learning doesn’t lead to better organizational performance, because people soon revert to their old ways of doing things.”
That article was written almost nine years before this blog’s publication, and not much has changed since then.
In those nine years, and at this particular point in my career, reading celebratory comments about the launch of new learning programs and initiatives, only to then read yet another research paper on how we’ve failed our learners, has become an exercise in déjà vu.
As someone who has been in the learning field for close to a decade, I’ve had the time and opportunity to reflect and build a point of view on how to flip decades of wasteful spending on large-scale learning.
The Real Reasons Big Programs Miss
Weak Transfer Systems, Not Weak Courses
Learning transfer, or how people actually use the knowledge gained in a learning event, is notoriously fragile. Research evidence shows that transfer depends as much on the work environment and follow-through as on the training intervention itself (think manager support, cues in workflow, and consequences). If these environments aren’t designed in tandem with learning events, knowledge decays and habits revert.
We Teach for Memory, but Don’t Engineer Retrieval
One-and-done events are still common, despite robust evidence for spacing and retrieval practice. Memory strengthens when recall is effortful and repeated over time. However, most enterprise programs ship a single “learning event,” then create an expectation that learning will occur through the osmosis of work.
Culture and Sponsorship Are Treated as Comms, Not Conditions
When leaders don’t change what they attend to, measure, and reward, the culture won’t change either. Long-running benchmarking by organizations such as Prosci shows that active, visible sponsorship is the number one success factor in change efforts, yet many “learning at scale” launches outsource sponsorship to a kickoff email.
Personalization Is Promised, but Rarely Delivered Where It Counts
Learners don’t want “Netflix for learning.” L&D leaders say aligning to business goals is now their top focus, yet most of the catalogs businesses purchase still push generic pathways that don’t connect to daily work. Or worse, businesses push AI algorithms that haven’t been trained on organizational values, goals, and intended outcomes.
We Measure What’s Easy, Not What Matters
Many businesses still measure only Kirkpatrick Level 1 (maybe Level 2) outcomes; our fan-favorite completion-percentage metrics and smile-sheet surveys remain rampant. For years, we have failed to find a remedy that moves us past these default metrics. Without linking learning to performance and business outcomes, executive confidence rightfully drops.
Designing for Use of Knowledge Gained
So, that may have seemed a bit doom and gloom, but we honestly need a reality check in order to flip the script and achieve meaningful outcomes. As business executives AND learning leaders find better ways to collaborate, we have to reflect on what’s gone awry so that we can build a future that’s better prepared for a changing workforce and technological landscape.
Below is a practical playbook that blends learning science with implementation science. As defined by the University of Washington’s Department of Global Health, implementation science seeks to systematically close the gap between what we know and what we do.
If you already build great content, this is your path to results.
Start With a Behavior Definition: Capability, Opportunity, Motivation (COM-B) Framework
- Capability: Do people have the skills/knowledge and cognitive bandwidth to apply the behavior?
- Opportunity: Are there prompts, time, tools, and social norms that make the behavior easy?
- Motivation: Are incentives, identity, and feedback aligned with the new behavior?
Before you storyboard a course, define the target behavior precisely and diagnose what must change in COM-B. For example, “Quarterly business reviews use the new pricing playbook” is a behavior; “understand the playbook” is not. For those of you who have always wondered why I hate the word “understand” when it comes to building course objectives, this is why.
Using COM-B helps take that learner analysis a step further to decide whether the bottleneck is a training problem (capability), a workflow problem (opportunity), or a leadership/measurement problem (motivation). Then build only what the diagnosis demands.
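To make that routing concrete, here is a minimal sketch in Python. It is purely illustrative: the ComBDiagnosis fields and intervention labels are my own shorthand for this post, not part of the COM-B literature, and a real diagnosis is far richer than three yes/no flags.

```python
# A minimal sketch of COM-B triage. The fields and intervention labels
# are illustrative shorthand, not part of the COM-B literature.
from dataclasses import dataclass

@dataclass
class ComBDiagnosis:
    capability_gap: bool   # missing skills, knowledge, or cognitive bandwidth?
    opportunity_gap: bool  # missing prompts, time, tools, or social norms?
    motivation_gap: bool   # misaligned incentives, identity, or feedback?

def recommended_work(diagnosis: ComBDiagnosis) -> list[str]:
    """Route each confirmed gap to the kind of work it actually demands."""
    plan = []
    if diagnosis.capability_gap:
        plan.append("Training and practice: build the skill, then space retrieval")
    if diagnosis.opportunity_gap:
        plan.append("Workflow redesign: add cues, templates, and defaults")
    if diagnosis.motivation_gap:
        plan.append("Leadership and measurement: align KPIs, recognition, feedback")
    return plan or ["No gap confirmed: revisit the behavior definition"]

# Example: the bottleneck is workflow and motivation, not knowledge,
# so another course is not what this diagnosis demands.
print(recommended_work(ComBDiagnosis(False, True, True)))
```

The point of the structure is the routing: a confirmed opportunity or motivation gap sends you away from building another course.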
Engineer for Retrieval Over Rereading
This isn’t theory: spaced practice and retrieval practice consistently outperform massed study for durable learning and real-world transfer. Replace or supplement single learning events with a retrieval and spacing cadence that follows the forgetting curve, with boosters when people are likely to lapse:
- Instead of asking learners to re-read steps from a learner guide, create a safe practice space focused on performing the steps. Delivering this as microlearning or bite-sized content is a great way to support effortful retrieval.
- Incorporate micro-assessments and checklists at the natural moments of use (e.g., pipeline reviews, shift change, code review). Reflections are a great opportunity for learners to celebrate the knowledge they’ve gained or identify ways to continue to improve.
- Build a cadence into calendars using the tools you already use or plan to adopt (LMS integrations, Slack/Teams bots, collaboration tools such as Asana or Confluence, etc.); see the sketch below.
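To make the cadence concrete, here is a minimal sketch of an expanding-interval booster schedule. The specific offsets (days 1, 3, 7, 21, and 60 after the learning event) are an assumption to tune against where your learners actually lapse, not a prescription.

```python
# A minimal sketch of an expanding-interval booster schedule.
# The offsets below are an assumption to tune, not a prescription.
from datetime import date, timedelta

BOOSTER_OFFSETS_DAYS = [1, 3, 7, 21, 60]  # intervals grow as memory strengthens

def booster_schedule(event_date: date) -> list[date]:
    """Return the dates when retrieval prompts should fire for one learner."""
    return [event_date + timedelta(days=offset) for offset in BOOSTER_OFFSETS_DAYS]

# Example: feed these dates to an LMS integration, calendar, or chat bot.
for prompt_date in booster_schedule(date(2025, 1, 6)):
    print(f"Send retrieval prompt on {prompt_date.isoformat()}")
```

The output is just a list of dates, which is exactly the shape an LMS integration, calendar invite, or chat bot needs.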
Give the Learning Environment the Same Attention as the Product
Learning doesn’t exist in a vacuum. Your “user interface” isn’t just a course; it’s the environment surrounding the behavior you want learners to perform. Manager conversations, dashboards, forms, templates, and default settings all affect your learners’ ability to perform with their new knowledge in hand. Design these to nudge the new behavior by default:
- Update templates and systems (e.g., pre-filled fields, dropdown defaults) to make the new practice the easiest path.
- Review the fields in your CRM/WMS/ATS and add or modify them to force a moment of retrieval (“Which playbook tier?” “Which risk flag?”); see the sketch after this list.
- Equip managers with 10-minute coaching scripts and observables (what to look for on ride-alongs, in huddles, or in Jira/Confluence artifacts).
Research shows transfer rises when the climate (cues, consequences, social proof) aligns with the target behavior. Design that climate as deliberately as you design your course.
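To show what “forcing a moment of retrieval” can look like in practice, here is a minimal sketch of a submit-time check. The field names (playbook_tier, risk_flag_reviewed) are hypothetical, and most CRMs would express the same idea as declarative validation rules rather than code.

```python
# A minimal sketch of a submit-time check that forces retrieval.
# Field names are hypothetical; real CRMs express this as validation rules.
VALID_TIERS = {"standard", "preferred", "enterprise"}

def validate_opportunity(record: dict) -> list[str]:
    """Block submission until the rep recalls and records the playbook tier."""
    errors = []
    if record.get("playbook_tier") not in VALID_TIERS:
        errors.append("playbook_tier is required: which pricing tier applies?")
    if not record.get("risk_flag_reviewed"):
        errors.append("Confirm you reviewed this deal's risk flags.")
    return errors

# Example: an empty record fails both checks, creating the recall moment.
print(validate_opportunity({}))
```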
Make Sponsorship Operational, Not Ceremonial
Sponsors need to show up beyond kickoff. It doesn’t have to be huge: recognizing good behaviors at an all-hands, approving updated KPIs, or visiting teams once a quarter all matter. The key is visible action. People notice when leaders pay attention.
Measure Use Over Attendance
Most companies still “measure learning” by counting completions, tracking quiz scores, or handing out smile sheets after a workshop. These numbers look tidy, but they don’t actually tell you whether anything changed. And the weakest point usually comes after the event: following up on and measuring visible behavior change.
A better way: track the smallest visible action that proves people are using what they learned, and create a loop to reteach when KPIs aren’t met. For example, instead of “90% finished the safety course,” track “X-specific incidents are down X% since this training went live; our biggest opportunity to bring down related events is in X area.”
Don’t celebrate training for being completed; celebrate it for making progress toward the change you want to see.
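To make that loop concrete, here is a minimal sketch that compares a behavior-linked KPI across equal windows before and after launch, using entirely hypothetical incident data.

```python
# A minimal sketch of a behavior-linked KPI check with equal observation
# windows. Incident dates are hypothetical sample data; a real analysis
# should also control for exposure, seasonality, and reporting changes.
from datetime import date, timedelta

launch = date(2025, 1, 1)
window = timedelta(days=90)
incidents = [date(2024, 10, 20), date(2024, 11, 18), date(2024, 12, 9),
             date(2025, 1, 25), date(2025, 3, 2)]

before = sum(1 for d in incidents if launch - window <= d < launch)
after = sum(1 for d in incidents if launch <= d < launch + window)
change = (after - before) / before * 100 if before else float("nan")

print(f"Incidents: {before} before vs. {after} after launch ({change:+.0f}%)")
# If the KPI isn't moving, loop back: reteach, adjust cues, or re-diagnose.
```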
A Quick Self-Audit to Apply This Now
I would be a bit of a hypocrite if I gave you all of this info and no clear way to apply what you’ve learned here. Outside of reaching out directly to our Organizational Change Management (OCM) team at Summit, here’s a quick question list to take away and reflect on.
- What exactly should people do differently, and how will I know?
- Where in their daily workflow can I trigger retrieval?
- What systems or templates can I adjust to make the new way the default?
- What will leaders visibly do to reinforce it?
- What’s the smallest proof of application I can measure, and which KPI does it connect to?
Our team at Summit prides itself on being able to help you answer these questions effectively. As you look to build and improve your large-scale learning programs, consider scheduling a consultation with us to discuss effective ways to approach your problems.
Get in touch with the Summit team.
Sources:
Why Leadership Training Fails—and What to Do About It (Harvard Business Review, 2016)
Transfer of Training: A Meta-Analytic Review (Blume, Ford, Baldwin, & Huang; Journal of Management, 2010)
Distributed practice in verbal recall tasks: A review and quantitative synthesis (Cepeda, Pashler, Vul, Wixted, & Rohrer; Psychological Bulletin, 2006)