There’s a growing narrative that EDI training doesn’t work.
I hear it often:
“It’s performative.”
“It doesn’t change anything.”
“People attend and then go back to doing the same things.”
In some cases, that critique is warranted.
But I don’t think it’s telling us what we think it is.
Because the problem is not learning.
And it’s not training.
The problem is what we expect training to do on its own.
In the 20 years I have spent designing and facilitating EDI-centred learning across a range of environments, I’ve seen learning land in very different ways.
Not because the content was fundamentally different. But because the conditions around the learning were.
Most EDI training is asked to do too much – in isolation.
We ask a workshop, a course, or a learning series to shift deeply held beliefs, build new skills, change behaviour, reshape culture, and produce measurable outcomes. Often within a few hours.
We ask training to do what organizations are unwilling to design for.
And when that doesn’t happen, we conclude: “Training doesn’t work.”
But that’s not a failure of learning.
It’s a failure of design.
In most areas of organizational life, we don’t treat training as a standalone solution.
We don’t expect someone to become a strong manager after one workshop.
Or a confident driver after one lesson.
Or physically stronger after a single workout.
In most areas, we understand that training is part of a system.
Training is paired with practice, embedded in process, and reinforced over time.
EDI is one of the few areas where we compress all of that into a single intervention – and then question its effectiveness.
At its best, learning creates shared language, awareness of patterns, recognition of bias and inequity, and initial shifts in perspective.
These are not small things. They are necessary conditions for change.
But they are not sufficient on their own.
Learning opens the door. It does not walk people through it.
The gap isn’t in the learning itself.
It’s in what happens next.
In many organizations, there is little structured opportunity to practice new skills, to integrate learning into decision-making processes, or to apply what was learned.
So people return to environments where existing norms remain unchanged, risk remains uneven, and new behaviours are unsupported.
Under those conditions, it’s not surprising that learning doesn’t translate into action.
I’ve also seen the opposite.
In one context, we shifted from delivering a single training session on equitable hiring to designing a more scaffolded approach. The initial learning still focused on awareness – bias, systemic barriers, and how inequities show up in hiring processes.
But we didn’t stop there.
We introduced structured evaluation criteria and scoring tools, guided panel deliberation practices, concrete prompts to interrupt bias in real time, and expectations for how decisions would be documented and reviewed.
Leaders weren’t just asked to understand bias.
They were supported in practicing how to interrupt it inside actual hiring decisions.
Ongoing peer support through a community of practice provided further opportunity to build skills and confidence in a collegial setting.
Over time, the conversations in hiring panels changed.
Not perfectly.
But noticeably.
Language became more specific. Some decisions were more clearly tied to criteria.
There was greater consistency in how candidates were evaluated.
The learning didn’t create the change on its own. It made different decisions possible.
There’s another layer that often gets missed.
Awareness does not automatically translate into skill.
Understanding bias is not the same as interrupting it in a heated discussion, navigating it in a performance conversation, or responding when harm is named.
Those are practices.
And practices require rehearsal, feedback, iteration, and time.
Without that, learning stays conceptual.
Even with skill, change doesn’t sustain without accountability.
Not punitive accountability.
Structural accountability.
Where expectations are clear, processes support equitable decisions, leaders are responsible for how they act under pressure, and outcomes are examined over time.
Especially in moments where speed, risk, and scrutiny are high.
Without this, organizations rely on individual motivation.
And motivation is not a system.
So when we say:
“EDI training doesn’t work,”
what we may actually be seeing is learning without scaffolding, skill-building without practice, and pivots without capacity-building.
Training, on its own, was never designed to carry that weight.
This is not a critique of training. It’s a call to design beyond it.
I don’t think we need less EDI learning. I think we need to stop asking it to do work the system isn’t designed to support.
Because training doesn’t fail in a vacuum.
It fails when it’s expected to compensate for systems that remain unchanged.
When learning, skill-building, and accountability are aligned, change becomes possible.
Not immediate.
Not simple.
But real.
The question isn’t whether training works.
It’s whether we’ve built anything around it that allows it to.
-sd