Show Notes
In this special episode of the Pre-Accident Investigation Safety Podcast, hosted by Todd Conklin, we take a moment to honor the legacy of James Reason, a prominent figure in the field of safety and human error study, who passed away on February 5th, 2025. James Reason revolutionized the way we understand human errors, particularly in complex systems, with groundbreaking concepts such as the Swiss cheese model and the theory of a just culture.
Listen as we delve into the life and contributions of James Reason, from his early conversations that expanded our understanding of organizational accidents to his influence in various sectors, including healthcare and aviation. Featuring a special tribute from David Woods, this episode reflects on Reason's enduring impact and challenges us to continue advancing in the world of safety and reliability.
Join us in remembering a man whose work has profoundly shaped the foundation of modern safety practices. In a unique tribute, we invite you to enjoy a sandwich with Swiss cheese, a nod to Reason's famous model, and reflect on his contributions to the field. Thank you, James Reason, for your remarkable influence on safety and human error research.
Show Transcript
WEBVTT
00:00:00.000 --> 00:00:11.920
Music.
00:00:11.768 --> 00:00:16.548
Hey, everybody. Todd Conklin, Pre-Accident Investigation Safety Podcast.
00:00:17.188 --> 00:00:20.368
And this is kind of a special podcast.
00:00:20.828 --> 00:00:28.228
In fact, it's even noted as a special podcast because this week was a sad week
00:00:28.228 --> 00:00:31.708
for a really profound reason.
00:00:31.908 --> 00:00:38.588
And that is on Wednesday of this week, the 5th of February, 2025,
00:00:39.488 --> 00:00:41.608
James Reason passed away.
00:00:41.768 --> 00:00:51.548
And it's hard to do the work we do at any level without thinking of the people who went before us.
00:00:51.788 --> 00:00:59.768
And it was so interesting because to have that thought, many names come into place.
00:00:59.908 --> 00:01:05.548
Richard Cook, remarkable people, Jim Howe, and of course, James Reason.
00:01:06.448 --> 00:01:12.848
And James Reason did things for us without which we would not be having these conversations.
00:01:12.848 --> 00:01:19.428
We would not even be close to having these conversations without James Reason
00:01:19.428 --> 00:01:22.148
having some conversations early on.
00:01:22.988 --> 00:01:26.508
And so I think it's worthwhile for us to take a moment and sort of think about
00:01:26.508 --> 00:01:31.508
where we've come from in order to understand where we're going to.
00:01:32.508 --> 00:01:41.628
And James Reason passing is a pretty important moment in the stories that we tell.
00:01:42.148 --> 00:01:48.028
James T. Reason was born on the 1st of May, 1938, and he was a professor of
00:01:48.028 --> 00:01:50.148
psychology at the University of Manchester.
00:01:50.488 --> 00:02:00.468
He graduated from the University of Manchester in 1962, and he was a tenured professor there from 1977 to 2001.
00:02:01.508 --> 00:02:11.148
What he's probably most famous for are the works he did and specifically his books on human error,
00:02:11.508 --> 00:02:19.308
including such aspects as absent-mindedness, aviation human factors, maintenance error,
00:02:20.188 --> 00:02:24.088
risk management, and a look at organizational accidents.
00:02:24.308 --> 00:02:29.448
And these are all terms. This is amazing to me, because he coined many of these
00:02:29.448 --> 00:02:35.088
ideas. The idea of an organizational accident, of a systems accident, was really new.
00:02:35.448 --> 00:02:42.988
In 2003, he was awarded an honorary doctor of science by the University of Aberdeen.
00:02:43.108 --> 00:02:48.308
He was a fellow in the British Academy, the British Psychological Society,
00:02:48.788 --> 00:02:54.448
the Royal Aeronautical Society, and the Royal College of General Practitioners.
00:02:54.648 --> 00:03:03.428
He received his CBE in 2003 for his services in the reduction of the risks in health care.
00:03:04.208 --> 00:03:10.128
And in 2011, he was elected an honorary fellow of the Safety and Reliability Society.
00:03:11.436 --> 00:03:17.976
Among the multitude of contributions, one of the things James Reason introduced
00:03:17.976 --> 00:03:20.716
was the Swiss cheese model.
00:03:20.956 --> 00:03:24.696
And if you followed any recent journalism now,
00:03:25.056 --> 00:03:30.636
you're hearing everybody talk about the Swiss cheese model, a conceptual framework
00:03:30.636 --> 00:03:36.076
for the description of accidents based upon the notion that accidents will happen
00:03:36.076 --> 00:03:38.456
only if multiple barriers fail.
00:03:38.456 --> 00:03:41.376
Thus creating a path from an
00:03:41.376 --> 00:03:47.656
initiating cause all the way to the ultimate and unwanted consequence such as
00:03:47.656 --> 00:03:52.716
harm to people or assets or the environment. Reason also described the first
00:03:52.716 --> 00:03:58.756
fully developed theory of a just culture in his 1997 book, Managing the Risks of
00:03:58.756 --> 00:04:00.076
Organizational Accidents.
00:04:00.736 --> 00:04:10.876
We'll miss Jim a lot because Jim always started out, as David Woods told me, fighting the zombies.
00:04:11.276 --> 00:04:15.816
And there are more zombies now than probably ever before.
00:04:16.236 --> 00:04:23.876
But the path he cut, the swath that he laid, made it possible for the rest of
00:04:23.876 --> 00:04:28.736
us to do the work and have the ideas and think the things that we think.
00:04:29.056 --> 00:04:32.136
He's made the world a better place.
00:04:32.536 --> 00:04:38.116
And I just happened to be talking to David Woods from the Ohio State University,
00:04:38.576 --> 00:04:41.036
the Center for Resilience Engineering.
00:04:41.516 --> 00:04:49.356
And I asked David what legacy Jim Reason would leave the world.
00:04:49.696 --> 00:04:56.656
And this is his response. Listen carefully, because he told us this just for us.
00:04:57.096 --> 00:05:00.976
So this is David Woods' tribute to Jim Reason.
00:05:00.240 --> 00:05:06.640
Music.
00:05:06.936 --> 00:05:11.716
So I think there are many, many things.
00:05:12.676 --> 00:05:21.036
One is that he was part of a group who created modern safety.
00:05:21.036 --> 00:05:27.316
Not in any specific thing they did, but in the enterprise, the line of inquiry they started.
00:05:27.576 --> 00:05:37.236
It started in 1980, and I got to join directly, intimately with it in 1983 at the Bellagio meeting.
00:05:37.656 --> 00:05:40.036
Not Las Vegas, the real Bellagio.
00:05:40.996 --> 00:05:46.596
And I should show you a picture of Eric and I there when we were young and handsome.
00:05:47.116 --> 00:05:49.596
And I could talk all day while running.
00:05:51.576 --> 00:05:56.696
And what would happen in those days is we'd have this elegant,
00:05:57.036 --> 00:06:03.416
multi-course European dinner at the villa, and then I'd go, great,
00:06:03.516 --> 00:06:05.896
that was a wonderful appetizer. I need some food.
00:06:05.996 --> 00:06:08.896
Let's go down to the town and eat dinner now. Yeah.
00:06:10.446 --> 00:06:14.586
But anyway, it was an enterprise that was started. Richard Cook and I called
00:06:14.586 --> 00:06:16.466
it the New Look behind human error.
00:06:17.766 --> 00:06:21.886
Later, you know, in many ways it got called Safety-II by Eric.
00:06:22.226 --> 00:06:27.226
Again, trying to move people forward, but also running the risk of it being
00:06:27.226 --> 00:06:29.146
as a compromise breaking down.
00:06:29.706 --> 00:06:33.986
But the momentum built and grew and more stuff happened.
00:06:34.146 --> 00:06:37.686
And yeah, not as concentrated and following a single path.
00:06:37.686 --> 00:06:45.606
But often it was multi-path discovery about the nature of surprise,
00:06:46.126 --> 00:06:47.726
how accidents weren't impossible.
00:06:47.726 --> 00:06:49.706
They were normal because of the
00:06:49.706 --> 00:06:54.746
complexities. All of them highlighting fundamental points underneath it.
00:06:54.846 --> 00:07:02.126
We now see the nature of their fundamentalness more clearly because of the progress that went on.
00:07:02.866 --> 00:07:06.746
So Jim was central in the way he worked, the way he worked with people,
00:07:06.966 --> 00:07:11.466
not because we were always right or he was always right, but because we produced
00:07:11.466 --> 00:07:16.866
the energy and persistence in the face of all the pressures to compromise,
00:07:17.186 --> 00:07:20.626
all the things saying, it's just a human's error.
00:07:20.966 --> 00:07:25.786
We don't have to change the system. We don't have to do things differently.
00:07:26.446 --> 00:07:30.646
We can ignore the systemic factors that matter.
00:07:31.726 --> 00:07:36.206
And so we've built up cases, success
00:07:36.206 --> 00:07:39.106
stories, techniques. Now, we
00:07:39.106 --> 00:07:43.766
haven't organized them. We've talked about a field guide, because we've made so
00:07:43.766 --> 00:07:49.406
much progress from people like Jim. So that's the first point: he and others.
00:07:49.406 --> 00:07:53.886
And I've gone back looking at the pictures from '83, looking for... somewhere there's
00:07:53.886 --> 00:07:58.106
a picture of the 1980 meeting, which was a clambake on a beach in Maine.
00:07:59.086 --> 00:08:05.546
And the people who initiated this line of inquiry, you know,
00:08:05.606 --> 00:08:09.726
it is funny, sometimes in these things, there really is an origin point,
00:08:09.846 --> 00:08:11.306
despite multiple influences.
00:08:11.606 --> 00:08:15.726
And this was the Clambake meeting in 1980 that started things going,
00:08:15.726 --> 00:08:20.926
you know, just like resilience engineering started because NASA had accidents
00:08:20.926 --> 00:08:25.486
in 1999, and they didn't rationalize it away as human error.
00:08:25.486 --> 00:08:29.406
They said it was a systemic factor of pressure from the faster,
00:08:29.586 --> 00:08:35.546
better, cheaper pressure led, you know, successful managers to cut what they didn't need
00:08:35.546 --> 00:08:38.166
to be efficient, which turned out,
00:08:38.406 --> 00:08:41.086
as Eric put it later, not to be thorough.
00:08:42.126 --> 00:08:43.606
Right? What's not to be thorough?
00:08:44.506 --> 00:08:49.366
Shortcuts, right? The shortcuts look good, like, and it turned out they were
00:08:49.366 --> 00:08:53.166
undercutting important things that contributed, right?
00:08:53.286 --> 00:08:57.686
And how did they contribute? You see, this is the second thing. This is what Jim pointed
00:08:57.686 --> 00:09:03.206
out in the mid to late 80s, and it first appeared in the 1990 book, in the
00:09:03.206 --> 00:09:04.966
last or second to last chapter.
00:09:06.377 --> 00:09:13.577
Which is easily summarized as, right, accidents happen due to multiple contributors,
00:09:13.877 --> 00:09:16.257
each necessary but only jointly sufficient.
00:09:16.597 --> 00:09:22.977
And many of those contributors were present in the system for a much longer
00:09:22.977 --> 00:09:31.157
period of time prior to the trigger and the immediate sequence of events that led to the accident.
00:09:31.357 --> 00:09:36.677
And you can see that, for example, in the BP accidents of Deepwater Horizon
00:09:36.677 --> 00:09:39.717
and in Texas City. You can see that.
00:09:39.837 --> 00:09:45.857
Jim saw it in the Herald of Free Enterprise ferry disaster in Zeebrugge Harbor.
00:09:47.097 --> 00:09:51.997
And many people have gone on, again, as we try to use cartoons,
00:09:52.397 --> 00:09:55.317
Swiss cheese, simple metaphors, right?
00:09:55.477 --> 00:09:59.917
Multiple lines of defense still have holes in them. There are gaps to close.
00:10:00.857 --> 00:10:04.237
And misunderstanding the ways that gaps get closed.
00:10:04.657 --> 00:10:07.817
And if you don't understand how gaps are getting closed locally,
00:10:08.097 --> 00:10:13.137
it may turn out that closing one gap exposes you to others.
00:10:13.417 --> 00:10:19.637
Or closing one gap means the system doesn't fail or give the evidence of potential
00:10:19.637 --> 00:10:23.057
failure. So everybody thinks the system's working when it's full of holes.
00:10:24.257 --> 00:10:28.937
Richard and I wrote the gaps paper in 2000. And, you know, again,
00:10:29.197 --> 00:10:34.437
trying to highlight the key point, you know, that arose
00:10:34.437 --> 00:10:37.217
in the late 80s due to Jim's and others' work.
00:10:38.101 --> 00:10:43.601
And guess where we are now? We now have theorems about this that say the messiness
00:10:43.601 --> 00:10:45.941
is inherent in this universe.
00:10:46.181 --> 00:10:52.021
It will have gaps. No matter the successes you have, no matter the advances
00:10:52.021 --> 00:10:55.281
you bring to bear, new gaps will emerge.
00:10:55.401 --> 00:10:58.121
Gaps will move around. Surprises will recur.
00:10:58.861 --> 00:11:02.561
Snafu is a natural part of this universe.
00:11:02.801 --> 00:11:08.441
And snafu catching is essential. Now, that can go on in local ad hoc ways,
00:11:08.461 --> 00:11:15.041
or that can be synchronized, coordinated, and empowered in our systems and organizations.
00:11:16.041 --> 00:11:20.301
If you leave it local and ad hoc, you're going to end up with systems that are
00:11:20.301 --> 00:11:25.521
competent but brittle, where these hidden local sources of adaptive capacity
00:11:25.521 --> 00:11:26.921
keep the system working.
00:11:26.921 --> 00:11:32.381
And so you think the system is more effective than it really is, right?
00:11:32.561 --> 00:11:37.761
Or competent and extensible, which provisions the system to be responsive in
00:11:37.761 --> 00:11:39.941
the face of new challenges and change.
00:11:40.281 --> 00:11:44.181
Why does this happen? We now have the basics for why this is.
00:11:44.301 --> 00:11:50.141
We can prove these things in the most powerful sense of the word prove, right?
00:11:50.601 --> 00:11:54.781
Finite resources are always the case. Sometimes we try to hide that,
00:11:54.921 --> 00:11:59.101
but in the end, finite resources always constrict what we do.
00:11:59.281 --> 00:12:00.721
It means there's trade-offs.
00:12:01.601 --> 00:12:07.781
Change never stops. It may seem quiescent for a bit, but the world will start again.
00:12:08.361 --> 00:12:10.381
Change will come forward.
00:12:11.141 --> 00:12:17.841
And three, others are there, too, who are adapting. And what are they adapting to? Our successes.
00:12:19.008 --> 00:12:25.928
In other words, one of the drivers of change, adaptation, and the need for extensibility is success.
00:12:26.388 --> 00:12:32.548
As you deploy new capabilities, right, to build competence, what do you do?
00:12:32.668 --> 00:12:34.888
You scale up systems. They grow.
00:12:35.528 --> 00:12:40.668
Interdependencies grow. Others adapt. And so, you know, this is something that
00:12:40.668 --> 00:12:42.528
you can look at in the two videos.
00:12:42.708 --> 00:12:48.188
One is called Growth and Complexification. And you can look at it in the example
00:12:48.188 --> 00:12:50.908
of the rise of high-frequency trading.
00:12:51.428 --> 00:12:57.328
And then what goes with that is a new one we just put out, which uses George
00:12:57.328 --> 00:13:01.848
Box's old line, all models are wrong, some models are useful.
00:13:01.848 --> 00:13:07.588
And I run through how people compromised his insights, his challenge, right?
00:13:08.148 --> 00:13:14.768
And they rationalized in a bunch of ways. They say, yes, models are limited
00:13:14.768 --> 00:13:20.788
but valuable, but the limits don't really matter now, or the limits only matter occasionally.
00:13:21.368 --> 00:13:24.668
What matters is how much more valuable it is than it used to be,
00:13:25.048 --> 00:13:27.228
how much more powerful it is.
00:13:27.388 --> 00:13:30.408
And the world's basically the same. And the answer is, no, it's not.
00:13:30.548 --> 00:13:33.408
You produce growth, and growth produces complexification.
00:13:34.108 --> 00:13:41.068
And these factors kick in, producing new kinds of surprises and new demands for extensibility.
00:13:41.248 --> 00:13:47.588
And so I end it with the new version of Box's original line.
00:13:48.068 --> 00:13:53.388
And I won't give away the punchline to the video. It's only 16 minutes long.
00:13:54.348 --> 00:13:57.248
And it's beautifully done. It's really a great video.
00:13:57.588 --> 00:14:01.788
What do you think James Reason's lasting legacy will be?
00:14:02.128 --> 00:14:07.308
And would he be happy with that? So first,
00:14:07.668 --> 00:14:15.948
I feel very confident that he rested easy in his last illnesses,
00:14:15.948 --> 00:14:18.528
that he had made a difference,
00:14:18.668 --> 00:14:25.168
that he had fought the battles to grow safety, to create safety,
00:14:25.348 --> 00:14:28.088
and that he had an impact,
00:14:28.348 --> 00:14:30.428
that he had made a difference.
00:14:30.428 --> 00:14:35.528
And over his, the life that began,
00:14:35.828 --> 00:14:39.288
I mean, because it was a complete change in the direction of his research and
00:14:39.288 --> 00:14:46.048
activity that began in 1980 and continuing in 83 when I first met him and on,
00:14:46.728 --> 00:14:52.948
that that period of his life really made a positive difference in healthcare,
00:14:53.508 --> 00:14:55.568
oil and gas, and other places.
00:14:57.068 --> 00:15:05.548
On the other hand, remember, the battles don't stop. We have to refight them.
00:15:06.448 --> 00:15:12.748
The human tendencies to oversimplify, to linearize, to defer,
00:15:13.348 --> 00:15:17.028
the pressures now that are real.
00:15:18.387 --> 00:15:21.747
To be effective and efficient right now in order to be competitive,
00:15:22.127 --> 00:15:28.107
in order to meet expectations of other roles and parties who have power and authority.
00:15:29.007 --> 00:15:33.947
All of these tend to simplify how this world really works.
00:15:34.827 --> 00:15:41.327
And so I think the difference is he started a process that hopefully fought
00:15:41.327 --> 00:15:45.787
successfully but did not end the need to fight on.
00:15:45.787 --> 00:15:52.067
And that we now have reached a new phase in honor of his and others' legacy
00:15:52.067 --> 00:15:58.167
in that journey, which is we now have solid foundations, right?
00:15:58.347 --> 00:16:04.727
And we can challenge, but that requires a cadre of people who understand what's fundamental.
00:16:04.727 --> 00:16:09.827
And in that, they can see a range of compromise around how to use those fundamentals
00:16:09.827 --> 00:16:15.007
now and how to use them in different contexts, given various pressures and constraints,
00:16:15.587 --> 00:16:19.427
without compromising on the fundamental, right?
00:16:19.587 --> 00:16:23.587
You can't compromise on the fundamental. You can compromise on how you bring
00:16:23.587 --> 00:16:26.407
the fundamental point to bear.
00:16:28.147 --> 00:16:29.467
That's the difference.
00:16:34.107 --> 00:16:37.027
So that's the impression that David Woods has,
00:16:37.027 --> 00:16:45.147
and really, since I asked him, that's what he offered us. And I can't help but think that the
00:16:45.147 --> 00:16:49.267
most important tribute you can have from one of your peers is the realization
00:16:49.267 --> 00:16:54.747
that your life's work has made a difference. I mean, I wish that for all of us, every single one of us.
00:16:54.847 --> 00:16:57.747
I hope we all feel that way. That's remarkable.
00:16:58.307 --> 00:17:06.267
Take some time, think about all the work you do, and then think about how important
00:17:06.267 --> 00:17:09.887
James Reason was to that work.
00:17:10.307 --> 00:17:16.547
And do me a favor. This is kind of a special favor, but at some point in the
00:17:16.547 --> 00:17:20.707
next week or two, have a sandwich that has Swiss cheese on it.
00:17:21.167 --> 00:17:25.207
I can think of no better tribute at all than that.
00:17:25.767 --> 00:17:30.627
Because James Reason introduced the idea of Swiss cheese and then throughout
00:17:30.627 --> 00:17:33.567
his career lived with that idea of Swiss cheese.
00:17:34.047 --> 00:17:40.307
And it's an important model for introducing the way we think to the world.
00:17:40.607 --> 00:17:43.447
It's a compromise for sure, as Dave Woods would say.
00:17:43.867 --> 00:17:50.487
But what better tribute can you have than to enjoy a sandwich with Swiss cheese?
00:17:51.147 --> 00:17:56.047
Thank you, James Reason. You made a huge difference, and I appreciate you greatly.
00:17:56.627 --> 00:18:00.527
Learn something new every single day. Have as much fun as you possibly can.
00:18:00.587 --> 00:18:03.727
Be good to each other. Be kind to each other. Check in on one another.
00:18:04.087 --> 00:18:08.047
And for goodness sakes, be safe. Thanks, Uncle Jim.
00:18:08.720 --> 00:18:20.053
Music.