
PAPod 522 - Remix - Unveiling the Future of Safety and Resilience with Erik Hollnagel

PreAccident Investigation Podcast

The Pre Accident Podcast is an ongoing discussion of Human Performance, Systems Safety, & Safety Culture.

Show Notes

Welcome to an exciting episode of the Pre-Accident Podcast with Todd Conklin, where we dive into a thought-provoking conversation with the remarkable Erik Hollnagel. Known for his pioneering work on the Efficiency-Thoroughness Trade-Off (ETTO) principle and the Functional Resonance Analysis Method (FRAM), Erik shares insights into the evolving landscape of safety and resilience.



In this episode, Todd and Erik explore the future of system performance, questioning traditional safety paradigms and advocating for a holistic understanding of organizational functioning. Erik emphasizes the importance of perceiving the unseen and understanding the complexities of industries like healthcare, drawing on his vast experience and research.



Todd's engaging storytelling, including a personal anecdote about resilience in the face of a spilled Coke Zero, sets the stage for a compelling discussion. Erik's reflections on his journey, the impact of his work, and the need for a shift towards synthesis in safety thinking make this episode a must-listen for anyone interested in advancing reliable systems and organizational performance.



Join us for an enlightening conversation that challenges conventional safety narratives and offers a fresh perspective on managing complexity and change. Don't miss this chance to learn from one of the leading voices in the field of safety and resilience!


Show Transcript

WEBVTT

00:00:00.017 --> 00:00:03.377
Sit back and relax. It's a big podcast day. This, my friends,

00:00:03.717 --> 00:00:06.897
is the Erik Hollnagel Show. We'll be right back.

00:00:07.920 --> 00:00:19.120
Music.

00:00:19.097 --> 00:00:23.657
Howdy, everybody. It's Todd Conklin, and this is the Pre-Accident Podcast. How are you today?

00:00:24.257 --> 00:00:27.357
Good, I hope. Hey, it's a big show. This will be kind of exciting.

00:00:27.357 --> 00:00:34.677
If you've not heard of or ever been a part of a discussion around the ETTO principle,

00:00:34.917 --> 00:00:37.417
E-T-T-O, the Efficiency-Thoroughness Trade-Off,

00:00:37.877 --> 00:00:43.077
or Safety-I, Safety-II, or the FRAM, the Functional Resonance Analysis Method,

00:00:43.397 --> 00:00:47.297
or, gosh, I could just go on for months, months, I tell you.

00:00:47.737 --> 00:00:51.477
Then if you haven't ever had those kind of conversations, today's going to be

00:00:51.477 --> 00:00:55.517
a really exciting day for you because everything's going to be new and shiny

00:00:55.517 --> 00:00:57.157
like a penny found on the street.

00:00:57.597 --> 00:01:02.237
And if you have, you're probably even more excited because if you have had these

00:01:02.237 --> 00:01:06.937
conversations, today you're going to get to talk to somebody who is just quite amazing.

00:01:07.117 --> 00:01:12.977
Erik Hollnagel, and Erik is just a fine human being. Let's start there.

00:01:13.897 --> 00:01:18.037
And I didn't think I would get him on the podcast, but I kind of tricked him.

00:01:18.377 --> 00:01:23.157
I'm not saying I blackmailed him. I tricked him. It's more it's it's less severe

00:01:23.157 --> 00:01:26.957
than blackmail. But he had to introduce me at a conference a couple of weeks

00:01:26.957 --> 00:01:32.357
ago, and he asked me to write some things down for the introduction. And I said, I'd love to.

00:01:32.517 --> 00:01:35.137
I'll give them to you when we record a podcast.

00:01:35.757 --> 00:01:38.737
And he fell for it. I mean, he totally fell for it. Just hook,

00:01:38.837 --> 00:01:42.417
line, and sinker. Boom. I had him. So that is how that started,

00:01:42.477 --> 00:01:43.597
and we'll get to that in a minute.

00:01:43.997 --> 00:01:47.877
Let's bring you up on the speed of light in the world around us.

00:01:47.977 --> 00:01:51.557
It's, you know, it's an exciting day. I've

00:01:51.557 --> 00:01:54.437
had tremendous amounts of things go

00:01:54.437 --> 00:01:57.197
on and many things to talk to you about but probably

00:01:57.197 --> 00:02:00.097
the big one I should talk to you about is the power of

00:02:00.097 --> 00:02:02.777
resilience and the reason I'm going to tell you about the

00:02:02.777 --> 00:02:08.937
power of resilience is because I spilt not a Coke, not a Diet Coke, but in fact

00:02:08.937 --> 00:02:15.477
a Coke Zero Sugar, pretty much a full-blown Coke Zero Sugar. I spilled it from the

00:02:15.477 --> 00:02:20.257
little plastic tray on the back of the airline seat into my briefcase.

00:02:21.167 --> 00:02:25.407
And luckily for me, my briefcase was holding my computer.

00:02:25.587 --> 00:02:30.787
And luckily for me, my computer was standing upright. So like on the skinny

00:02:30.787 --> 00:02:33.447
end, it was standing up because, you know, it was in the briefcase.

00:02:33.767 --> 00:02:40.507
And even more luckily for me, the top of the case was not the keyboard edge

00:02:40.507 --> 00:02:47.647
where the clamshell closes, but the back hinge where the vents are for cooling the laptop.

00:02:47.647 --> 00:02:56.107
So when I did spill that Coke Zero sugar, it did travel into my briefcase and

00:02:56.107 --> 00:02:58.107
directly down into my computer.

00:02:58.387 --> 00:03:04.867
Yes, it's true. I lost it all. It was gone. It was a sad day for my computer.

00:03:05.547 --> 00:03:10.707
Oh, so sad. And it's kind of non-recoverable, although I did take it to the

00:03:10.707 --> 00:03:13.647
computer place because they said it's worth taking it to the computer place.

00:03:13.647 --> 00:03:19.287
But I'm pretty sure if the Coke can live through, I mean, well, I'm sure the Coke lived.

00:03:19.447 --> 00:03:28.127
If the computer lives through the Coke Zero Sugar bath, I will be amazed because not a lot.

00:03:28.367 --> 00:03:31.967
I mean, it's just one of those little, you know, it's an airplane.

00:03:31.967 --> 00:03:33.367
You've been on a million of them.

00:03:33.567 --> 00:03:36.767
One of those little plastic cups on that little tray.

00:03:36.987 --> 00:03:43.227
And see, you probably can just lean that tray out and casually work on it.

00:03:43.227 --> 00:03:47.087
For me, I have to kind of suck in a little to bring the tray down,

00:03:47.087 --> 00:03:52.047
and I moved around, and it turned the tray, and it went up, and you can write the ending.

00:03:52.247 --> 00:03:55.407
I kind of wish that the Coke would have gone in the magazine pocket,

00:03:55.407 --> 00:03:59.547
because I don't really own the magazine pocket, nor do I care much about it.

00:03:59.907 --> 00:04:02.427
But alas, it went into my briefcase.

00:04:03.280 --> 00:04:09.280
But there's good news. Yes, I have good news for you. And the good news is, I back it up.

00:04:09.660 --> 00:04:16.300
And that is the power of resilience. I can't control when I'm going to spill a Coke in my briefcase.

00:04:16.580 --> 00:04:22.000
Clearly, I think I've proven beyond a shadow of a doubt that that is something I clearly cannot control.

00:04:22.000 --> 00:04:24.840
But what I can control is my

00:04:24.840 --> 00:04:28.460
response to a coke being spilled in

00:04:28.460 --> 00:04:31.680
my briefcase and my response was that I

00:04:31.680 --> 00:04:36.720
had a backup. And I had to get a new computer, you knew that was coming, and I

00:04:36.720 --> 00:04:41.920
plugged it into the little thing, and it said, do you want to start new or do

00:04:41.920 --> 00:04:46.680
you want to copy off your old computer? And I clicked copy off your old computer,

00:04:46.680 --> 00:04:50.240
and 45 minutes later, probably not even that long. I probably made that up.

00:04:50.640 --> 00:04:54.800
A little period of time, let's say 45 minutes, that's kind of an estimate later,

00:04:55.160 --> 00:04:56.800
it looked just like my last computer.

00:04:56.960 --> 00:04:59.680
It was just, you know, a little bit.

00:05:00.060 --> 00:05:05.220
I lost some notes that I'd taken, and I lost an article I brought down about

00:05:05.220 --> 00:05:07.400
the school bus driver in Chattanooga.

00:05:07.540 --> 00:05:11.120
But I think I can still find it, because I was really interested in that school

00:05:11.120 --> 00:05:15.300
bus driver, because they're saying he was on the phone, but he's saying he was

00:05:15.300 --> 00:05:17.640
not on the phone. There was another truck involved.

00:05:17.820 --> 00:05:22.280
So the stories are now starting to build around culpability.

00:05:22.560 --> 00:05:25.520
Well, it's already, I would say, built around culpability. But now it's starting

00:05:25.520 --> 00:05:28.940
to move into sort of lawyer speak. And so that's interesting as well.

00:05:29.080 --> 00:05:31.660
That is my giant story. That took a long time to tell.

00:05:32.620 --> 00:05:37.920
But the good news is, the reason I'm not so upset about it, and I'm really not,

00:05:37.920 --> 00:05:40.040
it's just a computer, right?

00:05:40.460 --> 00:05:44.960
is, I think, because it was backed up. And that really is that emphasis around

00:05:44.960 --> 00:05:48.620
the notion of resilience that I think makes the biggest difference in the world.

00:05:48.620 --> 00:05:51.180
So that is kind of what we're going to talk about. Now,

00:05:51.300 --> 00:05:54.500
to get started on this podcast I'm going to jump right in because I've already

00:05:54.500 --> 00:05:58.520
probably taken too much of your time telling that stupid story. Erik and I sat down,

00:05:59.301 --> 00:06:06.121
And the first thing I ask him is what's the future look like,

00:06:06.141 --> 00:06:10.521
which is normally a question I kind of saved till the end, but I was really

00:06:10.521 --> 00:06:14.261
curious to see what he had to say. So let's jump into this podcast.

00:06:14.521 --> 00:06:18.501
This is Erik Hollnagel and Todd Conklin on the Pre-Accident Podcast.

00:06:18.941 --> 00:06:23.061
Listen carefully because the first thing he's going to say is kind of what he's

00:06:23.061 --> 00:06:24.381
thinking about is happening next.

00:06:29.301 --> 00:06:35.101
Over the next three or four years, I don't think we'll see much real difference.

00:06:35.321 --> 00:06:36.481
We'll see growing interest.

00:06:36.721 --> 00:06:40.201
But I mean, if you see it playing out in the sense of having an impact,

00:06:40.801 --> 00:06:45.061
you're talking 10, 15 years or even more.

00:06:45.581 --> 00:06:47.901
Look at the paradigms that we have now.

00:06:48.621 --> 00:06:56.601
I don't know where it will be going. I know what I'm interested in is to actually

00:06:56.601 --> 00:07:02.661
try and get rid of the word safety and the term safety and to look at system performance.

00:07:03.181 --> 00:07:07.601
Safety is an aspect of system performance, but I'm interested in how systems

00:07:07.601 --> 00:07:14.541
and organizations function, and to understand that functioning and to be able to

00:07:14.541 --> 00:07:15.921
manage that functioning.

00:07:16.581 --> 00:07:22.881
And safety in the classical sense is a part of that, but so is quality, so is sustainability,

00:07:23.421 --> 00:07:30.521
so is customer satisfaction, so is efficiency, so is environmental impact,

00:07:30.521 --> 00:07:33.421
or, I mean, there's so many things that are part of that.

00:07:33.581 --> 00:07:39.861
And really, I want to see them together and want to understand how they work together.

00:07:40.081 --> 00:07:43.361
But isn't that close to sort of blasphemous? I mean, does that,

00:07:43.521 --> 00:07:46.961
do you get pushback when you talk about that outside of the academy?

00:07:48.210 --> 00:07:55.030
Actually, I don't talk much about it in the academy. I'm not really an academic anymore.

00:07:55.570 --> 00:08:01.670
But no, in fact, people are very interested in that because I think you must

00:08:01.670 --> 00:08:02.730
have seen that yourself,

00:08:02.990 --> 00:08:09.750
that people feel that they are stuck in silos and they treat safety as if you

00:08:09.750 --> 00:08:13.550
could treat it isolated from whatever else happens in an organization.

00:08:13.610 --> 00:08:15.750
And we all know you can't.

00:08:16.470 --> 00:08:23.790
It's just that we don't have the concepts and the terminology and the ideas

00:08:23.790 --> 00:08:26.130
really to bind it all together.

00:08:26.330 --> 00:08:32.870
And I think that's, for me, what's interesting and what I'm trying in my own

00:08:32.870 --> 00:08:35.450
feeble way to work a little at.

00:08:35.550 --> 00:08:40.270
But I don't know whether that's where it's going. I do see that in healthcare we have

00:08:40.270 --> 00:08:45.630
had, over the last conferences in resilient healthcare, a number of papers that

00:08:45.630 --> 00:08:49.870
sort of tried to dissolve the difference between quality and safety.

00:08:50.530 --> 00:08:52.810
And I think that's certainly the right way to go.

00:08:53.810 --> 00:08:56.970
Tell us about your journey. How did you get to where you are?

00:08:57.110 --> 00:08:59.690
What led you to think about these ideas?

00:09:01.110 --> 00:09:07.290
That's a question I find very difficult to answer because the honest answer is it just happened.

00:09:07.290 --> 00:09:12.710
I mean, it wasn't a journey as in retrospect, yes, you can see I have sailed

00:09:12.710 --> 00:09:17.590
these waves and then changed course here and there, but it wasn't pre-planned

00:09:17.590 --> 00:09:20.450
in any way. It just happened.

00:09:21.450 --> 00:09:25.410
What started your interest? I mean, what made it start to happen?

00:09:25.730 --> 00:09:29.150
Was there an accident? Did somebody include you in a failure?

00:09:29.430 --> 00:09:32.410
I mean, where'd this kind of take its origin? Yeah.

00:09:33.726 --> 00:09:41.546
Hard to say. I mean, of course, I was working with nuclear control room design

00:09:41.546 --> 00:09:44.686
and human-machine systems when Three Mile Island happened,

00:09:44.846 --> 00:09:48.746
of course, and that focused everybody's interest in a certain direction.

00:09:49.486 --> 00:09:55.986
And as a psychologist, I was always a bit skeptical about the idea of human

00:09:55.986 --> 00:10:03.306
error as a separate cause and a special cause and looking at things that go wrong all the time.

00:10:04.046 --> 00:10:10.366
So, I think it's more or less that by working with the issues and by talking

00:10:10.366 --> 00:10:16.386
with, discussing with colleagues, good colleagues, I mean, one of the people I met in,

00:10:16.606 --> 00:10:19.326
I think before TMI in 78,

00:10:20.006 --> 00:10:24.786
was David Woods and he was a young brash boy at the time and we had a lot of

00:10:24.786 --> 00:10:27.206
interesting discussions and I think talking with David,

00:10:27.426 --> 00:10:32.586
talking with other people and And hearing that and sort of when you sit down and talk,

00:10:32.746 --> 00:10:37.486
you say, well, what about this and what about that and could it be and it gradually

00:10:37.486 --> 00:10:42.646
starts to develop and you think that you find that some things are interesting,

00:10:42.846 --> 00:10:47.046
other things aren't and you follow that and it's a passage.

00:10:47.326 --> 00:10:51.306
I mean, it ended here. It could have ended anywhere else. Yeah.

00:10:52.019 --> 00:10:57.379
So it's just serendipity, really. So you could be like a really famous author

00:10:57.379 --> 00:11:03.139
around baking cakes, if cakes would have been the serendipitous outcome of this outcome.

00:11:03.719 --> 00:11:10.779
Probably. I like making cakes, actually. I haven't created any new cakes on

00:11:10.779 --> 00:11:14.019
my own, but I like to bake cakes and pastry.

00:11:14.259 --> 00:11:21.519
But, yeah, I mean, listen, if you ask me 10 years ago or even certainly 20 years

00:11:21.519 --> 00:11:25.299
ago, where would I be today? I would never have said I would be sitting here

00:11:25.299 --> 00:11:27.899
and talking to you. I wouldn't even have known who you were.

00:11:28.479 --> 00:11:32.379
And I wouldn't have known this world. And it just happened. I mean,

00:11:32.499 --> 00:11:34.079
it's not planned in any sense.

00:11:34.399 --> 00:11:37.599
And yet you've had this remarkable impact. I mean, and you know this,

00:11:37.739 --> 00:11:43.379
although you may not know, but the impact you've had has been very powerful.

00:11:43.999 --> 00:11:50.479
In really starting a conversation, at least in industry, around the trade-offs

00:11:50.479 --> 00:11:55.559
between being efficient and being thorough, being safe and being productive.

00:11:55.919 --> 00:12:00.119
And what's so interesting about that is I think that's always existed,

00:12:00.119 --> 00:12:06.399
but I think you brought it to the conversation in a way that allowed both operations

00:12:06.399 --> 00:12:11.799
people and academics to come together and see a ray of light.

00:12:11.799 --> 00:12:20.279
That notion of that trade-off has, I think, really changed the way we look at reliable systems.

00:12:20.759 --> 00:12:22.519
And that's powerful.

00:12:23.359 --> 00:12:27.679
What started all that? I mean, was that a bolt of lightning one day?

00:12:29.119 --> 00:12:34.899
Well, it's the same answer as before. I don't really know. I should say what

00:12:34.899 --> 00:12:42.299
I'm probably trying to do all the time is trying to make sense out of things for myself.

00:12:43.539 --> 00:12:46.839
And so then I think I understand what's going on.

00:12:47.139 --> 00:12:51.079
And this idea about the trade-offs is, of course, very old in psychology.

00:12:51.079 --> 00:12:56.859
And it just turned out to me that this seems like a useful thing.

00:12:56.859 --> 00:13:00.919
And I noticed that I work with people, work with industries.

00:13:00.919 --> 00:13:08.319
And it's just like you get an idea and you begin to look for it and you suddenly see it everywhere.

00:13:08.519 --> 00:13:11.479
And that's where you have to be a bit cautious, because you shouldn't let it take over.

00:13:11.759 --> 00:13:16.099
Because it is just an idea. It's not a psychological mechanism.

00:13:16.099 --> 00:13:18.719
It's just a convenient way of describing things.

00:13:18.899 --> 00:13:24.499
And any convenient way of describing things will capture something that's essential

00:13:24.499 --> 00:13:28.639
and throw away things that also potentially could be essential.

00:13:28.639 --> 00:13:33.659
And that's why you have to be very careful all the time about how you describe things.

00:13:34.459 --> 00:13:39.579
But again, don't ask me how I came to think about that.

00:13:39.719 --> 00:13:44.099
It just happened someday for some reason.

00:13:44.299 --> 00:13:48.059
I may have read something somewhere or heard something. I don't know.

00:13:48.787 --> 00:13:54.867
That, I think, is the amazing part of it is that I think you hit on it perfectly.

00:13:54.947 --> 00:13:56.767
It's the way you see the world.

00:13:57.127 --> 00:14:02.747
And I think what you bring to the table or to the discussion is a very,

00:14:02.927 --> 00:14:04.787
very interesting worldview.

00:14:05.087 --> 00:14:10.307
And that's, I mean, it's really apparent when you read your books,

00:14:10.487 --> 00:14:11.547
the writing you've done.

00:14:11.547 --> 00:14:17.267
I mean, the remarkable way you really try to look at complexity and the way

00:14:17.267 --> 00:14:20.787
you look at sort of causation models, it's been really powerful.

00:14:21.067 --> 00:14:27.867
And really, you're kind of maybe a one-man campaign to sort of expand the world's

00:14:27.867 --> 00:14:28.947
understanding of cause.

00:14:29.087 --> 00:14:34.727
Has that served you well or has that been a rough road to hoe?

00:14:37.167 --> 00:14:41.067
I don't feel I'm on a campaign to do anything.

00:14:41.147 --> 00:14:49.147
Sometimes I do feel I'm a one man, not quite alone because others think and

00:14:49.147 --> 00:14:54.767
act in the same way, but I'm not really trying to make a campaign for anything.

00:14:54.907 --> 00:15:00.327
As I said, and this is really honest, I really try to understand things myself,

00:15:00.487 --> 00:15:04.147
to make sense of things myself and the world,

00:15:04.327 --> 00:15:11.167
whatever happens in the world, and that needs sometimes concepts and ideas that

00:15:11.167 --> 00:15:14.427
I can't find, and then I stumble upon them,

00:15:14.587 --> 00:15:19.527
or we talked about, apropos, podcasts and reading,

00:15:19.727 --> 00:15:23.727
and I mean, I've read a lot, I still read a lot, and I think a lot of what I've

00:15:23.727 --> 00:15:28.527
read over the years, it's there in the back of my mind, and the mind works in

00:15:28.527 --> 00:15:33.507
a weird and wonderful way, and you read something else and something clicks and say,

00:15:33.687 --> 00:15:38.627
isn't that a bit like it reminds you of and then something pops up.

00:15:38.687 --> 00:15:40.647
You can't really explain it. It just happens.

00:15:41.107 --> 00:15:47.147
But I think you can, I wouldn't even say I prepare it, but I have a pleasure

00:15:47.147 --> 00:15:52.807
in reading many different things, many different sciences of books of all kinds.

00:15:53.087 --> 00:15:57.287
And I think there are sort of nuggets there that, I mean, one thing,

00:15:57.447 --> 00:16:03.887
the term I've used and many others have used is what you look for is what you find.

00:16:04.107 --> 00:16:07.527
But sometimes what you look for is not what you're conscious you look for,

00:16:07.607 --> 00:16:13.327
but you still look for it because the brain has this ability to relate what

00:16:13.327 --> 00:16:17.787
you perceive to what's in there, even though you can't remember what's in there.

00:16:18.167 --> 00:16:24.267
And then sometimes a little signal pops up and you work a bit on it and it turns

00:16:24.267 --> 00:16:26.847
out to be useful and that's how it works.

00:16:26.967 --> 00:16:32.187
But I mean, it's a mystery to me. If you were to advise people...

00:16:33.529 --> 00:16:39.089
The books that you've written to read, what do you think you'd have them read?

00:16:39.269 --> 00:16:47.349
What are the books that you think you're most proud of, you're most effective with?

00:16:48.149 --> 00:16:51.069
That's a good question, I know. That's an answer to a question.

00:16:51.889 --> 00:16:56.989
Well, you mentioned the ETTO principle, and I think that I'm pretty proud of that.

00:16:57.109 --> 00:17:03.249
I think that was a neat way of formulating an idea that had been existing for a long time.

00:17:03.529 --> 00:17:09.449
Others have thought of before, but being fortunate to be able to come up with

00:17:09.449 --> 00:17:14.309
examples to illustrate that, I think that's very important when you write something

00:17:14.309 --> 00:17:15.549
to be able to illustrate it.

00:17:15.629 --> 00:17:21.049
So I think the ETTO principle is something I feel is a nice piece of work.

00:17:23.409 --> 00:17:28.349
And I mean, the two other ones that are really, I don't want to say I'm proud

00:17:28.349 --> 00:17:35.549
of them, but that I really think are decent works, are FRAM and Safety-I and Safety-II.

00:17:36.049 --> 00:17:40.429
I think FRAM in particular. Safety-I and Safety-II is not that original because

00:17:40.429 --> 00:17:44.949
this idea that you can look at things in different ways has been around for a long time.

00:17:45.309 --> 00:17:49.729
And the ideas of FRAM, of course, have also been around, but not sort of put

00:17:49.729 --> 00:17:51.549
together in a single place before.

00:17:51.809 --> 00:17:56.749
And I think that is perhaps a bit original, but I'm not sure how much.

00:17:58.209 --> 00:18:02.949
And FRAM is something I am still working on and still developing, and it is still being extended.

00:18:03.369 --> 00:18:08.729
I think your framework, at least for my career at the National Laboratories

00:18:08.729 --> 00:18:14.429
in the United States, that book changed the way we really thought about learning

00:18:14.429 --> 00:18:16.509
technically across the board.

00:18:16.709 --> 00:18:20.209
It was a powerful tool for us.

00:18:20.269 --> 00:18:23.629
And the timing seemed really right. And it really was,

00:18:23.909 --> 00:18:33.309
as odd as this sounds, at the time it was somewhat controversial to talk about

00:18:33.309 --> 00:18:36.829
cause other than sort of monocausal phenomena.

00:18:37.269 --> 00:18:44.029
And that surprises me. But maybe it surprises me just because of work you've

00:18:44.029 --> 00:18:45.409
done and others like you.

00:18:45.569 --> 00:18:48.849
We've sort of moved beyond the fallacy of monocausality.

00:18:49.029 --> 00:18:52.829
But it's a powerful force. And you see it in industry. I mean,

00:18:53.229 --> 00:18:55.269
that's what we try to tackle.

00:18:55.949 --> 00:19:03.429
What advice would you give a person who's managing work, who's actually out

00:19:03.429 --> 00:19:07.369
in the field doing this new work?

00:19:07.669 --> 00:19:12.749
What would you tell them to think about or to listen to or to read?

00:19:13.994 --> 00:19:20.394
Well, can I just come back to one thing you said? Absolutely. The word timing.

00:19:21.194 --> 00:19:26.954
I mean, because I think maybe I've just been lucky I've come up with some ideas

00:19:26.954 --> 00:19:30.334
and the environment has been receptive to them.

00:19:30.454 --> 00:19:34.534
Others have come up with the same ideas at a time when the environment was not receptive to them.

00:19:34.634 --> 00:19:39.974
So it's very much about timing, sort of having the right thought at the right time.

00:19:40.254 --> 00:19:43.354
And you can't control that. That's just sheer luck.

00:19:44.434 --> 00:19:51.054
It happens. Well, if I may, I would also suggest that the way you package the

00:19:51.054 --> 00:19:57.494
ideas and the way you create analogies and discussions and case studies and examples,

00:19:57.694 --> 00:20:03.574
I think actually makes these rather esoteric concepts more impactful.

00:20:03.734 --> 00:20:08.414
I think that's one thing you bring to the field is a remarkable way to look

00:20:08.414 --> 00:20:14.734
at pretty woolly ideas and make them have realistic applications.

00:20:15.314 --> 00:20:18.994
Well, I'm flattered that you say so.

00:20:19.274 --> 00:20:24.434
As I said, I'm really trying to explain it to myself, to make it understandable to myself.

00:20:25.194 --> 00:20:29.994
And, of course, by doing that, you can hopefully also make it understandable to others.

00:20:31.474 --> 00:20:34.754
I've always admired people who write, Jim Reason in particular,

00:20:34.814 --> 00:20:38.994
who writes really, really well, and I still admire his writing.

00:20:39.914 --> 00:20:42.454
And there are a lot of other people who don't write so well.

00:20:42.554 --> 00:20:48.794
I shan't mention the names here, but I think I've just been lucky that I have

00:20:48.794 --> 00:20:52.674
the ability to write in an understandable way.

00:20:53.714 --> 00:20:56.494
I don't know how it happened, but that's how it is.

00:20:56.494 --> 00:21:00.234
But to come back to your question, there is

00:21:00.234 --> 00:21:11.254
a famous samurai in Japan called Miyamoto Musashi, in the 17th century, who was

00:21:11.254 --> 00:21:20.194
sort of the undisputed master and ended his life peacefully as a Zen philosopher, a Zen monk.

00:21:20.194 --> 00:21:25.354
And he wrote a book which has been translated into English, it's quite famous,

00:21:25.474 --> 00:21:30.414
it's called The Book of Five Rings, and it's been used on Wall Street as sort

00:21:30.414 --> 00:21:31.934
of a management Bible also.

00:21:31.934 --> 00:21:40.034
But he made some observations about what you need to do to be able to be successful

00:21:40.034 --> 00:21:44.474
in his case in the sword fight, but in general also.

00:21:44.614 --> 00:21:50.434
And one of the pieces of advice he gives is to perceive the things that cannot be seen.

00:21:51.444 --> 00:21:54.204
And I think that's what I think a good manager should do.

00:21:54.464 --> 00:21:57.924
He should perceive the things that cannot be seen, that is the things that are

00:21:57.924 --> 00:22:03.784
not obvious or do not attract attention by themselves, but which are still there,

00:22:03.944 --> 00:22:07.624
which is sort of hidden in the background because they can be very, very important.

00:22:07.804 --> 00:22:09.904
And I think that's a skill you have to learn.

00:22:10.524 --> 00:22:15.764
And again, to come back to the ETTO, that's an efficiency-thoroughness trade-off

00:22:15.764 --> 00:22:21.004
because you have to spend some time looking at things that seemingly are unimportant,

00:22:21.004 --> 00:22:24.284
but which actually, in many cases, turn out to be very important.

00:22:24.404 --> 00:22:31.164
So in the long run, the efficiency is not reduced by spending time on that.

00:22:31.304 --> 00:22:37.564
But in the short run, the efficiency is reduced because you spend time on being more thorough.

00:22:37.724 --> 00:22:41.604
But in the long run, it actually improves efficiency to do that.

00:22:44.564 --> 00:22:47.704
And I don't like giving advice to people about what they should do.

00:22:48.304 --> 00:22:55.764
But to me, that advice of Musashi's, to perceive the things that cannot be seen, is really important.

00:22:56.124 --> 00:23:01.624
And sort of looking at the things that often stare you in the eye and are so

00:23:01.624 --> 00:23:02.844
obvious that you miss them.

00:23:03.164 --> 00:23:06.424
And you have to sort of question yourself all the time and say,

00:23:06.424 --> 00:23:09.364
what am I really doing here? What am I looking at?

00:23:10.184 --> 00:23:12.324
That's phenomenal. What's your next book going to be about?

00:23:13.275 --> 00:23:16.675
I don't have any firm plans for a next book.

00:23:16.835 --> 00:23:19.795
It may happen, may not happen.

00:23:19.935 --> 00:23:25.155
Well, I mean, yeah, there is a series of books we've been doing on resilient

00:23:25.155 --> 00:23:28.215
healthcare, and that seems to be going on.

00:23:28.355 --> 00:23:34.215
I just completed editing the fourth book, and we are in the process of working on the fifth book.

00:23:34.335 --> 00:23:39.555
But that's sort of a collective effort.

00:23:39.555 --> 00:23:43.875
And perhaps you're asking whether I would do a monograph myself,

00:23:44.115 --> 00:23:53.835
and if time permits and if I can find a way of doing it, maybe what we talked about in the beginning,

00:23:54.175 --> 00:23:57.275
the development of Safety 2,

00:23:57.555 --> 00:24:05.075
sort of getting towards an understanding of how an organization works which

00:24:05.075 --> 00:24:08.815
is not tied up to specific issues or criteria,

00:24:08.975 --> 00:24:14.715
sort of safety and quality and efficiency, but it looks at the whole in another way.

00:24:14.895 --> 00:24:20.735
That's why I like the term that I use, safety synthesis, because I think I don't

00:24:20.735 --> 00:24:25.315
like safety so much now, but I like synthesis because I think it's the synthesis

00:24:25.315 --> 00:24:29.435
that we need to achieve and we need to understand, we need to work at.

00:24:29.815 --> 00:24:33.695
So maybe something along those lines. Who knows?

00:24:34.255 --> 00:24:39.455
I think you're right. I think safety as a term has probably carried about as

00:24:39.455 --> 00:24:44.495
much water as it can carry and really no longer aligns with the

00:24:44.495 --> 00:24:48.235
way that we currently think about sort of reliable operations or reliable systems.

00:24:48.735 --> 00:24:52.735
Why do you think healthcare is so slow to come around to the new view?

00:24:55.512 --> 00:25:01.312
Well, I don't know why. I mean, are they slow in coming around to this issue?

00:25:01.752 --> 00:25:07.952
I think healthcare has always been about being careful and making sure that

00:25:07.952 --> 00:25:09.552
nothing happens to patients.

00:25:09.872 --> 00:25:11.812
That's a part of the Hippocratic Oath.

00:25:12.272 --> 00:25:15.272
First of all, do no harm. That's what it says.

00:25:15.612 --> 00:25:20.932
So that's 2,000 years old, so you can't really say that's slow. Okay, fair enough.

00:25:22.352 --> 00:25:25.792
But then I think healthcare has

00:25:25.792 --> 00:25:31.432
been a system that worked under its own conditions for a long, long time.

00:25:31.712 --> 00:25:39.592
And then only maybe about 50, 60 years ago, it started to be more tightly coupled

00:25:39.592 --> 00:25:41.192
to other aspects of society.

00:25:41.352 --> 00:25:49.172
And it started to get the focus of society and authorities also because it started

00:25:49.172 --> 00:25:52.112
to become more and more expensive. and then people are more and more concerned.

00:25:52.392 --> 00:25:57.612
And then the pressure came with the Institute of Medicine report To Err Is Human

00:25:57.612 --> 00:25:59.632
and stuff like that. And then the pressure was there.

00:25:59.752 --> 00:26:04.612
And then as in every other industry, when the pressure suddenly comes on,

00:26:04.752 --> 00:26:09.092
like nuclear after TMI, like aviation and a number of cases,

00:26:09.812 --> 00:26:14.332
they sort of rush around like crazy and look for solutions wherever they can

00:26:14.332 --> 00:26:18.592
and they borrow solutions from others and apply them.

00:26:18.592 --> 00:26:22.832
And usually they don't work because they're not designed for that specific purpose.

00:26:23.052 --> 00:26:29.372
So healthcare is in that situation now, but I think it's partly because of the

00:26:29.372 --> 00:26:32.652
external pressure that came for any number of reasons.

00:26:33.404 --> 00:26:36.724
I find that really interesting, this idea that they borrow solutions from other

00:26:36.724 --> 00:26:40.064
industries and they don't work because they're not designed to work in that industry.

00:26:40.504 --> 00:26:43.684
That's actually a really good example, at least in the United States,

00:26:44.404 --> 00:26:46.764
of kind of where healthcare currently is.

00:26:47.024 --> 00:26:50.784
I mean, but their failure rates are quite remarkable. I mean,

00:26:50.884 --> 00:26:54.884
it's hard to imagine the system getting worse.

00:26:55.604 --> 00:27:00.384
Yeah, well, yeah. But again, I think it is wrong to talk about the failure rate,

00:27:00.384 --> 00:27:05.844
because you have to realize that they are facing an incredibly complex situation.

00:27:06.304 --> 00:27:12.104
And very often they compare themselves and others compare healthcare with aviation,

00:27:12.104 --> 00:27:15.024
but that's really unfair because,

00:27:15.844 --> 00:27:20.104
I mean, if every patient was standardized the same way that aircraft are,

00:27:20.624 --> 00:27:24.084
then, of course, it would be a lot better in healthcare, but they aren't.

00:27:24.184 --> 00:27:27.364
And we tend to look at aviation and say, look at it, it's great.

00:27:27.364 --> 00:27:34.364
The probability of a fatality is 1 in 7 million or 1 in 14 million, whatever.

00:27:35.524 --> 00:27:41.904
But if you look at the statistics for damaged or misdirected luggage,

00:27:42.364 --> 00:27:46.944
it's 1 in 350, which is much closer to what hospitals are.

00:27:47.364 --> 00:27:50.844
So it depends on which kind of numbers you look at.

00:27:51.644 --> 00:27:55.584
I'm not saying healthcare couldn't do better. It could do a lot better.

00:27:55.724 --> 00:28:01.124
But you also need to understand how incredibly complex the situation is and how completely

00:28:02.355 --> 00:28:06.235
constrained it is in terms of resources and demands.

00:28:06.555 --> 00:28:13.315
And what they experience is an onslaught of new technology constantly,

00:28:13.315 --> 00:28:17.335
which means they never have time to learn and to get the system to work.

00:28:17.435 --> 00:28:19.915
They never reach an equilibrium. They're always disturbed.

00:28:20.335 --> 00:28:24.375
Whereas most other industries, certainly aviation, reach an equilibrium state.

00:28:24.575 --> 00:28:26.715
And then there's a new technology in healthcare.

00:28:27.115 --> 00:28:31.815
It's sort of not daily, but weekly, monthly. Things change all the time.

00:28:32.355 --> 00:28:35.555
And you never have a chance to learn. And you can never

00:28:35.815 --> 00:28:39.755
really start to work on reducing the number of things that go wrong.

00:28:43.935 --> 00:28:47.255
What'd you think? Wasn't that great? It was really a great conversation.

00:28:47.815 --> 00:28:52.335
And I thank him so much. I really appreciate him taking the time to do this.

00:28:52.455 --> 00:28:55.015
And it really is something he doesn't have to do for sure.

00:28:55.255 --> 00:29:00.415
And nor does he see himself as kind of a podcast or video kind of person.

00:29:00.415 --> 00:29:04.655
But I just think this opportunity was worth its weight in gold.

00:29:04.935 --> 00:29:12.875
And I want to especially thank Erik and Chevron for supporting him on his journey

00:29:12.875 --> 00:29:16.295
over here and for allowing me some time to sort of take him away.

00:29:16.755 --> 00:29:21.935
And everybody else that was involved with this adventure, it was completely

00:29:21.935 --> 00:29:23.815
worth it. But mostly I want to thank you.

00:29:24.295 --> 00:29:28.015
Thanks for listening. Thanks for tuning in. The numbers are kind of nutty.

00:29:28.315 --> 00:29:31.855
And I can't do it without you. That's for sure. Tell your friends.

00:29:32.075 --> 00:29:34.435
Subscribe. That seems to make some kind of weird difference.

00:29:34.875 --> 00:29:37.735
Write a review. This is an Erik Hollnagel podcast. Where else are you going

00:29:37.735 --> 00:29:40.175
to find that? Just saying. That's all I'm doing is just saying.

00:29:40.495 --> 00:29:45.495
But until then, learn something new every single day. And I'll bet you did today.

00:29:45.735 --> 00:29:48.575
I'll bet you a nickel you did today. Have as much fun as you possibly can.

00:29:48.575 --> 00:29:50.835
And for goodness sakes, be safe.

00:29:50.800 --> 00:30:01.198
Music.
