Show Notes
Join Todd Conklin in this engaging episode of Pre-Accident Investigation as he reflects on an unexpected experience at a bluegrass festival that was interrupted by lightning. Through this anecdote, Todd explores the intriguing concepts of risk, chance, and control, delving into the differences between them and their implications in the world of safety.
Discover how the inability to predict the future emphasizes the need for systems that are not only robust but also resilient, capable of absorbing and managing unforeseen events. Todd discusses the importance of focusing on recoverability and flexibility in operations, rather than solely on risk prediction. This episode challenges traditional views on risk assessment and encourages a shift towards enhancing control and adaptability in safety management.
As Todd celebrates his birthday, he shares personal reflections on the value of friendships and gratitude, while inviting listeners to ponder their own thinking time. Tune in for an insightful discussion that blends personal stories with professional insights, all while encouraging you to learn something new, have fun, and stay safe.
Show Transcript
WEBVTT
00:00:00.017 --> 00:00:04.797
So I was at a bluegrass festival earlier this year. Let's just go with that.
00:00:06.037 --> 00:00:10.257
And Saturday night, the big night, that's when they always have kind of the
00:00:10.257 --> 00:00:11.957
big shows on the main stage.
00:00:12.437 --> 00:00:17.137
And usually the final show, the Saturday night closeout show,
00:00:17.357 --> 00:00:21.517
is the giant act, the one everyone comes to see.
00:00:22.117 --> 00:00:27.517
And they canceled the show because of lightning, which makes sense.
00:00:27.717 --> 00:00:28.957
I mean, it's an outdoor festival.
00:00:29.637 --> 00:00:34.177
Lightning is, you know, lightning. They're standing up there with wires and
00:00:34.177 --> 00:00:38.937
microphones and all sorts of little lightning-attractive things.
00:00:39.337 --> 00:00:40.797
And they canceled the show.
00:00:41.337 --> 00:00:43.477
And, of course, everyone was super disappointed.
00:00:44.037 --> 00:00:50.657
But a person I was with said, well, it makes sense because they wanted to manage the risk.
00:00:53.840 --> 00:01:03.280
Music.
00:01:02.597 --> 00:01:06.897
Hey, everybody, Todd Conklin, Pre-Accident Investigation, and it's time for
00:01:06.897 --> 00:01:10.077
the pod. So I hope you're doing well.
00:01:10.557 --> 00:01:15.717
This is a very interesting time. This is always the first part of November,
00:01:15.977 --> 00:01:20.217
which is, you know, when I was born. It's my birthday time.
00:01:20.497 --> 00:01:26.217
But it's really fun because I have loads of friends, and that's always something
00:01:26.217 --> 00:01:31.117
that I'm very grateful for. There are many of us that are all born kind of around the same time.
00:01:31.397 --> 00:01:34.377
Like there are four of us, and we hang out pretty regularly,
00:01:34.697 --> 00:01:37.997
that are all born within like a week of each other.
00:01:38.097 --> 00:01:44.017
So it's kind of a fun time, and we usually do big, giant celebrations and have all sorts of fun.
00:01:44.617 --> 00:01:48.417
And it's happening again this year. I mean, I'm sure you got your invitation.
00:01:48.937 --> 00:01:53.077
If you didn't, it's coming in the mail. It'll be there soon for some kind of
00:01:53.077 --> 00:01:59.337
giant, you know, probably on a yacht, maybe a blimp, some kind of giant party.
00:01:59.817 --> 00:02:01.477
I'm sure we'll have that planned for this year.
00:02:01.977 --> 00:02:07.017
And I'll give you a full report on what happens, which tattoos were received,
00:02:07.017 --> 00:02:11.637
who gave them, were there misspellings, you know, the normal kind of stuff we talk about.
00:02:12.017 --> 00:02:14.097
That's all exciting. So that's fun.
00:02:14.517 --> 00:02:18.537
And other than that, just, you know, kind of winding the year down,
00:02:18.897 --> 00:02:21.657
which is always kind of a good time as well.
00:02:22.197 --> 00:02:25.717
It's been interesting. I've gotten kind of a lot of bicycle time,
00:02:25.977 --> 00:02:27.897
which I always think is good time.
00:02:28.397 --> 00:02:32.097
Because I've decided when I ride my bicycle...
00:02:33.185 --> 00:02:38.625
I've got two things that I've decided. One is it's a really good time to think.
00:02:39.385 --> 00:02:45.165
And do you have a time to think? I mean, I'd be curious to kind of wonder what
00:02:45.165 --> 00:02:47.365
your thinking time is, where it is.
00:02:47.745 --> 00:02:51.205
But it's kind of nice to have a place where you can just think.
00:02:51.665 --> 00:02:58.185
And so that's always a huge fringe benefit of going out for a bike ride.
00:02:58.285 --> 00:03:02.525
Being outside is great. It's fun to get around. I see people.
00:03:02.945 --> 00:03:06.765
It's funny, when you ride a lot, you see the same people over and over and over again.
00:03:07.285 --> 00:03:11.445
So you kind of, it's kind of funny. You sort of have a relationship,
00:03:11.445 --> 00:03:15.525
even though you don't know this person at all, but you've seen him a million times.
00:03:15.745 --> 00:03:21.345
Like there's a gray-haired guy who listens to the radio who has a Copenhagen
00:03:21.345 --> 00:03:26.585
wheel on his bicycle, and I see him all over town, and I see him with great regularity.
00:03:26.805 --> 00:03:29.885
So that's kind of a fun part of bike riding. The other thing is I've decided
00:03:29.885 --> 00:03:36.945
about one out of every 100 people yells at you, you know, "get off the road" or some other thing.
00:03:37.485 --> 00:03:44.125
Usually you just kind of smile and wave, which kind of cuts the payoff for them
00:03:44.125 --> 00:03:49.685
and reduces the amount of exposure for something weird happening.
00:03:49.685 --> 00:03:54.765
But that's always kind of an interesting part of riding the bicycle.
00:03:54.765 --> 00:03:58.725
And so that's been really fun and a great time.
00:03:59.045 --> 00:04:01.285
And, you know, other than that, it's November.
00:04:01.685 --> 00:04:03.785
What else can I say? November is November.
00:04:04.205 --> 00:04:06.825
And November is always pretty fun.
00:04:07.265 --> 00:04:12.085
I'm recording this before the election results.
00:04:12.705 --> 00:04:15.825
So I don't have any idea what's going to happen.
00:04:16.585 --> 00:04:20.805
And as I've said to you a million times, and you've said it back to me a million
00:04:20.805 --> 00:04:24.165
and one times, human beings are relatively bad at predicting.
00:04:24.765 --> 00:04:27.725
And we're going to talk about that today a little bit. And because we're bad
00:04:27.725 --> 00:04:29.905
at predicting, I have no idea what's going to happen. But we'll see.
00:04:30.285 --> 00:04:31.925
You know, we're all going to see.
00:04:33.145 --> 00:04:37.165
It's just a really exciting time to be alive. And it's even a more exciting
00:04:37.165 --> 00:04:42.905
time to spend with you and do a quick podcast, which is exactly what I want
00:04:42.905 --> 00:04:44.645
to do. So before we go much further...
00:04:45.698 --> 00:04:48.978
Let's pick up where the story left off.
00:04:53.798 --> 00:04:57.578
So yeah, so this friend says they've got to manage risk.
00:04:57.678 --> 00:05:02.478
And I said, is it risk? Are they managing risk?
00:05:02.858 --> 00:05:05.278
Are they managing chance?
00:05:05.958 --> 00:05:10.478
Because I started thinking about this because I guess we had nothing else to
00:05:10.478 --> 00:05:11.578
do. They canceled the show.
00:05:12.058 --> 00:05:16.178
And I thought, well, risk is a defined term.
00:05:17.038 --> 00:05:21.818
And I guess lightning is risky. I mean, I think I'd lose that argument if I
00:05:21.818 --> 00:05:22.998
tried to have that argument with you.
00:05:23.418 --> 00:05:28.618
But it strikes me that lightning is more of a chance than a risk.
00:05:29.278 --> 00:05:34.458
And I think it has to do with maybe the amount of control we have as human beings.
00:05:34.578 --> 00:05:38.678
Now, the term for that is agency, the amount of agency we have as human beings.
00:05:38.678 --> 00:05:42.918
But I started thinking about it, and it's really kind of the old,
00:05:43.018 --> 00:05:45.318
are you good or are you lucky?
00:05:46.118 --> 00:05:51.298
And so I think that's something we should talk about because this has really
00:05:51.298 --> 00:05:55.258
been on my mind, partially because I was super disappointed that the show got
00:05:55.258 --> 00:05:57.958
canceled because there were a couple of really good bands.
00:05:58.138 --> 00:06:01.418
And if I think of the name, see, I don't even remember the name of them.
00:06:02.058 --> 00:06:05.418
One was especially good. They're from California, kind of jazzy,
00:06:05.918 --> 00:06:08.058
bluegrass jazz kind of combo.
00:06:08.678 --> 00:06:17.738
But anyway, this idea that risk is what we control is kind of an interesting
00:06:17.738 --> 00:06:20.278
idea. So we can define all these terms.
00:06:20.458 --> 00:06:22.898
I mean, they all have definitions, and you probably know them all.
00:06:23.558 --> 00:06:30.378
Risk, traditionally, I've always looked at risk as the probability that a person
00:06:30.378 --> 00:06:35.618
will be harmed if something happens, if exposed to a hazard.
00:06:36.358 --> 00:06:42.898
So it's probability, I guess, times consequence. I hate to make fake math equations,
00:06:42.918 --> 00:06:46.598
but there's a relationship between probability and consequence,
00:06:46.598 --> 00:06:49.578
which lets us do kind of a risk ranking.
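One way to write that relationship down, purely as a sketch of the standard expected-consequence form rather than anything the episode commits to:

```latex
% The "fake math" above in its usual expected-consequence form -- a sketch,
% not a formula the episode endorses literally.
\[
  \text{risk} \;\approx\; P(\text{harm} \mid \text{exposure to the hazard})
  \times \text{consequence}
\]
```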
00:06:49.878 --> 00:06:53.498
But I've told you before, and because we've talked about this a bunch,
00:06:53.998 --> 00:06:58.878
that risk ranking is never really about risk. It's about resource management.
00:06:59.778 --> 00:07:05.038
But risk is really that idea, what's the chance that this bad thing could happen?
00:07:05.298 --> 00:07:09.978
Oh, man, I just used the word chance, and I'm trying to build this case between risk and chance.
00:07:10.218 --> 00:07:15.218
What's the probability that this bad thing could happen? I may solve my own
00:07:15.218 --> 00:07:16.998
problem in the middle of this podcast. Who knows?
00:07:17.818 --> 00:07:23.038
And so risk is really interesting. And a lot of people think about this a lot.
00:07:23.038 --> 00:07:26.438
I mean, this really, like John Adams, if you haven't read him,
00:07:26.598 --> 00:07:29.838
his work on risk is very, very interesting.
00:07:29.958 --> 00:07:32.098
And there's a lot of people that think about this.
00:07:32.498 --> 00:07:40.198
It's funny, you don't hear it so much amongst the scholars of kind of the new safety thinking.
00:07:40.678 --> 00:07:43.678
They don't really talk a lot about risk. And partially because,
00:07:43.958 --> 00:07:48.718
generally speaking, risk has been in the domain of kind of the safety engineers,
00:07:48.718 --> 00:07:56.578
so they can do a calculation and determine numerically what risk is going to be.
00:07:56.798 --> 00:08:01.178
And I'm not saying these are bad things. I'm just saying that's kind of how
00:08:01.178 --> 00:08:02.178
this has worked traditionally.
00:08:02.978 --> 00:08:10.878
The new safety scholars will tell you that in a complex world with highly adaptive sort of systems,
00:08:11.378 --> 00:08:17.318
that the ability to do a calculation is really leaning on kind of old school
00:08:17.318 --> 00:08:21.978
thinking, kind of linear thinking in a nonlinear world where it'd be hard to calculate.
00:08:22.418 --> 00:08:25.478
So we talked about this. I mean, we've struggled with this for a while.
00:08:25.898 --> 00:08:29.298
And one of the things that happened, at least in the world I live in,
00:08:29.478 --> 00:08:36.818
is we realized that risk ranking, because it's based upon resource availability,
00:08:36.818 --> 00:08:41.518
how much are we going to spend to manage this potential thing that has not happened yet?
00:08:42.138 --> 00:08:45.478
It really was sort of dictated by potential consequence.
00:08:45.878 --> 00:08:52.038
So that's where consequence comes in. So if it's highly likely that it'll happen,
00:08:52.038 --> 00:08:57.818
but the consequence is super low, then that didn't warrant a lot of attention.
00:08:58.018 --> 00:09:00.578
But if it's highly unlikely that it'll happen, but if it did,
00:09:00.678 --> 00:09:02.078
the consequence was super high,
00:09:02.438 --> 00:09:10.538
then we found the ability to resource that and provide control or barriers or
00:09:10.538 --> 00:09:13.618
somehow mitigate the hazard as it was.
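As a sketch of how that consequence-weighted ranking might look in code; the scales, thresholds, and labels here are invented for illustration, not taken from any real risk matrix:

```python
# A sketch of the consequence-weighted ranking described above. The 0-to-1
# scales, thresholds, and labels are invented for illustration only.
def rank_for_resources(probability: float, consequence: float) -> str:
    """Decide how much attention a hazard gets; consequence drives the call."""
    if consequence > 0.8:
        # Even an unlikely event with a severe outcome gets resourced:
        # controls, barriers, mitigation.
        return "resource now: add controls and barriers"
    if probability > 0.8 and consequence < 0.2:
        # Likely but trivial: traditionally this drew little attention.
        return "monitor: low consequence"
    return "rank against the resources that remain"

print(rank_for_resources(probability=0.05, consequence=0.95))  # resource now
print(rank_for_resources(probability=0.90, consequence=0.10))  # monitor
```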
00:09:13.778 --> 00:09:18.058
And that is kind of interesting because is risk different than hazard?
00:09:18.378 --> 00:09:19.698
Well, they're very different words.
00:09:20.458 --> 00:09:24.498
Risk is that probability of something bad happening. The hazard is the actual
00:09:24.498 --> 00:09:26.438
thing. So you can take a picture of a hazard.
00:09:26.938 --> 00:09:28.858
But then where does chance come in?
00:09:29.567 --> 00:09:34.147
And chance is really pretty interesting, because one of the things that I'm
00:09:34.147 --> 00:09:36.427
sure you're doing with your organization, and if you're not,
00:09:36.527 --> 00:09:40.847
you should be, is every time a near miss is reported, simply ask,
00:09:41.047 --> 00:09:43.827
were we good or were we lucky?
00:09:44.547 --> 00:09:48.227
And no matter what the answer, if they say, well, you had a near miss and you
00:09:48.227 --> 00:09:50.787
say, were you good or were you lucky? And they say we were good.
00:09:51.267 --> 00:09:54.307
We had the right controls in place. We had the right barriers in place.
00:09:54.487 --> 00:09:58.027
We had the right systems in place that when this bad thing happened,
00:09:58.027 --> 00:10:03.127
we were able to manage it and control the consequence.
00:10:03.387 --> 00:10:05.747
Well, that's a "good," and you should reward that.
00:10:06.167 --> 00:10:09.707
But if they say we had a near miss, you say, were we good or lucky?
00:10:09.867 --> 00:10:10.887
And they say we were lucky.
00:10:11.547 --> 00:10:14.467
And you'll say, what happened? And they'll say, well, we had no idea this could
00:10:14.467 --> 00:10:18.787
fail. We never imagined this would fail at this place at this time, and it did.
00:10:19.447 --> 00:10:24.347
Fortunately, when it failed, nobody was around, and so it had very little consequence.
00:10:24.807 --> 00:10:26.407
Hence, it's a near miss.
00:10:27.127 --> 00:10:32.427
That's also pretty valuable data because that tells you that you have a hazard
00:10:32.427 --> 00:10:38.567
in the field, an unimagined hazard in the field that you really have no controls over.
00:10:38.687 --> 00:10:44.487
So the system's really brittle there. And learning that in kind of a free way,
00:10:44.707 --> 00:10:48.587
because near misses are gifts, learning that without the consequence,
00:10:48.747 --> 00:10:51.847
without the mess, is actually pretty incredible news.
00:10:51.847 --> 00:10:55.587
And because you got that pretty incredible news, well, you can go out there
00:10:55.587 --> 00:10:58.527
and put some controls around it and make it safer.
00:10:59.267 --> 00:11:04.147
Build a case for it to fail as elegantly and as extensively as it possibly can.
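A hypothetical sketch of that "good or lucky" triage; the fields, labels, and example report are invented, not from any real reporting system:

```python
# A hypothetical sketch of the "good or lucky?" near-miss triage described
# above. The fields and labels are invented; no real system is assumed.
from dataclasses import dataclass

@dataclass
class NearMiss:
    description: str
    controls_worked: bool      # did existing barriers absorb the event?
    hazard_was_imagined: bool  # was this failure mode in the hazard inventory?

def triage(event: NearMiss) -> str:
    if event.controls_worked and event.hazard_was_imagined:
        # "We were good": the right controls and barriers were in place.
        return "good: reward the controls that caught this"
    # "We were lucky": an unimagined hazard with no controls. The system is
    # brittle there, and the near miss is free notice to build recoverability.
    return "lucky: hazard found for free; add controls before it recurs"

print(triage(NearMiss("failed where nobody was around", False, False)))
```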
00:11:04.667 --> 00:11:09.527
But I think about this in really practical terms that we have to use.
00:11:10.127 --> 00:11:16.067
Because risk, hazard, and chance all feed this need to sort of predict the future.
00:11:16.227 --> 00:11:21.027
And the crazy thing about this is that the amount of effort we spend trying
00:11:21.027 --> 00:11:26.447
to predict the future takes directly away from the amount of effort we spend
00:11:26.447 --> 00:11:29.507
currently trying to affect the future.
00:11:29.507 --> 00:11:35.827
And so it's always kind of better to not use your resources in predicting what
00:11:35.827 --> 00:11:40.147
will happen next in an uncertain world, because we don't really know what that answer is.
00:11:40.427 --> 00:11:46.867
It's probably always better to actually look carefully at and try to affect
00:11:46.867 --> 00:11:52.087
what's happening now in a way that has potential positive outcomes.
00:11:52.787 --> 00:11:56.187
Long story short, because this is kind of becoming a long story,
00:11:56.507 --> 00:12:03.827
that's what shifted the thinking around risk in super high consequence work.
00:12:04.802 --> 00:12:12.402
So instead of managing by probability, chance, lightning, the decision was made
00:12:12.402 --> 00:12:17.622
to assume that the probability of the bad thing happening is 100%.
00:12:17.622 --> 00:12:22.942
Now, that's expensive, and that's time-consuming, and that's resource-intensive.
00:12:23.422 --> 00:12:27.382
But if the consequence is significant, assume it will happen.
00:12:27.602 --> 00:12:32.962
And the best example I can sort of use for this, the one that comes to mind,
00:12:33.162 --> 00:12:38.102
the one that came to mind when we were walking back from the main stage after the show was canceled.
00:12:38.322 --> 00:12:40.502
And by the way, it rained like crazy.
00:12:41.002 --> 00:12:43.682
So it was a good call. A lot of times when they cancel a show,
00:12:43.782 --> 00:12:47.442
then it doesn't rain and you're like, aha, you're overly conservative and you
00:12:47.442 --> 00:12:50.022
ruin the weekend for us. But we totally got the storm out of it.
00:12:50.122 --> 00:12:55.442
So it was a good call. When we were walking back, the discussion really became
00:12:55.442 --> 00:13:01.582
one of thinking in great detail about what this means.
00:13:02.002 --> 00:13:10.002
Because ultimately, probability is the desperate need to predict the future.
00:13:10.702 --> 00:13:18.502
And following data, whatever data we make up or assume, in order to determine
00:13:18.502 --> 00:13:23.902
probability, really assumes that there's an answer and there really isn't an answer.
00:13:24.502 --> 00:13:27.522
We don't know what the future will hold.
00:13:28.022 --> 00:13:34.302
And so if you assume it'll happen, then you can draw yourself to the comparison
00:13:34.302 --> 00:13:35.922
that I gave on the way back.
00:13:36.742 --> 00:13:40.322
And that's fall protection. So why do we put people in fall protection?
00:13:40.542 --> 00:13:43.282
Well, the answer is, and almost always when I ask this question,
00:13:43.402 --> 00:13:44.702
people say, so they don't fall.
00:13:45.162 --> 00:13:49.002
Okay, so that's true, but not really.
00:13:49.482 --> 00:13:53.642
Because fall protection is really weird in that it doesn't really care about the fall part.
00:13:54.102 --> 00:13:58.182
What fall protection cares about is the recovery part, the landing part.
00:13:58.602 --> 00:14:04.882
So we put people in fall protection around the belief that there's 100% chance
00:14:04.882 --> 00:14:12.862
they will fall. And when they do fall, our system's robust enough to provide recoverability.
00:14:13.062 --> 00:14:22.382
We can recover and manage the consequence before it has ultimate and dire outcomes.
00:14:22.782 --> 00:14:30.662
So you fall, you jerk your lanyard, it catches you, you swing against the wall,
00:14:30.902 --> 00:14:35.302
maybe you bruise your ribs, but you didn't land on your head and die.
00:14:35.862 --> 00:14:41.322
Now, that's a really interesting example, because the assumption is
00:14:41.382 --> 00:14:44.882
that the probability of the fall is one.
00:14:45.162 --> 00:14:52.042
So when engineers or physicists talk about probability, they always say a probability of one.
00:14:52.402 --> 00:14:56.242
If you're not an engineer or a physicist, what they're really saying is there's
00:14:56.242 --> 00:14:59.642
100 percent chance that's going to happen. We don't know when.
00:15:00.022 --> 00:15:04.442
We just don't know when. But we know it will happen.
00:15:04.442 --> 00:15:09.042
And when it does happen, because we assumed 100% chance of this happening,
00:15:09.402 --> 00:15:16.522
we've actually built into our systems that graceful extensibility that our friend
00:15:16.522 --> 00:15:20.382
David Woods talks about so elegantly in so much of his work.
00:15:20.682 --> 00:15:22.722
We've built recovery.
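The arithmetic behind that shift, as a sketch: if expected loss is probability times consequence, then assuming a probability of one removes the prediction problem entirely and leaves consequence, which is to say recoverability, as the only lever.

```latex
% Sketch: with expected loss E[L] = P x C, assuming P = 1 means consequence
% (recoverability) is the only lever left to manage.
\[
  \mathbb{E}[L] \;=\; P \times C ,
  \qquad\text{and with } P = 1 :\qquad
  \mathbb{E}[L] \;=\; C .
\]
```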
00:15:23.162 --> 00:15:28.182
And it's really that relationship between robust and resilient. Right.
00:15:28.802 --> 00:15:33.702
Or probability and chance. They all have to exist with each other.
00:15:34.042 --> 00:15:40.782
And so the challenge is not that we manage risk, because risk is really
00:15:40.782 --> 00:15:42.702
normal and it's highly dynamic.
00:15:43.102 --> 00:15:47.562
Even if we could predict everything we could think of, we're missing all the
00:15:47.562 --> 00:15:49.102
things we can't think of.
00:15:49.462 --> 00:15:55.342
So if risk assessment really counts on imagination, I can imagine all the ways
00:15:55.342 --> 00:15:59.902
this system will fail, then we're limited by the fact that we can't imagine
00:15:59.902 --> 00:16:01.702
all the things that we need to imagine.
00:16:02.122 --> 00:16:08.522
There will always be the unimaginable. And we're often caught by the fact that
00:16:08.522 --> 00:16:10.322
there are things beyond our imagination.
00:16:10.942 --> 00:16:14.542
And you know that's true. I mean, you learn new stuff all the time.
00:16:14.642 --> 00:16:17.362
You learn new ways for things to happen.
00:16:17.482 --> 00:16:21.462
I talked to somebody just the other day who said they burnt their kitchen in
00:16:21.462 --> 00:16:25.122
their house because they left their crock pot on the counter.
00:16:25.582 --> 00:16:29.522
Well, here's what I'm going to tell you. I never in a million billion years
00:16:29.522 --> 00:16:35.302
imagined a crock pot left on a counter could start a fire because I always assumed
00:16:35.302 --> 00:16:40.662
a crock pot by design was built to be left on the counter.
00:16:41.042 --> 00:16:44.502
It's a slow cooker. I mean, that's what its job is.
00:16:44.682 --> 00:16:49.102
I never thought that it could create enough heat out of the bottom of the crock
00:16:49.102 --> 00:16:52.622
pot to potentially start a fire. But clearly I never imagined that,
00:16:52.902 --> 00:16:55.262
but I talked to somebody that it happened to.
00:16:56.302 --> 00:17:02.322
This idea that somehow risk is the thing we can control, or worse yet,
00:17:02.562 --> 00:17:06.602
chance is the thing we can control, I think that's beyond our scope.
00:17:06.762 --> 00:17:09.542
I just don't think we have the ability to do that.
00:17:09.922 --> 00:17:12.962
And because we can't do it, it sort
00:17:12.962 --> 00:17:19.902
of puts the onus on managing the recoverability. We want robust systems.
00:17:20.182 --> 00:17:25.842
We want strong, prevention-focused systems that stop bad things from happening.
00:17:26.042 --> 00:17:30.382
But we need resilient systems at the same time.
00:17:31.368 --> 00:17:36.188
So that when something unimaginable happens, our system is flexible enough,
00:17:36.508 --> 00:17:42.248
extensible enough, that when it does happen, the system can actually bend and
00:17:42.248 --> 00:17:49.988
sway and actually manage in a positive way the potential failure consequence.
00:17:51.108 --> 00:17:57.588
That's what you think about when you're coming back from stage one of a
00:17:57.588 --> 00:18:01.828
bluegrass festival right before a giant storm happens.
00:18:02.088 --> 00:18:06.188
You think about risk and chance.
00:18:06.388 --> 00:18:10.248
And you think about the relationship that that has.
00:18:10.508 --> 00:18:17.608
The probability of lightning striking that stage was probably gargantuanly high.
00:18:17.608 --> 00:18:20.208
Or maybe gargantuanly low.
00:18:20.728 --> 00:18:22.888
I don't know, because it's lightning.
00:18:23.468 --> 00:18:28.508
The challenge is to build a system that has enough recoverability in it that,
00:18:28.628 --> 00:18:34.048
in fact, we can function elegantly in the midst of uncertainty.
00:18:34.668 --> 00:18:39.788
And to be fair, not to be critical, but to be fair, they didn't have any recoverability
00:18:39.788 --> 00:18:42.088
in the bluegrass. They didn't have another Saturday night.
00:18:42.588 --> 00:18:47.128
So unfortunately, when they canceled those shows, there was no making up for
00:18:47.128 --> 00:18:48.668
it. I mean, there was no other time.
00:18:49.328 --> 00:18:52.048
They couldn't start them again at three o'clock in the morning,
00:18:52.328 --> 00:18:55.228
although that would have been a great idea. I kind of wish they would have done that.
00:18:55.848 --> 00:18:59.228
They couldn't start them the next weekend because, you know,
00:18:59.388 --> 00:19:03.848
people had to go to work and do stuff, which is kind of always what this stupid
00:19:03.848 --> 00:19:10.368
job does to us, creates a never-ending opportunity to go to work and do stuff.
00:19:11.048 --> 00:19:14.308
So we just kind of moved on and functioned that way.
00:19:15.246 --> 00:19:24.866
That's a little discussion about risk, chance, control, robustness, and resilience.
00:19:28.126 --> 00:19:32.366
What do you think, my friends? That's an interesting pod.
00:19:32.566 --> 00:19:38.266
It's a birthday pod, so I can do whatever I want to. It's my birthday, so I can get away with it.
00:19:38.366 --> 00:19:42.086
But I do think it's a very interesting discussion.
00:19:42.086 --> 00:19:46.206
And I do think it's telling us, and this is one of the benefits,
00:19:46.386 --> 00:19:53.746
I guess, of this journey we're all on together, that our traditional linear risk assessment tools,
00:19:54.246 --> 00:19:58.326
the risk ranking, the hazard grids that we have,
00:19:58.746 --> 00:20:08.026
the numbering, the valuing we put on risk probability versus consequence are
00:20:08.026 --> 00:20:11.246
probably a good starting place for the conversation.
00:20:12.086 --> 00:20:17.866
But probably not sufficient to carry us safely into high-risk work when
00:20:17.866 --> 00:20:19.166
we want it to matter the most.
00:20:19.866 --> 00:20:23.446
So that's a pretty important thing to think about. I don't know.
00:20:23.626 --> 00:20:28.786
It's made me think that risk isn't the part of the equation that we should be all freaked out about.
00:20:29.666 --> 00:20:33.866
It's really control. It's not, are we exposed to risk?
00:20:34.306 --> 00:20:39.006
That's not the right question. The right question is, when the bad thing
00:20:39.006 --> 00:20:43.726
happens, do we have enough of this graceful extensibility?
00:20:44.166 --> 00:20:51.066
Can our system bend and sway and flex and adapt in such a way that the consequence
00:20:51.066 --> 00:20:55.746
is managed, recovered, absorbed?
00:20:56.226 --> 00:21:00.426
It's a pretty cool way to think about it. I mean, I think it's worth our time.
00:21:00.566 --> 00:21:07.046
And it's definitely warranting some thinking time.
00:21:07.126 --> 00:21:12.106
The next time you're out on your bicycle or driving someplace far or wherever
00:21:12.106 --> 00:21:16.366
it is you think, I asked you earlier to think about this. You should have an answer now.
00:21:17.086 --> 00:21:22.486
Think about this and think about what it means to the way we manage operations.
00:21:23.366 --> 00:21:29.306
It's crazy because risk in the finance sector is how they make money.
00:21:30.266 --> 00:21:34.906
So risk in finance is an opportunity.
00:21:35.746 --> 00:21:40.106
Maybe the same is true for us in safety.
00:21:40.706 --> 00:21:45.606
Who knows? Think about it. That's the pod. Short and sweet, baby.
00:21:46.006 --> 00:21:48.926
But that's the way to do it. I'll see you again next week. Thanks for hanging
00:21:48.926 --> 00:21:54.486
out with me. It's always super fun to spend time with you. And I'm glad I'm there.
00:21:55.066 --> 00:21:58.566
I'm glad you're there, actually. That's even a better thing to say.
00:21:59.106 --> 00:22:01.546
Until then, my friends, learn something new every single day.
00:22:01.666 --> 00:22:03.926
Have as much fun as you possibly can. Be good to each other.
00:22:04.106 --> 00:22:07.606
Be kind to each other. And for goodness sakes, you guys, be safe.
00:22:08.080 --> 00:22:20.099
Music.