News
Tuesday, April 17, 2012

Checklists for everyone

Avoidable failures are common and persistent, not to mention demoralizing and frustrating, across many fields — from medicine to finance, business to government. And the reason is increasingly evident: the volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely, or reliably. Knowledge has both saved and burdened us.

I first learned about Atul Gawande from Bill Murphy's talk at the 1IWRP conference last August, where he offered the surgeon's research model for all imperfect sciences, casting the spectrum of problems in a simple–complicated–complex ternary space. In The Checklist Manifesto, Gawande writes about a topic that is relevant to all geoscience: the problem of extreme complexity. And I have been batting around the related ideas of cookbooks, flowcharts, recipes, and to-do lists for maximizing professional excellence ever since. After all, it is challenging, and takes a great deal of wisdom, to cut through the chaff and reduce a problem to its irreducible and essential bits. Then I finally read the book.

The creation of the now heralded 19-item surgical checklist found its roots in three places — the aviation industry, restaurant kitchens, and building construction:

Thinking about averting plane crashes in 1935, or stopping infections in central lines in 2003, or rescuing drowning victims today, I realized that the key problem in each instance was essentially a simple one, despite the number of contributing factors. One needed only to focus attention on the rudder and elevator controls in the first case, to maintain sterility in the second, and to be prepared for cardiac bypass in the third. All were amenable, as a result, to what engineers call "forcing functions": relatively straightforward solutions that force the necessary behavior — solutions like checklists.

What is amazing is that it took more than two years, and a global project sponsored by the World Health Organization, to devise such a seemingly simple piece of paper. But what a change it has made: major complications fell by 36%, and deaths fell by 47%. Would you adopt a technology that cut complications by 36% and deaths by 47%? Most would without batting an eye.

But the checklist paradigm is not without skeptics. There is resistance to the introduction of the checklist because it threatens our autonomy as professionals, the ego and intelligence we have trained hard to attain. An individual must surrender being the virtuoso. In return, a checklist enables teamwork and communication, engaging subordinates and empowering them at crucial points in the activity. The secret is that a checklist, done right, is more than tick marks on a piece of paper: it is a vehicle for delivering behavioural change.

I can imagine huge potential for checklists in the problems we face in petroleum geoscience. But what would such checklists look like? Do you know of any in use today?



Reader Comments (9)

I know of safety checklists routinely done daily, don't know if they were designed to curb particular problems the way airplane/flight checklists were; don't know of anything specifically geology related -- but I find the idea intriguing.

April 18, 2012 | Unregistered CommenterSilver Fox

@Silver Fox
I think safety and emergency checklists have been employed because in dangerous or crisis situations there are basic things that people routinely forget. I think that's the power of the checklist for any environment. It also stresses humility and teamwork as the best chance of controlling outcomes.
Thanks for the comment!

April 18, 2012 | Unregistered Commenterevan

IDK,

My problem with this is that there can be many ways of skinning the cat. And sometimes different methods have benefits for different reasons.

Look at this scenario: I just got a velocity volume in from the vendor, but it's in average-velocity format, and I'd prefer checkshot format for easy map depth conversion. Without converting the volume, I'll have to do a math operation on every map that needs conversion.

Workflow 1: Prep the volume (1 hr) + extract each map (5 min)

Workflow 2: Extract each map (5 min) + math on each map (10 min)


Now usually I'd prefer the first option; it makes my life way easier. And this is what I'd build a checklist around.

But when you need an isopach out ASAP because NAPE is tomorrow, suddenly Workflow 2, only taking 30 min, looks pretty attractive.
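The trade-off between the two workflows is a simple break-even calculation. A minimal sketch, using the per-map times from the comment (function names are illustrative, not from any real software):

```python
# Break-even between the two depth-conversion workflows described above.
# Times are in minutes, taken from the comment.

PREP = 60      # Workflow 1: one-off volume preparation
EXTRACT = 5    # both workflows: extract each map
MATH = 10      # Workflow 2: extra math operation per map

def workflow1(n_maps):
    """One-off prep, then cheap per-map extraction."""
    return PREP + EXTRACT * n_maps

def workflow2(n_maps):
    """No prep, but every map needs the extra math step."""
    return (EXTRACT + MATH) * n_maps

# Workflow 2 wins for small jobs, Workflow 1 for big ones.
# Break-even: 60 + 5n = 15n, so n = 6 maps.
for n in (1, 2, 6, 10):
    print(n, workflow1(n), workflow2(n))
```

Which is exactly the point: a sensible default (Workflow 1) with a known escape hatch for the one-map-before-NAPE case.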

But a bigger issue is that there is no one-size-fits-all in this business. What might be a no-brainer on a $150 million deepwater well might be a very hard decision on a million-dollar well picking up attic oil.

April 18, 2012 | Unregistered CommenterToastar

@Toastar

You've touched on an important idea. Any checklist should, I think, be engineered around an entire system. Checklists may be more important for teams, or between different working parties: geologist to engineer, vendor to customer, etc. What you need, it sounds like, is not a one-size-fits-all procedure, but a constrained system that allows enough flexibility for you to exercise judgement when you are in a pinch (NAPE is in 2 days), and protects against things that are easily missed, skipped, or forgotten. Maybe it is something that starts with the vendor, a verbal confirmation before you get your velocity data: "Has the vendor shipped BOTH interval-velocity format and checkshot-velocity format?" Check. Check. Built into the work agreement.

I certainly agree that there is no one-size-fits-all, and checklists should take advantage of this notion. There is certainly no one-operation-fits-all in surgical procedures, yet Gawande showed grand improvements by promoting this systems approach.

It's a good distinction that I should stress again. A checklist is not a recipe or procedure for your personalized workflow. It is not the sequence of button clicks to make your depth map. Those procedures and recipes can and should fluctuate based on judgement, deadlines, and so on. Instead, imagine a checklist for the system, pointing at the pause points between people.

April 22, 2012 | Unregistered Commenterevan

Posting this comment rather late, but I have been trying to catch up. I read the checklist book a year ago and found it compelling - and yes, amazing that the idea is not more broadly applied. In my current work environment (exploration, SE Asia) I see it applied (on paper at least) to various HSE management procedures but nowhere in the technical arena.

You know what I would like to see? A checklist for interpreters once they have generated a map. Over the countless years (I'm too old to recall precisely) that I have been working in this industry, I can't remember seeing a map that didn't have obvious questions to be asked about it. We have to accept that all maps are always wrong, but endeavour to make them as un-wrong as possible. There are simply some basic principles of three-dimensional geometry and what faults can and can't do, along with the necessary internal geometrical consistency between the seismic and the map, that, with depressing regularity, are violated. The level of interrogation of a map, once the workstation has spat it out, is often inadequate. Perhaps I'm a luddite, but, while admiring and relishing the extraordinary technology that we have at our disposal these days, I can't help but remember the comment (source lost in the mists of time) that the technology is brilliant at answering questions, but it doesn't ask them. Plus I profoundly miss the ability to display uncertainty on a map - the old dashed line was the product of interrogating the robustness of the data and the interpretation. But enough of my soapbox.

What I am wondering about is the possibility that a map interrogation checklist could be an important tool for improving consistency and quality, and for requiring the interpreter to perhaps think a little more about that beautiful, colourful, powerpoint slide for management presentation (not to mention the implications for the prospect). It might well be more than one checklist (perhaps a structural one, a stratigraphic one, an attributes one etc.). I really haven't thought this through in any detail, but would be very interested in any comments and ideas.

May 3, 2012 | Unregistered CommenterMichael

Michael,

Although I am glad that you feel you have some catching up to do, you are certainly not late. You can only be late to something that has a finishing time. I am glad these posts engage discussion long after the next one is up.

How to validate your map: a checklist. That is a great idea. One thing I always like to see on a map is a visual distinction between where the "hard" data points are and where the interpolated data is. You can see this idea in a cross-section 'map' of borehole temperature measurements in a post entitled Seeing red.

I think what you propose is particularly crucial to the world of geocellular modeling. I have been involved in taking someone else's maps (grids) and trying to make a compliant 3-dimensional cellular container. The process was essentially broken from the get-go, because many of the surfaces were un-physical to start with. The fascinating part was that, it seemed to me, the geomodeling had to be done, and it had to fail, in order to figure out which parts of the map needed fixing; it was a necessary part of making the map. I think we need to allow for that type of iteration in our work, which can be a tough sell. I have written an essay on this topic that is going to be published soon; I call this the "interpretation-to-data trap", and we all fall into it.

Where to head next? One thing I took away from the book is that, in making an effective checklist, it is equally important to know what to omit as what to include. Maybe there are principles across all subsurface mapping endeavors that can be extracted and placed on a checklist: "If your map has faults, check this and this. If your map has subcrops, check this and this. If your map is in the depth domain, what is the error in the velocities? If there are wells, label the mis-ties." And so on.
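Conditional principles like these could be captured as data rather than prose. A minimal sketch, in which every item and every map attribute is hypothetical; the point is the structure, where each check applies only when its condition holds:

```python
# A hypothetical map-QC checklist: each item pairs a condition on the
# map with the question to ask when that condition holds.

CHECKLIST = [
    (lambda m: m["has_faults"],        "Do fault intersections match the seismic?"),
    (lambda m: m["has_subcrops"],      "Are subcrop edges geometrically consistent?"),
    (lambda m: m["domain"] == "depth", "What is the error in the velocities?"),
    (lambda m: m["n_wells"] > 0,       "Are the well mis-ties labelled?"),
]

def applicable_checks(map_meta):
    """Return only the checklist questions that apply to this map."""
    return [question for condition, question in CHECKLIST
            if condition(map_meta)]

# Example map metadata (made up for illustration):
my_map = {"has_faults": True, "has_subcrops": False,
          "domain": "depth", "n_wells": 12}

for question in applicable_checks(my_map):
    print("[ ]", question)
```

The omission principle falls out naturally: a map with no subcrops never shows the subcrop question, so the list stays short enough to actually be used.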

I would love to do some more blogging, or present examples addressing questions like "Why does this map fail?" or "What makes this a better map than that one?" We might want to let actual examples lead the discussion, whenever possible.

So, if you are willing, let's keep the ball rolling, either in an editable document (like Google documents), or perhaps SubSurfWiki.

May 3, 2012 | Registered CommenterEvan Bianco

Evan - thanks for following up on my ramblings, and I'm pleased that you find the idea of interest.

Yes, I believe there are basic principles for subsurface mapping that could be built into a checklist. For example, since a map and a seismic section are but two different slices through the same three-dimensional architecture, if two faults intersect on a map then they must also intersect somewhere in the seismic data (and vice versa), and the plunge of that line of intersection must be consistent. Remember the old "down-structure method of viewing geological maps"? That was a hugely valuable tool for thinking in 3D, but it seems to have been lost.

And your point about iteration gets to the heart of this issue - the time to do so, and the associated value of "multiple working hypotheses" seems to be neither available nor encouraged.

I would very much enjoy continuing this discussion - but you have to help an old luddite with how to best use the kinds of platforms you suggest.

May 5, 2012 | Unregistered CommenterMichael

Those working in drilling operations will be familiar with Emergency Response plans (cf. Silver Fox), which are built around well-defined workflows and checklists (call Country Manager on this phone number; status update from rig; advise Emergency Response Team to assemble; status update from rig; Country Manager to call this government department, etc.). The human element always plays a critical role in the success of these workflows, and the workflows and checklists, defined on paper, need to be flexible enough (along with the personnel involved) to adapt to each unique situation (cf. Toastar's cat-skinning analogy: "so many ways to…" allows the flexibility to adapt to new problems as they arise, i.e. if A, then do X; if B, then do Y decision points).
As for maps, good points raised by Michael and Evan. Workflow and checklist flexibility are also required. All maps presented raise questions, as Mike comments. Knowing where the hard data is, is critical to QCing any map or interpretation (cf. Evan). There are a few map presentation techniques which allow some of these questions to be addressed quickly, some of which are identified by Mike and Evan:
- Where grid-flexing has not been used to force a grid to match well tops, posting well tops and gridded horizon depths at the well intersection allows some evaluation of gridding uncertainty.
- Seismic layouts: 2D seismic layouts (potentially with interpretation ribbons) or 3D coverage polygons (with autopicked seed-line interpretation ribbons) should be displayed. With the seismic layout, especially on 2D interpretations, it is very easy to identify alternate fault linkages within the 2D line spacing, highlighting areas which require particular structural review.
- Alternate fault linkages can be shown by displaying one set of fault polygons (set A) on the map, copying and editing it (as set B), and displaying it in a different colour behind set A. The different polygon sets can be traffic-light coloured to identify the level of uncertainty in the fault interpretation, the data control, the resulting resources on a structure, the risk, etc.
- The 'dotted line' is always useful in defining pinchouts, fault heaves, subcrops (a particular kind of pinchout!), etc. Most software can display these, but interpreters don't use them enough, as Mike rightly identifies.
- Posting fault throws along faults allows some review of whether a single fault polygon should actually be two discrete faults with some sort of 'soft' linkage.
- Software should allow easier draughting of alternate models (dashed-line pinchouts; alternate fault polygon sets).
For those working in the 'guts' of the data, there is a tendency to put more data on a single map display. For management decision-making, there is a requirement to focus on a single specific problem, so the maps are 'cleaned up' to focus the discussion. The more data you post, the less clear the map becomes for management; the less data posted, the less clear it becomes for the technical worker. Maps unfortunately have to be tailored to the audience, so the map title is important to highlight the purpose of the map.
A map is rarely presented on its sole merit nowadays, with type dip and strike lines defining the structural model, stratigraphy definition, well intersections, horizon interpretation and gridding, and fault modeling all now done interactively in the 3D workspace. The workstation itself, culminating in the 3D/4D immersion rooms (complete with special goggles!), is an attempt to put all the data in one interactive place with the whole interpretation team to solve the 3D problem. However, we are still a long way from applying this as an industry standard, due to costs, time, scale of venture, multiple models, etc., and from understanding the limits of the data and the alternate models which can be derived from it (cf. Mike's ever-important 'multiple working hypotheses').
For myself, it is important to remember that good creative exploration is developed in the ‘data-undefined’ spaces, beyond the limits of the currently available data. Understanding the gridded-over ‘data-holes’, and the regression-fitted sampled datasets in the machine models, and developing alternate models which still fit current data, is where significant exploration opportunity lies.

May 6, 2012 | Unregistered CommenterMark

Mark - many interesting points there. And I particularly like your concluding paragraph - "creative exploration" is what it should be all about, but it is often forgotten that creativity requires intellectual discipline, interrogating the data in order to find the opportunities in the spaces in between.

Do you think that there is somewhere useful to go with this whole discussion?

May 9, 2012 | Unregistered CommenterMichael
