The big data eye-roll

First, let's agree on one thing: 'big data' is a half-empty buzzword. It's shorthand for 'more data than you can look at', but really it's more than that: it branches off into other hazy territory like 'data science', 'analytics', 'deep learning', and 'machine intelligence'. In other words, it's not just 'large data'. 

Anyway, the buzzword doesn't bother me too much. What bothers me is when I talk to people at geoscience conferences about 'big data', about half of them roll their eyes and proclaim something like this: "Big data? We've been doing big data since before these punks were born. Don't talk to me about big data."

This is pretty demonstrably a load of rubbish.

What the 'big data' movement is trying to do is not acquire loads of data then throw 99% of it away. They are not processing it in a purely serial pipeline, making arbitrary decisions about parameters on the way. They are not losing most of it in farcical enterprise data management train-wrecks. They are not locking most of their data up in such closed systems that even they don't know they have it.

They are doing the opposite of all of these things.

If you think 'big data', 'data science', and 'machine learning' are old hat in geophysics, then you have some catching up to do. Sure, we've been toying with simple neural networks for years, e.g. probabilistic neural nets with one hidden layer — though that approach is very, very far from being mainstream in the subsurface — but today this is child's play. Over and over, and increasingly so in the last 3 years, people are showing how new technology — built specifically to handle the special challenges that terabytes bring — can transform any quantitative endeavour: social media and online shopping, sure, but also astronomy, robotics, weather prediction, and transportation. These technologies will show up in petroleum geoscience and engineering. They will eat geostatistics for breakfast. They will change interpretation.

So when you read that Google has open sourced its TensorFlow deep learning library (9 November), or that Microsoft has too (yesterday), or that Facebook has too (months ago), or that Airbnb has too (in August), or that there are a bazillion other super easy-to-use packages out there for sophisticated statistical learning, you should pay a whole heap of attention! Because machine learning is coming to subsurface.
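
To give a sense of how low the barrier has become, here's a minimal sketch using scikit-learn. It's not any particular published workflow, and the 'log curves' and 'facies' below are synthetic stand-ins — swap in your own data and labels.

```python
# Minimal supervised-learning sketch with scikit-learn. The features and
# labels are synthetic stand-ins for, say, well-log curves and facies labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 3))                    # stand-in for three log curves
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # stand-in facies labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

That's it — a dozen lines. The hard parts are the data wrangling and the validation, not the algorithm.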

Moving ahead with social interpretation

After quietly launching Pick This — our social image interpretation tool — in February, we've been busily improving the tool and now we're moving into 2016 with a plan for world domination. I summed up the first year of development in one of the interpretation sessions at SEG 2015. Here's a 13-minute version of my talk:

In 2016 we'll be exploring ways to adapt the tool to in-house corporate use, mainly by adding encryption and private groups. This way, everyone with email addresses at the same company domain, say, would be connected to each other, and their stuff would only be shared within the group, not with the general public.

Some other functionality is on the list of things to do:

  • Other types of interpretation than points, lines and polygons.
  • Ways to find content more easily, for example with tags like 'Seismic' or 'Outcrop'.
  • Ways to follow individuals, or get notifications of new interpretations on an image.
  • More ways to visualize and generally get at the data Pick This produces.

We're always open to suggestions. Please get in touch if you have a neat idea!

What now?

Times are rock hard in industry right now.

If you have a job, you're lucky — you have probably already survived one round of layoffs. There will likely be more, especially when the takeovers start, which they will. I hope you survive those too. 

If you don't have a job, you probably feel horrible, but of course that won't get you anywhere. I heard one person call it an 'involuntary sabbatical', and I love that: it's the best chance you'll get to re-invent, re-learn, and find new direction. 

If you're a student, you must be looking out over the wasteland and wondering what's in store for you. What on earth?

More than one person has asked me recently about Agile. "You got out," they say, "how did you do it?" So instead of bashing out another email, I thought I'd blog about it.

Consulting in 2015

I didn't really get out, of course, I just quit and moved to rural Nova Scotia.

Living out here does make it harder to make a living, of course, and things on this side of the fence, so to speak, are pretty gross too, I'm afraid. Talking to others at SEG suggested that I'm not alone among small companies in this view. A few of the larger outfits seem to be doing well — IKON and GeoTeric, for instance — but they also have product, which at least offers some income diversity.

Agile started as a 100% bootstrapped effort to be a consulting firm that's more directly useful to individual professional geoscientists than anyone else. Most firms target corporate accounts and require permission, a complicated contract, an AFE, and 3 months of bureaucracy to hire. It turns out that professionals are unable or unwilling to engage at that grass-roots level, though — almost everyone assumes you really do need permission, contracts, AFEs, and so on to get hired in any capacity, even for something as small as "Help me tie this well." So usually we are hired into larger, longer-term projects, just like anyone else.

I still think there's something in this original idea — the Uberification of consulting services, if you will — maybe we'll try again in a few years.

But if you are out of work and were thinking of getting out there as a consultant, I'm an optimistic person but unless you are very well known (for being awesome), it's hard for me to honestly recommend even trying. It's just not the reality right now. We've been lucky so far, because we work in geothermal and government as well as in petroleum, but oil & gas was over half our revenue last year. It will be about 0% of it this year, maybe slightly less.

The transformation of Agile

All of which is to explain why we are now, since January, consciously and deliberately turning ourselves into a software technology R&D company. The idea is to be less dependent on our dysfunctional industry, and less dependent on geotechnical work. We build new tools for hard problems — data problems, interpretation problems, knowledge sharing problems. And we're really good at it.

We hired another brilliant programmer in August, and we're all learning more every day about our playground of scientific computing and the web — machine learning, cloud services, JavaScript frameworks, and so on. The first thing we built is still in active development, and our latest project is built around another of our tools; I hope it works out, because it's the most fun I've had on a project in ages. Maybe these projects spin out of Agile, maybe we keep them in-house.

So that's our survival plan: invent, diversify, and re-tool like crazy. And keep blogging.

F**k it

Some people are saying, "things will recover, sit it out" but I think that's awful — the very worst — advice. I honestly think your best bet right now* is to find an accomplice, set aside 6 months and some of your savings, push everything off your desk, and do something totally audacious. 

Something you can't believe no-one has thought of doing yet.

Whatever it was you just thought of — that's the thing.

You might as well get started.

* Unless you have just retired, are very well connected in industry, have some free time, and want to start a new, non-commercial project that will profoundly benefit the subsurface community for the next several decades at least. Because I'd like to talk to you about another audacious plan...

Notes from a hackathon

The spirit of invention is alive and well in exploration geophysics! Last weekend, Agile hosted the 3rd annual Geophysics Hackathon at Propeller, a large and very cool co-working space in New Orleans, Louisiana.

A community of creative scientists

Commensurate with the lower-than-usual turnout at the SEG Annual Meeting, which our event preceded, we had 15 hackers. Not all of them were competing — some were hanging out, self-teaching, or hacking around with code.

As in Denver, we had an amazing showing from Colorado School of Mines, with 6 participants. I don't know what's in the water over there in the Rockies, or what the profs have been feeding these students, but it works. Such smart, creative talent. But it can't stay this one-sided... one day we'll provoke Stanford into competitive geophysics programming.

Other than the Mines crew, we had one other student (Agile's Ben Bougher, who's at UBC), the dynamic wiki duo from SEG, and the rest were professional geoscientists from large and small companies, so it was pretty well balanced between academia and industry.

Thank you

As always, we are indebted to the sponsors and supporters of the hackathon. The event would be impossible without their financial support, and much less fun without their eager participation. This year we teamed up with three companies:

  • OpenGeoSolutions, a fantastic group of geophysicists based in Calgary. You won't find better advice on signal processing problems. Jamie Alison and Greg Partyka also regularly do us the honour of judging our hackathon demos, which is wonderful.
  • EMC, a huge cloud computing company, generously supported us through David Holmes, their representative for our industry, and a fellow Landmark alum. David also kindly joined us for much of the hackathon, including the judging, which was great for the teams.
  • Palladium Consulting, a Houston-based bespoke software house run by Sebastian Good, were a new sponsor this year. Sebastian reached out to a New Orleans friend and business partner of his, Graham Ganssle, to act as a judge, and he was beyond generous with his time and insight all weekend. He also acted as a rich source of local knowledge.

Although he craves no spotlight, I have to recognize the personal generosity of Karl Schleicher of UT Austin, who is one of the most valuable assets our community has. His tireless promotion of open data and open source software is an inspiration.

And finally, Maitri Erwin again visited to judge the demos on Sunday. She brings the perfect blend of a deep and rigorous expertise in exploration geoscience and a broad and futuristic view of technology in the service of humankind. 

I will do a round up of the projects in the next couple of weeks. Look out for that because all of the projects this year were 'different'. In a good way.

If this all sounds like fun, mark your calendars for 2016! I think we're going to try running it after SEG next year, so set aside 22 and 23 October 2016, and we'll see you there. Bring a team!

PS You can already sign up for the hackathon in Europe at EAGE next year!

More highlights from SEG

On Monday I wrote that this year's Annual Meeting seemed subdued. And so it does... but as SEG continued this week, I started hearing some positive things. Vendors seemed pleasantly surprised that they had made some good contacts, perhaps as many as usual. The technical program was as packed as ever. And of course the many students here seemed to be enjoying themselves as much as ever. (New Orleans might be the coolest US city I've been to; it reminds me of Montreal. Sorry Austin.)

Quieter acquisition

Pramik et al. (of Geokinetics) reported on a new marine vibrator acquisition using their AquaVib source. This instrument has been around for a while; indeed, it was first tested over 20 years ago by IVI and later by Geco (e.g. see J Bird, TLE, June 2003). If perfected, it will allow for much quieter marine seismic acquisition, reducing harm to marine mammals, with no loss of quality (the images below are from their abstract and are © SEG):

Ben told me one of his favourite talks was Schostak & Jenkerson's report from a JIP (Shell, ExxonMobil, Total, and Texas A&M) trying to build a new marine vibrator. Three designs are being tested by the current consortium, manufactured respectively by PGS (an electrical model), APS (a mechanical piston), and Teledyne (a bubble resonator).

In other news:

  • Talks at Dallas 2016 will only be 15 minutes long. Hopefully this is to allow room in the schedule for something else, not just more talks.
  • Dave Hale has retired from Colorado School of Mines, and apparently now 'writes software with Dean Witte'. So watch out for that!
  • A sure sign of industry austerity: "Would you like Bud Light, or Miller Light?"
  • Check out the awesome ribbons that some clever student thought of. I'm definitely pinching that idea.

That's all I have for now, and I'm flying home today, so that's it for SEG 2015. I will be reporting on the hackathon soon, I promise, and I'll try to get my paper on Pick This recorded next week (but here's a sneak peek). Stay tuned!


Bill Pramik, M. Lee Bell, Adam Grier, and Allen Lindsay (2015) Field testing the AquaVib: an alternate marine seismic source. SEG Technical Program Expanded Abstracts 2015: pp. 181-185. doi: 10.1190/segam2015-5925758.1

Brian Schostak and Mike Jenkerson (2015) The Marine Vibrator Joint Industry Project. SEG Technical Program Expanded Abstracts 2015: pp. 4961-4962. doi: 10.1190/segam2015-6026289.1

Monday highlights from SEG

Ben and I are in New Orleans at the 2015 SEG Annual Meeting, a fittingly subdued affair, given the industry turmoil recently. Lots of people are looking for work, others are thankful to have it.

We ran our annual Geophysics Hackathon over the weekend; I'll write more about that later this week. In a nutshell: despite a low-ish turnout, we had 6 great projects, all of them quite different from anything we've seen before. Once again, Colorado School of Mines dominated.

Beautiful maps

One of the most effective ways to make a tight scientific argument is to imagine trying to convince the most skeptical person you know that your method works. When it comes to seismic attribute analysis, I am that skeptical person.

Some of the nicest images I saw today were in the 'Attributes for Stratigraphic Analysis' session, chaired by Rupert Cole and Yuefeng Sun. For example, Tao Zhao, one of Kurt Marfurt's students, showed some beautiful images from the Waka 3D offshore New Zealand (Zhao & Marfurt). He used 2D colourmaps to co-render two attributes together, along with semblance mapped to opacity on a black layer, and the results were very nice to look at. However, I was left wondering, and not for the first time, how we can do a better job of calibrating those maps to geology. We (the interpretation community) need to stop side-stepping that issue; it's central to our credibility. Even if you have no wells, as in this study, you can still use forward models, analogs, or at least interpretation by a sedimentologist — preferably two.
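
If you're curious how that kind of display is put together, here's a toy version — certainly not Zhao & Marfurt's code, just the general idea: two attributes index a 2D colourmap (one drives hue, the other lightness), and semblance controls the fade towards a black layer.

```python
# Toy co-rendering with a 2D colourmap plus a semblance-driven black layer.
# All three attributes here are synthetic stand-ins.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import hsv_to_rgb

ny, nx = 200, 200
y, x = np.mgrid[0:ny, 0:nx]
attr1 = np.sin(x / 20.0)                                    # e.g. peak spectral frequency
attr2 = np.cos(y / 30.0)                                    # e.g. peak spectral magnitude
semb = 1 - np.exp(-((x - 100)**2 + (y - 100)**2) / 4000.0)  # stand-in semblance

def norm(a):
    return (a - a.min()) / (np.ptp(a) + 1e-12)

hsv = np.dstack([0.7 * norm(attr1),          # attribute 1 -> hue
                 np.ones_like(attr1),        # full saturation
                 norm(attr2)])               # attribute 2 -> lightness
rgb = hsv_to_rgb(hsv)
rgb *= norm(semb)[..., None]                 # low semblance fades to the black layer

plt.imshow(rgb)
plt.axis('off')
plt.show()
```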

© SEG and Zhao & Marfurt. Left to right: Peak spectral frequency and peak spectral magnitude; GLCM homogeneity; shape index and curvedness. All of the attributes are also corendered with Sobel edge detection.


Pavel Jilinski at GeoTeric gave a nice talk (Calazans Muniz et al.) about applying these sorts of fancy displays to a large 3D dataset in Brazil, in a collaboration with Petrobras. The RGB displays of spectral attributes were as expected, but I had not seen their cyan-magenta-yellow (CMY) discontinuity displays before. They map dip to the yellow channel, similarity to the magenta channel, and 'tensor discontinuity' to the cyan channel. No, I don't know what that means either, but the displays were pretty cool.
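
The CMY blend itself is easy to mock up if you want to experiment — this is only my guess at the general recipe, not GeoTeric's implementation: put each discontinuity attribute into one of the cyan, magenta, and yellow channels, then convert to RGB for display.

```python
# Toy CMY blend of three discontinuity attributes (stand-in random fields here).
# The CMY-to-RGB conversion is simply R, G, B = 1 - C, 1 - M, 1 - Y.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
dip = rng.random((200, 200))        # stand-in for dip
sim = rng.random((200, 200))        # stand-in for similarity
ten = rng.random((200, 200))        # stand-in for 'tensor discontinuity'

cmy = np.dstack([ten, sim, dip])    # cyan, magenta, yellow channels
rgb = 1.0 - cmy                     # complementary conversion for display

plt.imshow(rgb)
plt.axis('off')
plt.show()
```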

Publications news

This evening we enjoyed the Editor's Dinner (I coordinate a TLE column and review for Geophysics and Interpretation, so it's totally legit). Good things are coming to the publication world: adopted Canadian Mauricio Sacchi is now Editor-in-Chief, there are no more page charges for colour in Geophysics (up to 10 pages), and watch out for video abstracts next year. Also, Chris Liner mentioned that Interpretation gets 18% of its submissions from oil companies, compared to only 5% for Geophysics. And I heard, but haven't verified, that downturns result in more papers. So at least our journals are healthy. (You do read them, right?)

That's it for today (well, yesterday). More tomorrow!


Calazans Muniz, Moises, Thomas Proença, and Pavel Jilinski (2015). Use of Color Blend of seismic attributes in the Exploration and Production Development - Risk Reduction. SEG Technical Program Expanded Abstracts 2015: pp. 1638-1642. doi: 10.1190/segam2015-5916038.1

Zhao, Tao, and Kurt J. Marfurt (2015). Attribute assisted seismic facies classification on a turbidite system in Canterbury Basin, offshore New Zealand. SEG Technical Program Expanded Abstracts 2015: pp. 1623-1627. doi: 10.1190/segam2015-5925849.1

The Rock Property Catalog again

Do you like data? Data about rocks? Open, accessible data that you can use for any purpose without asking? Read on.

After writing about anisotropy back in February, and then experimenting with storing rock properties in SubSurfWiki later that month, a few things happened:

  • The server I run the wiki on — legacy Amazon AWS infrastructure — crashed, and my backup strategy turned out to be <cough> flawed. It's now running on state-of-the-art Amazon servers. So my earlier efforts were mostly wiped out... Leaving the road clear for a new experiment!
  • I came across an amazing resource called Mudrock Anisotropy, or — more appealingly — Mr Anisotropy. Compiled by Steve Horne, it contains over 1000 records of rocks, gathered from the literature. It is also public domain and carries only a disclaimer. But it's a spreadsheet, and emailing a spreadsheet around is not sustainable.
  • The Common Ground database that was built by John A. Scales, Hans Ecke and Mike Batzle at Colorado School of Mines in the late 1990s, is now defunct and has been officially discontinued, as of about two weeks ago. It contains over 4000 records, and is public domain. The trouble is, you have to restore a SQLite database to use it.

All this was pointing towards a new experiment. I give you: the Rock Property Catalog again! This time it contains not 66 rocks, but 5095 rocks. Most of them have \(V_\mathrm{P}\), \(V_\mathrm{S}\) and  \(\rho\). Many of them have Thomsen's parameters too. Most have a lithology, and they all have a reference. Looking for Cretaceous shales in North America to use as analogs on your crossplots? There's a rock for that.

As before, you can query the catalog in various ways, either via the wiki or via the web API. Let's say we want to find shales with a velocity over 5000 m/s. You have a few options:

  1. Go to the semantic search form on the wiki and type [[lithology::shale]][[vp::>5000]]
  2. Make a so-called inline query on your own wiki page (you need an account for this).
  3. Make a query via the web API with a rather long URL:[[RPC:%2B]][[lithology::shale]][[Vp::>5000]]|%3FVp|%3FVs|%3FRho&format=jsonfm

I updated the Jupyter Notebook I published last time with a new query. It's pretty hacky. I'll work on this to produce a more robust method, with some error handling and cleaner code — stay tuned.
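
If you'd rather not open the notebook, the gist of a query looks something like this in Python. The endpoint URL and property names below are my assumptions about the standard Semantic MediaWiki 'ask' API on SubSurfWiki — check the wiki's API help for the real details.

```python
# Sketch of querying the Rock Property Catalog via the Semantic MediaWiki
# 'ask' API. The endpoint and property names are assumptions; verify them
# against the wiki's own API documentation.
import requests

API = "https://subsurfwiki.org/api.php"                  # assumed endpoint
query = "[[lithology::shale]][[Vp::>5000]]|?Vp|?Vs|?Rho"

params = {"action": "ask", "query": query, "format": "json"}
r = requests.get(API, params=params)
r.raise_for_status()
results = r.json().get("query", {}).get("results", {})

for name, page in results.items():
    props = page.get("printouts", {})
    print(name, props.get("Vp"), props.get("Vs"), props.get("Rho"))
```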

The database supports lots of properties, including:

  • Citation and reference
  • Description, lithology, colour (you can have pictures if you want!)
  • Location, lat/lon, basin, age, depth
  • Vp, Vs, \(\rho\), as well as \(\rho_\mathrm{dry}\) and \(\rho_\mathrm{grain}\)
  • Thomsen's \(\epsilon\), \(\delta\), and \(\gamma\)
  • Static and dynamic Young's modulus and Poisson ratio
  • Confining pressure, pore pressure, effective stress, axial stress
  • Frequency
  • Fluid, saturation type, saturation
  • Porosity, permeability, temperature
  • Composition

There is more from the Common Ground data to add, especially photographs. But for now, I'd love some feedback: is this the right set of properties? Do we need more? I want this to be useful — what kind of data and metadata would you like to see? 

I'll end with the usual appeal — I'm open to any kind of suggestions or help with this. Perhaps you can contribute new rocks, or a paper containing data? Or maybe you have some wiki skills, or can help write bots to improve the data? What can you bring? 

What is AVO-friendly processing?

It's the Geophysics Hackathon next month! Come down to Propeller in New Orleans on 17 and 18 October, and we'll feed you and give you space to build something cool. You might even win a prize. Sign up — it's free!

Thank you to the sponsors, OpenGeoSolutions and Palladium Consulting — both fantastic outfits. Hire them.

AVO-friendly processing gets called various things: true amplitude, amplitude-friendly, and controlled amplitude, controlled phase (or just 'CACP'). And if you've been involved in any processing jobs, you'll have noticed that these phrases get thrown around a lot. But seismic geophysics has a dirty little secret... we don't know exactly what it is. Or, at least, we can't agree on it.

A LinkedIn discussion in the Seismic Data Processing group earlier this month prompted this post:

I can't compile a list of exactly which processes will harm your AVO analysis (can anyone? Has anyone??), but I think I can start a list of things that you need to approach with caution and skepticism:

  • Anything that is not surface consistent. What does that mean? (There's a toy decomposition sketch after this list.) According to Oliver Kuhn (now at Quantec in Toronto):
Surface consistent: a shot-related [process] affects all traces within a shot gather in the same way, independent of their receiver positions, and, a receiver-related [process] affects all traces within a receiver gather in the same way, independent of their shot positions.
  • Anything with a window — spatial or temporal. If you must use windows, make them larger or longer than your areas and zones of interest. In this way, relative effects should be preserved.
  • Anything that puts the flattening of gathers before the accuracy of the data (<cough> trim statics). Some flat gathers don't look flat. (The thumbnail image for this post is from Duncan Emsley's essay in 52 Things.)
  • Anything that is a sort of last resort, post hoc attempt to improve the data — what we might call 'cosmetic' treatments. Things like wavelet stretch correction and spectral shaping are good for structural interpreters, but not for seismic analysts. At the very least, get volumes without them, and convince yourself they did no harm.
  • Anything of which people say, "This should be fine!" but offer no evidence.
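
Here's the toy sketch promised in the first bullet: a bare-bones surface-consistent amplitude decomposition, solving for one log-amplitude factor per shot and one per receiver by least squares. Real implementations also carry offset and CDP terms, and iterate — treat this purely as an illustration of the principle.

```python
# Toy surface-consistent amplitude decomposition: model the log-amplitude of
# each trace as (source term) + (receiver term) and solve by least squares.
import numpy as np

rng = np.random.default_rng(1)
n_src, n_rec = 20, 30
true_src = rng.normal(0, 0.3, n_src)      # per-shot log-amplitude factors
true_rec = rng.normal(0, 0.3, n_rec)      # per-receiver log-amplitude factors

# One trace per shot-receiver pair; the noise stands in for everything else.
src_idx, rec_idx = np.meshgrid(np.arange(n_src), np.arange(n_rec), indexing='ij')
src_idx, rec_idx = src_idx.ravel(), rec_idx.ravel()
log_amp = true_src[src_idx] + true_rec[rec_idx] + rng.normal(0, 0.05, src_idx.size)

# Design matrix: one column per shot and per receiver, two 1s in each row.
A = np.zeros((log_amp.size, n_src + n_rec))
A[np.arange(log_amp.size), src_idx] = 1
A[np.arange(log_amp.size), n_src + rec_idx] = 1
est, *_ = np.linalg.lstsq(A, log_amp, rcond=None)

# The split between shot and receiver terms is ambiguous by a constant,
# so compare the de-meaned factors with the true ones.
est_src, est_rec = est[:n_src], est[n_src:]
print(np.allclose(est_src - est_src.mean(), true_src - true_src.mean(), atol=0.05))
```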

Back to my fourth point there... spectral shaping and wavelet stretch correction (e.g. this patented technique I was introduced to at ConocoPhillips) have been the subject of quite a bit of discussion, in my experience. I don't know why; both are fairly easy to model, on the face of it. The problem is that we start to get into the sticky question of what wavelets 'see' and what's a wavelet anyway, and hang on a minute why does seismic reflection even work? Personally, I'm skeptical, especially as we get more used to, and better at, looking at spectral decompositions of stacked and pre-stack data.
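
For what it's worth, the stretch effect really is easy to model — here's a minimal sketch of the mechanism (nothing to do with any patented method): stretch a Ricker wavelet in time, as NMO does on the far offsets, and its peak frequency drops.

```python
# Minimal model of NMO wavelet stretch: stretching a wavelet in time by a
# factor S compresses its spectrum by the same factor.
import numpy as np

dt = 0.002
t = np.arange(-0.256, 0.256, dt)
f0 = 30.0                                         # 30 Hz Ricker on the near traces
w_near = (1 - 2 * (np.pi * f0 * t)**2) * np.exp(-(np.pi * f0 * t)**2)

stretch = 1.3                                     # say, 30% stretch at far offsets
w_far = np.interp(t / stretch, t, w_near)         # the same wavelet, stretched in time

freqs = np.fft.rfftfreq(t.size, dt)
for label, w in [("near", w_near), ("far, stretched", w_far)]:
    spec = np.abs(np.fft.rfft(w))
    print(f"{label}: peak frequency ~ {freqs[np.argmax(spec)]:.1f} Hz")
```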

Divergent paths

I have seen people use seismic data with very different processing paths for structural interpretation and for AVO analysis. This can happen on long-term projects, where the structural framework depends on an old post-stack migration that was later reprocessed for AVO friendliness. This is a bad idea — you won't be able to put the quantitative results into the structural framework without introducing substantial error.

What we need is a clinical trial of processing algorithms, in which they are tested against a known model like Marmousi, and their effect on attributes is documented. If such studies exist, I'd love to hear about them. Come to think of it, this would make a good topic for a hackathon some day... Maybe Dallas 2016?

The hack is back: learn new skills in New Orleans

Looking for a way to broaden your skills for the next phase of your career? Need some networking that isn't just exchanging business cards? Maybe you just need a reminder that subsurface geoscience is the funnest thing ever? I have something for you...

It's the third Geophysics Hackathon! The most creative geoscience event of the year. Completely free, as always, and fun for everyone — not just programmers. So mark your calendar for the weekend of 17 and 18 October, sign up on your own or with a team, and come to New Orleans for the most creative 48 hours of your career so far.

What is a hackathon?

It's a fun, 2-day event full of geophysics and tech. Most people participate in teams of up to 4 people, but you can take part on your own too. There's plenty of time on the first morning to find projects to work on, or maybe you already have something in mind. At the end of the second day, we show each other what we've been working on with a short demo. There are some fun prizes for especially interesting projects.

You don't have to be a programmer to join the fun. If you're more into geological interpretation, or reservoir engineering, or graphic design, or coming up with amazing ideas — there's a place for you at the hackathon. 


  • How much does it cost? It's completely free!
  • I don't believe you. Believe it. Coffee and tacos will be provided. Just bring a laptop.
  • When is it? 17 and 18 October, doors open at 8 am each day, and we go till about 5.30.
  • So I won't miss the SEG Icebreaker? No, we'll all go!
  • Where is it? Propeller, 4035 Washington Avenue, New Orleans
  • How do I sign up? Find out more and register for the event at

Being part of it all

If this all sounds awesome to you, and you'll be in New Orleans this October, sign up! If you don't think it's for you, please drop in for a visit and a coffee — give me a chance to convince you to sign up next time.

If you own or work for an organization that wants to see more innovation in the world, please think about sponsoring this event, or a future one.

Last thing: I'd really appreciate any signal boost you can offer — please consider forwarding this post to the most creative geoscientist you know, especially if they're in the Houston and New Orleans area. I'm hoping that, with your help, this can be our biggest event ever.

How to QC a seismic volume

I've had two emails recently about quality checking seismic volumes. And last month, this question popped up on LinkedIn:

We have written before about making a data quality volume for your seismic — a handy way to incorporate uncertainty into risk maps — but these recent questions seem more concerned with checking a new volume for problems.

First things first

Ideally, you'd get to check the volume before delivery (at the processing shop, say), otherwise you might have to actually get it loaded before you can perform your QC. I am assuming you've already been through the processing, so you've seen shot gathers, common-offset gathers, etc. This is all about the stack. Nonetheless, the processor needs to prepare some things:

  • The stack volume, of course, with and without any 'cosmetic' filters (e.g. f-xy, f-k).
  • A semblance (coherency, similarity, whatever) volume.
  • A fold volume.
  • Make sure the processor has some software that can rapidly scan the data, plot amplitude histograms, compute a spectrum, pick a horizon, and compute phase. If not, install OpendTect (everyone should have it anyway), or you'll have to load the volume yourself.

There are also some things you can do ahead of time. 

  1. Be part of the processing from the start. You don't want big surprises at this stage. If a few lines got garbled during file creation, no problem. If there's a problem with ground-roll attenuation, you're not going to be very popular.
  2. Make sure you know how the survey was designed — where the corners are, where you would expect live traces to be, and which way the shot and receiver lines went (if it was an orthogonal design). Get maps, take them with you.
  3. Double-check the survey parameters. The initial design was probably changed. The PowerPoint presentation was never updated. The processor probably has the wrong information. General rule with subsurface data: all metadata is probably wrong. Ideally, talk to someone who was involved in the planning of the survey.
  4. You didn't skip (3), did you? I'm serious: double-check everything.

Crack open the data

OK, now you are ready for a visit with the processor. Don't fall into the trap of looking at the geology though — it will seduce you (it's always pretty, especially if it's the first time you've seen it). There is work to do first.

  1. Check the cornerpoints of the survey. I like the (0, 0) trace at the SW corner. The inline and crossline numbering should be intuitive and simple. Make sure the survey is the correct way around with respect to north.
  2. Scan through timeslices. All of them. Is the sample interval what you were expecting? Do you reach the maximum time you expected, based on the design? Make sure the traces you expect to be live are live, and the ones you expect to be dead are dead (a quick way to automate that check is sketched after this list). Check for acquisition footprint. Start with greyscale, then try another colourmap.
  3. Repeat (2) but in a similarity volume (or semblance, coherency, whatever). Look for edges, and geometric shapes. Check again for footprint.
  4. Look through the inlines and crosslines. These usually look OK, because it's what processors tend to focus on.
  5. Repeat (4) but in a similarity volume.
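
Here's the quick check I mentioned in step 2 — collapse the volume to a map of per-trace RMS amplitude. Dead traces show up as zeros, and acquisition footprint often shows up as stripes. The array below is a synthetic stand-in; load your own volume however you like (e.g. with segyio).

```python
# Map of per-trace RMS amplitude for spotting dead traces and footprint.
# Assumes the stack is a NumPy array shaped (inlines, crosslines, samples).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 300, 500)).astype(np.float32)  # stand-in volume
data[10:20, 50:60, :] = 0.0                                 # some 'dead' traces

rms = np.sqrt(np.mean(data.astype(np.float64)**2, axis=-1))
print(f"Dead traces: {np.count_nonzero(rms == 0)} of {rms.size}")

plt.imshow(rms.T, cmap='gray', origin='lower', aspect='auto')
plt.xlabel('inline'); plt.ylabel('crossline'); plt.title('Per-trace RMS amplitude')
plt.colorbar(label='RMS')
plt.show()
```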

Dive into the details

  1. Check some spectrums. Select some subsets of the data — at least 100 traces and 1000 ms from shallow, deep, north, south, east, west — and check the average spectrums (a sketch of this check, and the next one, follows this list). There should be no conspicuous notches or spikes, which could be signs of all sorts of things from poorly applied filters to reverberation.
  2. Check the amplitude histograms from those same subsets. It should be 32-bit data — accept no less. Check the scaling — the numbers don't mean anything, so you can make them range over whatever you like. Something like ±100 or ±1000 tends to make for convenient scaling of amplitude maps and so on; ±1.0 or less can be fiddly in some software. Check for any departures from an approximately Laplacian (double exponential) distribution: clipping, regular or irregular spikes, or a skewed or off-centre distribution.
  3. Interpret a horizon and check its phase. See Purves (The Leading Edge, October 2014) or SubSurfWiki for some advice.
  4. By this time, the fold volume should yield no surprises. If any of the rest of this checklist throws up problems, the fold volume might help troubleshoot.
  5. Check any other products you asked for. If you asked for gathers or angle stacks (you should), check them too.
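
And a sketch of the spectrum and histogram checks from steps 1 and 2, assuming you've pulled a subset of traces into a NumPy array of shape (traces, samples). The Laplacian noise below is just a stand-in for real data.

```python
# Average amplitude spectrum and amplitude histogram for a subset of traces.
# Look for notches or spikes in the spectrum, and for clipping or skew in the
# histogram (it should be roughly Laplacian, i.e. double exponential).
import numpy as np
import matplotlib.pyplot as plt

dt = 0.002                                                  # sample interval, seconds
rng = np.random.default_rng(0)
subset = rng.laplace(scale=100, size=(200, 500)).astype(np.float32)  # stand-in traces

freqs = np.fft.rfftfreq(subset.shape[1], dt)
avg_spec = np.mean(np.abs(np.fft.rfft(subset, axis=1)), axis=0)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(freqs, 20 * np.log10(avg_spec / avg_spec.max()))
ax1.set_xlabel('Frequency (Hz)'); ax1.set_ylabel('Power (dB)')
ax1.set_title('Average spectrum')

ax2.hist(subset.ravel(), bins=201)
ax2.set_yscale('log')                                       # clipping shows as spikes at the ends
ax2.set_xlabel('Amplitude'); ax2.set_title('Amplitude histogram')

print('dtype:', subset.dtype, '| min:', subset.min(), '| max:', subset.max())
plt.tight_layout()
plt.show()
```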

Last of all, before actual delivery, talk to whoever will be loading the data about what kind of media they prefer, and what kind of file organization. They may also have some preferences for the contents of the SEG-Y file and trace headers. Pass all of this on to the processor. And don't forget to ask for All The Seismic.

What about you?

Have I forgotten anything? Are there things you always do to check a new seismic volume? Or if you're really brave, maybe you have some pitfalls or even horror stories to share...