Thursday
Aug 07 2014

The Blangy equation

After reading Chris Liner's recent writings on attenuation and negative Q — both in The Leading Edge and on his blog — I've been reading up a bit on anisotropy. The idea was to stumble a little closer to writing the long-awaited Q is for Q post in our A to Z series. As usual, I got distracted...

In his 1994 paper AVO in transversely isotropic media—An overview, Blangy (now the chief geophysicist at Hess) answered a simple question: How does anisotropy affect AVO? Stigler's law notwithstanding, I'm calling his solution the Blangy equation. The answer turns out to be: quite a bit, especially if impedance contrasts are low. In particular, Thomsen's parameter δ affects the AVO response at all offsets (except zero, of course), while ε is relatively negligible up to about 30°.

The key figure is Figure 2. Part (a) shows isotropic vs anisotropic Type I, Type II, and Type III responses:

Unpeeling the equation

Converting the published equation to Python was straightforward (well, once Evan pointed out a typo — yay editors!). Here's a snippet, with the output (here's all of it):
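Something like this sketch of the weak-anisotropy form (my paraphrase with assumed variable names, not the original snippet): the first three terms are the isotropic Aki–Richards approximation, and the last two carry the Δδ and Δε contributions.

```python
import numpy as np

def blangy_terms(vp1, vs1, rho1, d1, e1, vp2, vs2, rho2, d2, e2, theta_deg):
    """Five terms of the weak-anisotropy (VTI) reflectivity at an interface.

    Terms 1-3 are the three-term Aki-Richards approximation; terms 4 and 5
    add the Thomsen delta and epsilon contributions. Returned separately so
    the relative contribution of each term can be plotted.
    """
    theta = np.radians(np.asarray(theta_deg, dtype=float))

    # Interface averages and contrasts
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    dd, de = d2 - d1, e2 - e1
    k = (vs / vp)**2

    term1 = 0.5 * (dvp/vp + drho/rho)                              # intercept
    term2 = (0.5*dvp/vp - 4*k*dvs/vs - 2*k*drho/rho) * np.sin(theta)**2
    term3 = 0.5 * dvp/vp * np.sin(theta)**2 * np.tan(theta)**2
    term4 = 0.5 * dd * np.sin(theta)**2                            # delta term
    term5 = 0.5 * de * np.sin(theta)**2 * np.tan(theta)**2         # epsilon term
    return term1, term2, term3, term4, term5
```

Summing all five terms gives the anisotropic reflectivity; summing only the first three recovers the isotropic Aki–Richards response.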

For the plot below, I computed the terms of the equation separately for the Type II case. This way we can see the relative contributions of the terms. Note that the 3-term solution is equivalent to the Aki–Richards equation.

Interestingly, the 5-term result is almost the same as the 2-term approximation.

Reproducible results

One of the other nice features of this paper — and the thing that makes it reproducible — is the unambiguous display of the data used in the models. Often, this sort of thing is buried in the text, or not available at all. A table makes it clear:

Last thought: is it just me, or is it mind-blowing that this paper is now over 20 years old?

Reference

Blangy, JP (1994). AVO in transversely isotropic media—An overview. Geophysics 59 (5), 775–781.

Don't miss the IPython Notebook that goes with this post.

Thursday
Jul 31 2014

July linkfest

It's linkfest time again. All the links, in one handy post.

First up — I've seen some remarkable scientific visualizations recently. For example, giant ocean vortices spiralling across the globe (shame about the rainbow colourbar though). Or the trillion-particle Dark Sky Simulation images we saw at SciPy. Or this wonderful (real, not simulated) video by the Perron Group at MIT:

Staying with visuals, I highly recommend reading anything by Mike Bostock, especially if you're into web technology. He's the inventor of D3.js, a popular data viz library; here's his exploration of algorithms, from sampling to sorting. It's more conceptual than straight-up visualization of data, but no less insightful.

And I recently read about some visual goodness combined with one of my favourite subjects, openness. Peter Falkingham, a palaeontologist at the Royal Veterinary College and Brown University, has made a collection of 3D photographs of modern tracks and traces available to the world. He knows his data is more impactful when others can use it too.

Derald Smith and sedimentology

From Smith et al. (2009) in SEPM Special Publication No. 97.

The geological world was darkened by the death of Derald Smith on 18 June. I met Derald a few times in connection with working on the McMurray Formation of Alberta, Canada during my time at ConocoPhillips. We spent an afternoon examining core and seismic data, and speculating about counter-point-bars, a specialty of his. He was an intuitive sedimentologist whose contributions will be remembered for many years.

Another geological Smith is being celebrated in September at the Geological Society of London's annual William Smith Meeting. The topic this year is The Future of Sequence Stratigraphy: Evolution or Revolution? Honestly, my first thought was "hasn't that conversation been going on since 1994?", but on closer inspection, it promises to be an interesting two days on 'source-to-sink', 'landscape into rock', and some other recent ideas.

The issue of patents reared up in June when Elon Musk of Tesla Motors announced the relaxation of their patents — essentially a promise not to sue anyone using any of their patented technologies. He realizes that a world where lots of companies make electric vehicles is better for Tesla. I wrote a piece about patents in our industry.

Technology roundup

A few things that caught our eye online:

Last thing: did you know that the unit of acoustic impedance is the Rayl? Me neither. 


Previous linkfests: April, January, October.

The figure is from Smith et al. (2009), Stratigraphy of counter-point-bar and eddy accretion deposits in low-energy meander belts of the Peace–Athabasca delta, northeast Alberta, Canada. In: SEPM Special Publication No. 97, ISBN 978-1-56576-305-0, p. 143–152. It is copyright of SEPM, and used here in accordance with their terms.

Tuesday
Jul 29 2014

Graphics that repay careful study

The Visual Display of Quantitative Information by Edward Tufte (2nd ed., Graphics Press, 2001) celebrates communication through data graphics. The book provides a vocabulary and practical theory for data graphics, and Tufte pulls no punches — he suggests why some graphics are better than others, and even condemns failed ones as lost opportunities. The book outlines empirical measures of graphical performance, and describes the pursuit of graphic-making as one of sequential improvement through revision and editing. I see this book as a sort of moral authority on visualization, and as the reference book for developing graphical taste.

Through design, the graphic artist allows the viewer to enter into a transaction with the data. High performance graphics, according to Tufte, 'repay careful study'. They support discovery, probing questions, and a deeper narrative. These kinds of graphics take a lot of work, but they do a lot of work in return. In later books Tufte writes, 'To clarify, add detail.'

A stochastic AVO crossplot

Consider this graphic from the stochastic AVO modeling section of modelr. Its elements are constructed with code, and since it is a program, it is completely reproducible.

Let's dissect some of the conceptual high points. This graphic shows all the data simultaneously across 3 domains, one in each panel. The data points are sampled from probability density estimates of the physical model. It is a large dataset from many calculations of angle-dependent reflectivity at an interface. The data is revealed with a semi-transparent overlay, so that areas of certainty are visually opaque, and areas of uncertainty are harder to see.

At the same time, you can still see every data point that makes up the graphic, so it offers both a broad overview (the range and additive intensity of the lines and points) and the finer structure. We place the two modeled dimensions with templates in the background, alongside the physical model histograms. We can see, for instance, how likely we are to see a phase reversal, or a Class 3 response, given the physical probability estimates. The statistical and site-specific nature of subsurface modeling is represented in spirit. All the data has context, and all the data has uncertainty.
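The mechanics of the overlay can be sketched in a few lines (this is an illustration of the idea, not modelr's actual code, and the property distributions are invented): draw samples for the two layers, compute a two-term Shuey intercept and gradient for each sample, and plot with a low alpha so dense regions of the crossplot read as opaque.

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend so the script runs anywhere
import matplotlib.pyplot as plt

rng = np.random.default_rng(17)
n = 5000

# Hypothetical property distributions for the layers above and below
# the interface (means and spreads are illustrative only).
vp1, vs1, rho1 = rng.normal(2800, 100, n), rng.normal(1400, 80, n), rng.normal(2350, 50, n)
vp2, vs2, rho2 = rng.normal(3000, 150, n), rng.normal(1600, 100, n), rng.normal(2450, 60, n)

# Two-term Shuey: R(theta) ~ A + B sin^2(theta)
vp, vs, rho = (vp1 + vp2)/2, (vs1 + vs2)/2, (rho1 + rho2)/2
dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
A = 0.5 * (dvp/vp + drho/rho)
B = 0.5*dvp/vp - 2*(vs/vp)**2 * (drho/rho + 2*dvs/vs)

# Low alpha: where samples pile up (certainty) the plot is opaque;
# sparse regions (uncertainty) fade toward the background.
fig, ax = plt.subplots()
ax.scatter(A, B, s=2, alpha=0.05, color='k')
ax.set_xlabel('intercept A')
ax.set_ylabel('gradient B')
```

The same additive-intensity trick works in the angle-reflectivity panel: plot every sampled R(θ) curve with a low alpha and the probable responses emerge as dark bands.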

Rules for graphics that work

Tufte summarizes that excellent data graphics should:

  • Show all the data.
  • Provoke the viewer into thinking about meaning.
  • Avoid distorting what the data have to say.
  • Present many numbers in a small space.
  • Make large data sets coherent.
  • Encourage the eye to compare different pieces of the data.
  • Reveal the data at several levels of detail, from a broad overview to the fine structure.
  • Serve a reasonably clear purpose: description, exploration, tabulation, or decoration.
  • Be closely integrated with the statistical and verbal descriptions of a data set.

The data density, or data-to-ink ratio, looks reasonably high in my crossplot, but it could likely still be optimized. What would you remove? What would you add? What elements need revision?

Wednesday
Jul 23 2014

Whither technical books?

Leafing through our pile of new books on seismic analysis got me thinking about technical books and the future of technical publishing. In particular:

  • Why are these books so expensive? 
  • When will we start to see reproducibility?
  • Does all this stuff just belong on the web?

Why so expensive?

Should technical books really cost several times what ordinary books cost? Professors often ask us for discounts for modelr, our $9/mo seismic modeling tool. Students pay 10% of what pros pay in our geocomputing course. Yet academic books cost three times what consumer books cost. I know it's a volume game — but you're not going to sell many books at $100 a go! And unlike consumer books, technical authors usually don't make any money — a star writer may score 6% of net sales... once 500 books have been sold (see Handbook for Academic Authors).

Where's the reproducibility?

Compared to the amazing level of reproducibility we saw at SciPy — where the code to reproduce virtually every tutorial, talk, and poster was downloadable — books are still rather black box. For example, the figures are often drafted, not generated. A notable (but incomplete) exception is Chris Liner's fantastic (but ridiculously expensive) volume, Elements of 3D Seismology, in which most of the figures seem to have been generated by Mathematica. The crucial final step is to share the code that generated them, and he's exploring this in recent blog posts.

I can think of three examples of more reproducible geophysics in print:

  1. Gary Mavko has shared a lot of MATLAB code associated with Quantitative Seismic Interpretation and The Rock Physics Handbook. The code to reproduce the figures is not provided, and MATLAB is not really open, but it's a start.
  2. William Ashcroft's excellent book, A Petroleum Geologist's Guide to Seismic Reflection contains (proprietary, Windows only) code on a CD, so you could in theory make some of the figures yourself. But it wouldn't be easy.
  3. The series of tutorials I'm coordinating for The Leading Edge has, so far, included all the code needed to reproduce the figures, written exclusively in open languages and using open or synthetic data. Kudos to SEG!

Will the web win?

None of this comes close to Sergey Fomel's brand of fully reproducible geophysics. He is a true pioneer in this space, up there with Jon Claerbout. (You should definitely read his blog!) One thing he's been experimenting with is 'live' reproducible documents in the cloud. If we don't see an easy way to publish live, interactive notebooks in the cloud this year, we'll see them next year for sure.

So imagine being able to read a technical document, a textbook say, with all the usual features you get online — links, hover-over, clickable images, etc. But then add the ability not only to see the code that produced each figure, but also to edit and re-run it. Or add slider widgets for parameters — "What happens to the gather if I change Poisson's ratio?" Now, since you're on the web, you can share your modification with your colleagues, or the world.
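To make the Poisson's ratio example concrete, here's a minimal sketch (my own, with assumed default properties): a two-term Shuey reflectivity curve in which Vs is derived from Poisson's ratio, so a slider on the lower layer's ratio reshapes the whole curve. The one-line ipywidgets hook is shown in a comment since it only makes sense in a live notebook.

```python
import numpy as np

def shuey_two_term(poisson2, vp1=2800.0, rho1=2350.0, vp2=3000.0, rho2=2450.0,
                   poisson1=0.30, angles=np.arange(0, 31)):
    """Two-term Shuey reflectivity R(theta) = A + B sin^2(theta), with Vs
    computed from Vp and Poisson's ratio so poisson2 can drive a slider."""
    def vs_from(vp, nu):
        # Vs/Vp = sqrt((1 - 2*nu) / (2 - 2*nu)) for an isotropic solid
        return vp * np.sqrt((1 - 2*nu) / (2 - 2*nu))

    vs1, vs2 = vs_from(vp1, poisson1), vs_from(vp2, poisson2)
    vp, vs, rho = (vp1 + vp2)/2, (vs1 + vs2)/2, (rho1 + rho2)/2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1

    A = 0.5 * (dvp/vp + drho/rho)
    B = 0.5*dvp/vp - 2*(vs/vp)**2 * (drho/rho + 2*dvs/vs)
    theta = np.radians(angles)
    return A + B * np.sin(theta)**2

# In a live notebook (requires ipywidgets), one line adds the slider:
# from ipywidgets import interact
# interact(lambda nu2: plt.plot(shuey_two_term(nu2)), nu2=(0.10, 0.45, 0.01))
```

Drag the slider and the far-angle response flips sign or flattens as the Poisson's ratio contrast changes — exactly the kind of interactivity a live textbook could offer.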

Now that's a book I'd be glad to pay double for.

Some questions for you

We'd love to know what you think of technical books. Leave a comment below, or get in touch.

  • Do you purchase technical books regularly? What prompts you to buy a book?
  • What book keeps getting pulled off your shelf, and which ones collect dust?
  • What's missing from the current offerings? Workflows, regional studies, atlases,...?
  • Would you rather just consume everything online? Do you care about reproducibility?

400 posts

The last post was our 400th on this blog. At an average of 500 words, that's about 200,000 words since we started at the end of 2010. Enough for a decent-sized novel, but slightly less likely to win a Pulitzer. In that time, according to Google, almost exactly 100,000 individuals have stopped by agilegeoscience.com — most of them lots of times — thank you readers for keeping us going! The most popular posts: Shale vs tight, Rock physics cheatsheet, and Well tie workflow. We hope you enjoy reading at least half as much as we enjoy writing.

Friday
Jul 18 2014

Six books about seismic analysis

Last year, I did a round-up of six books about seismic interpretation. A raft of new geophysics books has appeared recently, mostly from Cambridge, prompting this look at six volumes on seismic analysis — the more quantitative side of interpretation. We seem to be a bit hopeless at full-blown book reviews, and I certainly haven't read all of these books from cover to cover, but I thought I could at least mention them, and give you my first impressions.

If you have read any of these books, I'd love to hear what you think of them! Please leave a comment. 

Observation: none of these volumes mention compressive sensing, borehole seismic, microseismic, tight gas, or source rock plays. So I guess we can look forward to another batch in a year or two, when Cambridge realizes that people will probably buy anything with 3 or more of those words in the title. Even at $75 a go.


Quantitative Seismic Interpretation

Per Avseth, Tapan Mukerji and Gary Mavko (2005). Cambridge University Press, 408 pages, ISBN 978-0-521-15135-1. List price USD 91, $81.90 at Amazon.com, £45.79 at Amazon.co.uk

You have this book, right?

Every seismic interpreter that's thinking about rock properties, AVO, inversion, or anything beyond pure basin-scale geological interpretation needs this book. And the MATLAB scripts.

Rock Physics Handbook

Gary Mavko, Tapan Mukerji & Jack Dvorkin (2009). Cambridge University Press, 511 pages, ISBN 978-0-521-19910-0. List price USD 100, $92.41 at Amazon.com, £40.50 at Amazon.co.uk

If QSI is the book for quantitative interpreters, this is the book for people helping those interpreters. It's the Aki & Richards of rock physics. So if you like sums, and QSI left you feeling unsatisfied, buy this too. It also has lots of MATLAB scripts.

Seismic Reflections of Rock Properties

Jack Dvorkin, Mario Gutierrez & Dario Grana (2014). Cambridge University Press, 365 pages, ISBN 978-0-521-89919-2. List price USD 75, $67.50 at Amazon.com, £40.50 at Amazon.co.uk

This book seems to be a companion to The Rock Physics Handbook. It feels quite academic, though it doesn't contain too much maths. Instead, it's more like a systematic catalog of log models — exploring the full range of seismic responses to rock properties.

Practical Seismic Data Analysis

Hua-Wei Zhou (2014). Cambridge University Press, 496 pages, ISBN 978-0-521-19910-0. List price USD 75, $67.50 at Amazon.com, £40.50 at Amazon.co.uk

Zhou is a professor at the University of Houston. His book leans towards imaging and velocity analysis — it's not really about interpretation. If you're into signal processing and tomography, this is the book for you. Mostly black and white, the book has lots of exercises (no solutions though).

Seismic Amplitude: An Interpreter's Handbook

Rob Simm & Mike Bacon (2014). Cambridge University Press, 279 pages, ISBN 978-1-107-01150-2 (hardback). List price USD 80, $72 at Amazon.com, £40.50 at Amazon.co.uk

Simm is a legend in quantitative interpretation and the similarly lauded Bacon is at Ikon, the pre-eminent rock physics company. These guys know their stuff, and they've filled this superbly illustrated book with the essentials. It belongs on every interpreter's desk.

Seismic Data Analysis Techniques...

Enwenode Onajite (2013). Elsevier, 256 pages, ISBN 978-0124200234. List price USD 130, $113.40 at Amazon.com, £74.91 at Amazon.co.uk

This is the only book of the collection I don't have. From the preview I'd say it's aimed at undergraduates. It starts with a petroleum geology primer, then covers seismic acquisition, and seems to focus on processing, with a little on interpretation. The figures look rather weak, compared to the other books here. Not recommended, not at this price.

NOTE These prices are Amazon's discounted prices and are subject to change. The links contain a tag that gets us commission, but does not change the price to you. You can almost certainly buy these books elsewhere. 
