Images as data

I was at the Atlantic Geoscience Society's annual meeting on Friday and Saturday, held this year in a cold and windy Truro, Nova Scotia. The AGS is a fairly small meeting — maybe a couple of hundred geoscientists make the trip — but usually good value, especially if you're working in the area. 

A few talks and posters caught my attention, as they were all around a similar theme: getting data from images. Not in an interpretive way, though — these papers were about treating images fairly literally. More like extracting impedance from seismic than, say, making a horizon map.

Drone to stereonet

Amazing 3D images generated from a large number of 2D images of outcrop. Left: the natural colour image. Middle: all facets generated by point cloud analysis. Right: the final set of human-filtered facets. © Joseph Cormier 2016


Probably the most eye-catching poster was that of Joseph Cormier (UNB), who is experimenting with computer-assisted structural interpretation. Using dozens of high-res photographs collected by a UAV, Joseph combines them to reconstruct the 3D scene of the outcrop — just from photographs, no lidar or other ranging technology. The resulting point cloud reveals the orientations of the outcrop's faces, as well as fractures, exposed faults, and so on. A human interpreter can then apply her judgment to filter these facets into tectonically significant sets, at which point they can be plotted on a stereonet. Beats crawling around with a Brunton or Suunto for days!
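The last step — turning a facet's orientation into a stereonet-ready measurement — is just vector geometry. Here's a minimal sketch, not Joseph's code; the coordinate convention (x = east, y = north, z = up, azimuth clockwise from north) is an assumption:

```python
import numpy as np

def normal_to_dip(normal):
    """Convert a facet's normal vector to (dip, dip direction) in degrees.

    Assumes x = east, y = north, z = up, with dip direction measured
    clockwise from north. (A convention chosen for illustration.)
    """
    nx, ny, nz = np.asarray(normal, dtype=float) / np.linalg.norm(normal)
    if nz < 0:  # make the normal point upward so dip is in [0, 90]
        nx, ny, nz = -nx, -ny, -nz
    dip = np.degrees(np.arccos(nz))
    dip_direction = np.degrees(np.arctan2(nx, ny)) % 360
    return dip, dip_direction

# A facet dipping 45 degrees towards the east:
dip, ddir = normal_to_dip([np.sin(np.radians(45)), 0, np.cos(np.radians(45))])
```

Feed a few hundred facet normals through this and you have the raw material for a stereonet plot.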

Hyperspectral imaging

There was another interesting poster by a local mining firm that I can't find in the abstract volume. They had some fine images from CoreScan, a hyperspectral imaging and analysis company operating in the mining industry. The technology, which can discern dozens of rock-forming minerals from their near infrared and shortwave infrared absorption characteristics, seems especially well-suited to mining, where mineralogical composition is usually more important than texture and sedimentological interpretation. 

Isabel Chavez (SMU) didn't need a commercial imaging service. To help correlate Laurasian shales on either side of the Atlantic, she presented results from using a handheld Konica-Minolta spectrophotometer on core. She found that CIE L* and a* colour parameters correlated with certain element ratios from ICP-MS analysis. Like many of the students at AGS, Isabel was presenting her undergraduate thesis — a real achievement.
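Checking a colour parameter against a geochemical ratio is a one-liner once the data sit side by side. A sketch with made-up numbers standing in for the real core measurements (the element ratio, the linear relationship, and the noise level are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins for measurements made on core:
ratio = rng.uniform(0.02, 0.08, 50)               # an element ratio from ICP-MS
L_star = 40 + 300 * ratio + rng.normal(0, 1, 50)  # CIE L*, linearly related plus noise

r = np.corrcoef(L_star, ratio)[0, 1]              # Pearson correlation coefficient
slope, intercept = np.polyfit(ratio, L_star, 1)   # linear calibration line
```

With a relationship like this in hand, the cheap spectrophotometer reading becomes a proxy for the expensive geochemical analysis.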

Interesting aside: one of the chief applications of colour meters is measuring the colour of chips. Fascinating.

The hacker spirit is alive and well

The full spectrum (top), and the CCD responses with IR filter, red filter, green filter, and blue filter (bottom). All of the filters admitted some infrared light, causing problems for calibration. © Robert McEwan 2016.


Those images left me wishing I had a hyperspectral imaging camera of my own — and then Rob McEwan (Dalhousie) showed how to build one! In a wonderfully hackerish talk, he showed how he's building a $100 mineralogical analysis tool. He started by removing the IR filter from a second-hand Nikon D90, then — using a home-made grating spectrometer — measured the CCD's responses in the red, green, blue, and IR bands. After correcting the responses, Rob will use the USGS spectral library (Clark et al. 2007) to predict the contributions of various minerals to the image. He hopes to analyse field and lab photos at many scales.
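Predicting mineral contributions from a spectral library usually comes down to linear unmixing: model the observed spectrum as a non-negative combination of endmember spectra and solve by least squares. A toy sketch — the Gaussian 'absorption features' below are invented stand-ins, not real USGS library entries:

```python
import numpy as np
from scipy.optimize import nnls

wl = np.linspace(1.0, 2.5, 50)  # wavelength in micrometres (SWIR-ish range)

# Made-up endmember spectra standing in for library minerals:
library = np.column_stack([
    1 - 0.8 * np.exp(-((wl - 1.4) / 0.10)**2),  # 'mineral A': absorption near 1.4 um
    1 - 0.6 * np.exp(-((wl - 2.2) / 0.05)**2),  # 'mineral B': absorption near 2.2 um
    np.full(wl.size, 0.5),                      # featureless dark background
])

true_fractions = np.array([0.6, 0.3, 0.1])
observed = library @ true_fractions + np.random.default_rng(0).normal(0, 0.001, wl.size)

fractions, residual = nnls(library, observed)  # non-negative least squares unmixing
fractions /= fractions.sum()                   # normalise to proportions
```

The non-negativity constraint matters: mineral abundances can't be negative, so plain least squares can give nonsense where NNLS stays physical.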

Once you have all this data, you also have to be able to process it. Joshua Wright (UNB) showed how he has used FIJI to segment photomicrographs into regions representing grains, then post-processed the image data as giant arrays in an Excel spreadsheet (really!), driven by a suite of Visual Basic macros. I can see how a workflow like this might initially be more accessible to someone new to computer programming, but I felt like he may have passed Excel's sweet spot. The workflow would be much smoother in Python with scikit-image, or MATLAB with the Image Processing Toolbox. Maybe that's where he's heading. You can check out his impressive piece of work in a series of videos; here's the first:
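For comparison, the core of that workflow — threshold, label connected regions, measure them — is a few lines in Python. A sketch on a synthetic 'photomicrograph' (scipy.ndimage here for brevity; scikit-image offers the same operations and much more):

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic image: three bright circular 'grains' on a dark matrix
rng = np.random.default_rng(42)
img = np.zeros((100, 100))
y, x = np.ogrid[:100, :100]
for cy, cx in [(20, 20), (50, 70), (80, 30)]:
    img[(y - cy)**2 + (x - cx)**2 < 64] = 1.0
img += rng.normal(0, 0.05, img.shape)  # a little acquisition noise

binary = img > 0.5                    # segment by simple thresholding
labels, n_grains = ndi.label(binary)  # connected-component labelling
areas = ndi.sum(binary, labels, index=np.arange(1, n_grains + 1))  # pixels per grain
```

On a real photomicrograph you'd want something smarter than a fixed threshold — Otsu's method or a watershed, say — but the labelling and measurement steps are the same.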

Looking forward to 2016

All in all, the meeting was a good kick-off to the geoscience year — a chance to catch up with some local geoscientists, and meet some new ones. I also had the chance to update the group on striplog, which generated a bit of interest. Now I'm back in Mahone Bay, enjoying the latest winter storm, and the feeling of having something positive to blog about!

Please be aware that, unlike the images I usually include in posts, the images in this post are not open access and remain the copyright of their respective authors.


Isabel Chavez, David Piper, Georgia Pe-Piper, Yuanyuan Zhang, Saint Mary's University (2016). Black shale Selli Level recorded in Cretaceous Naskapi Member cores in the Scotian Basin. Oral presentation, AGS Colloquium, Truro NS, Canada.

Clark, R.N., Swayze, G.A., Wise, R., Livo, E., Hoefen, T., Kokaly, R., and Sutley, S.J. (2007). USGS digital spectral library splib06a. U.S. Geological Survey, Digital Data Series 231.

Joseph Cormier, Stefan Cruse, Tony Gilman, University of New Brunswick (2016). An optimized method of unmanned aerial vehicle surveying for rock slope analysis, 3D modeling, and structural feature extraction. Poster, AGS Colloquium, Truro NS, Canada.

Robert McEwan, Dalhousie University (2016). Detecting compositional variation in granites – a method for remotely sensed platform. Oral presentation, AGS Colloquium, Truro NS, Canada.

Joshua Wright, University of New Brunswick (2016). Using macros and advanced functions in Microsoft Excel™ to work effectively and accurately with large data sets: An example using sulfide ore characterization. Oral presentation, AGS Colloquium, Truro NS, Canada.

Is subsurface software too pricey?

Amy Fox of Enlighten Geoscience in Calgary wrote a LinkedIn post about software pricing a couple of weeks ago. I started typing a comment... and it turned into a blog post.

I have no idea if software is 'too' expensive. Some of it probably is. But I know one thing for sure: we subsurface professionals are the only ones who can do anything about the technology culture in our industry.

Certainly most technical software is expensive. As someone who makes software, I can see why it got that way: good software is really hard to make. The market is small, compared to consumer apps, games, etc. Good software takes awesome developers (who can name their price these days), and it takes testers, scientists, managers.

But all is not lost. There are alternatives to the expensive software. We — practitioners in industry — just do not fully explore them. OpendTect is a great seismic interpretation tool, but many people don't take it seriously because it's free. QGIS is an awesome GIS application, arguably better than ArcGIS and definitely easier to use.

Sure, there are open source tools we have embraced, like Linux and MediaWiki. But on balance I think this community is overly skeptical of open source software. As evidence of this, how many oil and gas companies donate money to open source projects they use? There's just no culture for supporting Linux, MediaWiki, Apache, Python, etc. Why is that?

If we want awesome tools, someone, somewhere, has to pay the people who made them, somehow.


So why is software expensive and what can we do about it?

I used to sell Landmark's GeoProbe software in Calgary. At the time, it was USD 140k per seat, plus 18% annual maintenance. A lot, in other words. It was hard to sell. It needed a sales team, dinners, and golf. A sale of a few seats might take a year. There was a lot of overhead just managing licenses and test installations. Of course it was expensive!

In response, on the customer side, the corporate immune system kicked in, spawning machine lockdowns, software spending freezes, and software selection committees. These were (well, are) secret organizations of non-users that did (do) difficult and/or pointless things like workflow mapping and software feature comparisons. They have to be secret because there's a bazillion dollars and a 5-year contract on the line.

Catch 22. Even if an ordinary professional would like to try some cheaper and/or better software, there is no process for this. Hands have been tied. Decisions have been made. It's not approved. It can't be done.

Well, it can be done. I call it the 'computational geophysics manoeuvre', because that lot have known about it for years. There is an easy way to reclaim your professional right to the tools of the trade, to rediscover the creativity and fun of doing new things:

Bring or buy your own technology, install whatever the heck you want on it, and get on with your work.

If you don't think that's a possibility for you right now, then consider it a medium term goal.

Old skool plot tool

It's not very glamorous, but sometimes you just want to plot a SEG-Y file. That's why we crafted seisplot. OK, that's why we cobbled seisplot together out of various scripts and functions we had lying around, after a couple of years of blog posts and Leading Edge tutorials and the like.

Pupils of the old skool — when everyone knew how to write a bash script, pencil crayons and lead-filled beanbags ruled the desktop, and Carpal Tunnel Syndrome was just the opening act to the Beastie Boys — will enjoy seisplot. For a start, it's command line only: 

    python seisplot.py -R -c ~/segy_files -o ~/plots

Isn't that... reassuring? In this age of iOS and Android and Oculus Rift... there's still the command line interface.

Features galore

So what sort of features can you look forward to? Other than all the usual things you've come to expect of subsurface software, like a complete lack of support or documentation. (LOL, I'm kidding.) Only these awesome selling points:

  • Make wiggle traces or variable density plots... or don't choose — do both!
  • If you want, the script will descend into subdirectories and make plots for every SEG-Y file it finds.
  • There are plenty of colourmaps to choose from, or if you're insane you can make your own.
  • You can make PNGs, JPGs, SVGs or PDFs. But not CGM, sorry about that.
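The recursive descent in the second bullet needs nothing beyond the standard library. A quick sketch — the extension list is an assumption, and seisplot's own logic may differ:

```python
from pathlib import Path

def find_segy_files(root, extensions=('.sgy', '.segy')):
    """Walk root and all its subdirectories, returning every SEG-Y file found.

    The extensions to match are an assumption; adjust to taste.
    """
    return sorted(p for p in Path(root).rglob('*')
                  if p.suffix.lower() in extensions)
```

Point it at a directory tree and you get back a sorted list of paths, ready to loop over and plot.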

Well, I say 'selling points', but the tool is 100% free. We think this is a fair price. It's also open source of course, so please — seriously, please — improve the source code, then share it with the world! The code is on GitHub, natch.

Never go full throwback

There is one more feature: you can go full throwback and add scribbles and coffee stains. Here's one for your wall:

The 2D seismic line in this post is from the USGS NPRA Seismic Data Archive, and is in the public domain. This is line number 31-81-PR (links directly to SEG-Y file).

White magic: calibrating seismic attributes

This post is part of a series on seismic attributes; the previous posts were...

  1. An attribute analysis primer
  2. Attribute analysis and statistics

Last time, I hinted that there might be an often-overlooked step in attribute analysis:

Calibration is a gaping void in many published workflows. How can we move past "that red blob looks like a point bar so I drew a line around it in PowerPoint" to "there's a 70% chance of finding reservoir quality sand at that location"?

Why is this step such a 'gaping void'? A few reasons:

  • It's fun playing with attributes, and you can make hundreds without a second thought. Some of them look pretty interesting, geological even. "That looks geological" is, however, not an attribute calibration technique. You have to prove it.
  • Nobody will be around when we find out the answer. There's a good chance that well will never be drilled, but when it is, you'll be on a different project, in a different company, or have left the industry altogether and be running a kayak rental business in Belize.
  • The bar is rather low. Many published examples of attribute analysis include no proof at all, just a lot of maps with convincing-looking polygons on them, and claims of 'better reservoir quality over here'.

This is getting discouraging. Let's look at an example. Now, it's hard to present this without seeming over-critical, but I know these gentlemen can handle it, and this was only a magazine article, so we needn't make too much of it. But it illustrates the sort of thing I'm talking about, so here goes.

Quoting from Chopra & Marfurt (AAPG Explorer, April 2014), edited slightly for brevity:

While coherence shows the edges of the channel, it gives little indication of the heterogeneity or uniformity of the channel fill. Notice the clear definition of this channel on the [texture attribute — homogeneity].
We interpret [the] low homogeneity feature [...] to be a point bar in the middle of the incised valley (green arrow). This internal architecture was not delineated by coherence.

A nice story, making two claims:

  1. Coherence incompletely represents the internal architecture of the channel.
  2. The labeled feature on the texture attribute is a point bar.

I know explorers have to be optimists, and geoscience is all about interpretation, but as scientists we must be skeptical optimists. Claims like this are nice hypotheses, but you have to take the cue: go off and prove them. Remember confirmation bias, and Feynman's words:

The first principle is that you must not fool yourself — and you are the easiest person to fool.

The twin powers

Making geological predictions with seismic attribute analysis requires two related workflows:

  1. Forward modeling — the best way to tune your intuition is to make a cartoonish model of the earth (2D, isotropic, homogeneous lithologies) and perform a simplified seismic experiment on it (convolutional, primaries only, noise-free). Then you can compare attribute behaviour to the known model.
  2. Calibration — you are looking for an explicit, quantitative relationship between a physical property you care about (porosity, lithology, fluid type, or whatever) and a seismic attribute. A common way to show this is with a cross-plot of the seismic amplitude against the physical property.
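The forward modelling in step 1 can be as simple as a reflectivity series convolved with a wavelet. A minimal sketch — the layer impedances and the 25 Hz Ricker are arbitrary choices for illustration:

```python
import numpy as np

def ricker(f, dt=0.002, half_width=32):
    """Ricker wavelet of peak frequency f Hz: 2*half_width + 1 samples at dt seconds."""
    t = dt * np.arange(-half_width, half_width + 1)
    return (1 - 2 * (np.pi * f * t)**2) * np.exp(-(np.pi * f * t)**2)

# Blocky impedance model: 'shale' over 'sand' (made-up values, kg/m^2/s)
imp = np.full(200, 6.0e6)
imp[100:] = 7.5e6

# Normal-incidence reflection coefficients from the impedance contrasts
rc = (imp[1:] - imp[:-1]) / (imp[1:] + imp[:-1])

# Convolutional synthetic: primaries only, noise free
synthetic = np.convolve(rc, ricker(25), mode='same')
```

Run an attribute over a synthetic like this and you know exactly what it should see — if it doesn't, you've learned something about the attribute before you trust it on real data.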

When these foundations are not there, we can be sure that one or more bad things will happen:

  • The relationship produces a lot of type I errors (false positives).
  • It produces a lot of type II errors (false negatives).
  • It works at some wells and not at others.
  • You can't reproduce it with a forward model.
  • You can't explain it with physics.

As the industry shrivels and questions — as usual — the need for science and scientists, we have to become more stringent, more skeptical, and more rigorous. Doing anything else feeds the confirmation bias of the non-scientific contingent. Because it says, loud and clear: geoscience is black magic.

The image is part of the figure from Chopra, S and K Marfurt (2014). Extracting information from texture attributes. AAPG Explorer, April 2014. It is copyright of the Authors and AAPG.

A coding kitchen in Stavanger

Last week, I travelled to Norway and held a two day session of our Agile Geocomputing Training. We convened at the newly constructed Innovation Dock in Stavanger, and set up shop in an oversized, swanky kitchen. Despite the industry-wide squeeze on spending, the event still drew a modest turnout of seven geoscientists. That's way more traction than we've had in North America lately, so thumbs up to Norway! And, since our training is designed to be very active, a group of seven is plenty comfortable.

A few of the participants had some prior experience writing code in languages such as Perl, Visual Basic, and C, but the majority showed up without any significant programming experience at all. 

Skills start with syntax and structures 

The first day we covered basic principles of programming, but because Python is awesome, we dived into live coding right from the start. As an instructor, I find that doing live coding has two hidden benefits: it stops me from racing ahead, and making mistakes in the open gives students permission to do the same.

Using geoscience data right from the start, students learned about key data structures — lists, dicts, tuples, and sets — and why they might choose between them for a given job. They wrote their own mini-module containing functions and classes for getting stratigraphic tops from a text file.

Since syntax is rather dry and unsexy, I see the instructor's main role as inspiring and motivating, through examples that connect to things learners already know well. The ideal container for stratigraphic picks is a dictionary. Logs, surfaces, and seismic are best cast into 1-, 2-, and 3-dimensional NumPy arrays, respectively. And so on.
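For instance, a set of tops maps naturally onto a dict, and parsing them from a text file takes only a few lines. A sketch — the 'name,depth' file format and these particular picks are invented for illustration:

```python
def tops_from_lines(lines, sep=','):
    """Parse 'name,depth' lines into a {name: depth} dictionary.

    The format is an assumption: one pick per line, comments start with '#'.
    """
    tops = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blanks and comments
        name, depth = line.rsplit(sep, 1)
        tops[name.strip()] = float(depth)
    return tops

text = """# Example tops file (made-up depths in metres)
Wyandot, 867.0
Dawson Canyon, 984.5
Logan Canyon, 1133.0
"""
tops = tops_from_lines(text.splitlines())
```

Once the picks live in a dict, questions like "what's the shallowest top?" become one-liners: `min(tops, key=tops.get)`.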

Notebooks inspire learning

We've seen it time and time again. People really like the format of Jupyter Notebooks (formerly IPython Notebooks). It's like there is something fittingly scientific about them: narrative, code, output, repeat. As a learning document, they aren't static — in fact they're meant to be edited. But they aren't so open-ended that learners fail to launch. Professional software developers may not 'get it', but scientists really do. Start at the start, end at the end, and you've got a complete record of your work.

You don't get that with the black-box, GUI-heavy software applications we're used to. Maybe all legitimate work should be reserved for notebooks: self-contained, fully-reproducible, and extensible. Maybe notebooks, in their modularity and granularity, will be the new go-to software for technical work.

Outcomes and feedback

By the end of day two, folks were parsing stratigraphic and petrophysical data from text files, then rendering and stylizing illustrations. A few were even building interactive animations on 3D seismic volumes. One recommendation was to create a sort of FAQ or cookbook: "How do I read a log?", "How do I read SEG-Y?", "How do I calculate elastic properties from a well log?". A couple of people remarked that they would have liked even more coached exercises, maybe even an extra day; a recognition of the virtue of sustained and structured practice.

Want training too?

Head to our courses page for a list of upcoming courses, or more details on how you can train your team.

Photographs in this post are courtesy of Alessandro Amato del Monte, via aadm on Flickr.

Why I don't flout copyright

Lots of people download movies illegally. Or spoof their IP addresses to get access to sports fixtures. Or use random images they found on the web in publications and presentations (I've even seen these with the watermark of the copyright owner on them!). Or download PDFs for people who aren't entitled to access (#icanhazpdf). Or use sketchy Russian paywall-crumbling hacks. It's kind of how the world works these days. And I realize that some of these things don't even sound illegal.

This might surprise some people, because I go on so much about sharing content, open geoscience, and so on. But I am an annoying stickler for copyright rules. I want people to be able to re-use any content they like, without breaking the law. And if people don't want to share their stuff, then I don't want to share it.

Maybe I'm just getting old and cranky, but FWIW here are my reasons:

  1. I'm a content producer. I would like to set some boundaries to how my stuff is shared. In my case, the boundaries amount to nothing more than attribution, which is only fair. But still, it's my call, and I think that's reasonable, at least until the material is, say, 5 years old. But some people don't understand that open is good, that shareable content is better than closed content, that this is the way the world wants it. And that leads to my second reason:
  2. I don't want to share closed stuff as if it was open. If someone doesn't openly license their stuff, they don't deserve the signal boost — they told the world to keep their stuff secret. Why would I give them the social and ethical benefits of open access while they enjoy the financial benefits of closed content? This monetary benefit comes from a different segment of the audience, obviously. At least half the people who download a movie illegally would not, I submit, have bought the movie at a fair price.

So make a stand for open content! Don't share stuff that the creator didn't give you permission to share. They don't deserve your gain filter.

Rockin' Around The Christmas Tree

I expect you know at least one geoscientist. Maybe you are one. Or you want to be one. Or you want one for Christmas. It doesn't matter. The point is, it'll soon be Christmas. If you're going to buy someone a present, you might as well get them something cool. So here's some cool stuff!


There isn't a single geologist alive that wouldn't think this was awesome. It's a freaking Geiger counter! It goes in your pocket! It only costs USD 60, or CAD 75, or less than 40 quid! Absurd: these things normally cost a lot more.

OK, if you didn't like that, you're not going to like this IR spectrometer. Yes, a pocket molecular sensor, for sensing molecules in pockets. It does cost USD 250 though, so make sure you really like that geologist!

Back down to earth, a little USB microscope ticks most of the geogeek boxes. This one looks awesome, and is only USD 40 but there are loads, so maybe do some research.


You're going to need something to wave all that gadgetry at. If you go down the well-worn path of the rock & mineral set, make sure it's a good size, like this 100-sample monster (USD 70). Or go for the novelty value of fluorescent specimens (USD 45) — calcite, sphalerite, and the like.

If minerals seem passé for a geologist, then take the pure line with a tour of the elements. This set — the last of its kind, by the way — costs USD 565, but it looks amazing. Yet it can't hold a candle to this beauty, all USD 5000 of it — which I badly want but, let's face it, will never get.


If you have a rock collection, maybe you want a mineralogical tray (USD 35) to put them in? The same store has all sorts of printed fabrics by designers Elena Kulikova and Karina Eibitova. Or how about some bedding?

These steampunk light switch plates are brilliant and varied (USD 50). Not geological at all, just awesome.

I don't think they are for sale, but check out Ohio artist Alan Spencer's ceramic pieces reflecting each of the major geological periods. They're pretty amazing.


My kids are really into Lego at the moment. Turns out there are all sorts of sciencey kits you can get. I think the Arctic Base Camp (USD 90) is my favourite that's available at the moment, and it contains some kind of geological-looking type (right).

I don't condone the watching of television programmes, except Doctor Who obviously, but they do sometimes make fun Lego sets. So there's the Doctor, naturally, and other things like Big Bang Theory.

You can fiddle with these while you wait for the awesome HMS Beagle model to come out.

Books etc.

A proven success — winner of the Royal Society's prestigious Winton Prize for science books this year — is Adventures in the Anthropocene: A Journey to the Heart of the Planet We Made, by Gaia Vince, Milkweed Editions, September 2015. Available in hardback and paperback.

Lisa Randall's Dark Matter and the Dinosaurs: The Astounding Interconnectedness of the Universe (HarperCollins) just came out, and is doing remarkably well at the moment. It's getting decent reviews too. Randall is a cosmologist, and she reckons the dinosaurs were obliterated by a comet nudged out of orbit by mysteriousness. Hardback only.

If those don't do it for you, I reviewed some sciencey comic books recently... or there's always Randall Munroe.

Or you could try poking around in the giftological posts from 2011, 2012, 2013, or 2014.

Still nothing? OK, well, there's always chocolate :)

The images in this post are all someone else's copyright and are used here under fair use guidelines. I'm hoping the owners are cool with people helping them sell stuff!


Matt Hall

I am a geoscientist in Nova Scotia, Canada. Founder of Agile Geoscience, co-founder of The HUB South Shore. Into geology, geophysics, programming in Python, and knowledge sharing (especially wikis).

The big data eye-roll

First, let's agree on one thing: 'big data' is a half-empty buzzword. It's shorthand for 'more data than you can look at', but really it's more than that: it branches off into other hazy territory like 'data science', 'analytics', 'deep learning', and 'machine intelligence'. In other words, it's not just 'large data'. 

Anyway, the buzzword doesn't bother me too much. What bothers me is when I talk to people at geoscience conferences about 'big data', about half of them roll their eyes and proclaim something like this: "Big data? We've been doing big data since before these punks were born. Don't talk to me about big data."

This is pretty demonstrably a load of rubbish.

What the 'big data' movement is trying to do is not acquire loads of data then throw 99% of it away. They are not processing it in a purely serial pipeline, making arbitrary decisions about parameters on the way. They are not losing most of it in farcical enterprise data management train-wrecks. They are not locking most of their data up in such closed systems that even they don't know they have it.

They are doing the opposite of all of these things.

If you think 'big data', 'data science' and 'machine learning' are old hat in geophysics, then you have some catching up to do. Sure, we've been toying with simple neural networks for years, e.g. probabilistic neural nets with 1 hidden layer — though this approach is very, very far from being mainstream in subsurface — but today this is child's play. Over and over, and increasingly so in the last 3 years, people are showing how new technology — built specifically to handle the special challenge that terabytes bring — can transform any quantitative endeavour: social media and online shopping, sure, but also astronomy, robotics, weather prediction, and transportation. These technologies will show up in petroleum geoscience and engineering. They will eat geostatistics for breakfast. They will change interpretation.

So when you read that Google has open sourced its TensorFlow deep learning library (9 November), or that Microsoft has too (yesterday), or that Facebook has too (months ago), or that Airbnb has too (in August), or that there are a bazillion other super easy-to-use packages out there for sophisticated statistical learning, you should pay a whole heap of attention! Because machine learning is coming to subsurface.

Moving ahead with social interpretation

After quietly launching Pick This — our social image interpretation tool — in February, we've been busily improving the tool and now we're moving into 2016 with a plan for world domination. I summed up the first year of development in one of the interpretation sessions at SEG 2015. Here's a 13-minute version of my talk:

In 2016 we'll be exploring ways to adapt the tool to in-house corporate use, mainly by adding encryption and private groups. This way, everyone with email addresses, say, would be connected to each other, and their stuff would only be shared among the group, not with the general public.

Some other functionality is on the list of things to do:

  • Other types of interpretation than points, lines and polygons.
  • Ways to find content more easily, for example with tags like 'Seismic' or 'Outcrop'.
  • Ways to follow individuals, or get notifications of new interpretations on an image.
  • More ways to visualize and generally get at the data Pick This produces.

We're always open to suggestions. Please get in touch if you have a neat idea!

What now?

Times are rock hard in industry right now.

If you have a job, you're lucky — you have probably already survived one round of layoffs. There will likely be more, especially when the takeovers start, which they will. I hope you survive those too. 

If you don't have a job, you probably feel horrible, but of course that won't get you anywhere. I heard one person call it an 'involuntary sabbatical', and I love that: it's the best chance you'll get to re-invent, re-learn, and find new direction. 

If you're a student, you must be looking out over the wasteland and wondering what's in store for you. What on earth?

More than one person has asked me recently about Agile. "You got out," they say, "how did you do it?" So instead of bashing out another email, I thought I'd blog about it.

Consulting in 2015

I didn't really get out, of course, I just quit and moved to rural Nova Scotia.

Living out here does make it harder to make a living, of course, and things on this side of the fence, so to speak, are pretty gross too, I'm afraid. Talking to others at SEG suggested that I'm not alone among small companies in this view. A few of the larger outfits seem to be doing well — IKON and GeoTeric, for instance — but they also have product, which at least offers some income diversity.

Agile started as a 100% bootstrapped effort to be a consulting firm that's more directly useful to individual professional geoscientists than anyone else. Most firms target corporate accounts and require permission, a complicated contract, an AFE, and 3 months of bureaucracy to hire. It turns out that professionals are unable or unwilling to engage at that lower, grass-roots level, though — almost everyone thinks you actually need permission, contracts, AFEs, etc., to get hired in any capacity, even just "Help me tie this well." So usually we are hired into larger, longer-term projects, just like anyone else.

I still think there's something in this original idea — the Uberification of consulting services, if you will — maybe we'll try again in a few years.

But if you are out of work and were thinking of getting out there as a consultant, I'm an optimistic person but unless you are very well known (for being awesome), it's hard for me to honestly recommend even trying. It's just not the reality right now. We've been lucky so far, because we work in geothermal and government as well as in petroleum, but oil & gas was over half our revenue last year. It will be about 0% of it this year, maybe slightly less.

The transformation of Agile

All of which is to explain why we are now, since January, consciously and deliberately turning ourselves into a software technology R&D company. The idea is to be less dependent on our dysfunctional industry, and less dependent on geotechnical work. We build new tools for hard problems — data problems, interpretation problems, knowledge sharing problems. And we're really good at it.

We hired another brilliant programmer in August, and we're all learning more every day about our playground of scientific computing and the web — machine learning, cloud services, JavaScript frameworks, etc. The first thing we built was, which is still in active development. Our latest project is around our tool. I hope it works out, because it's the most fun I've had on a project in ages. Maybe these projects spin out of Agile, maybe we keep them in-house.

So that's our survival plan: invent, diversify, and re-tool like crazy. And keep blogging.

F**k it

Some people are saying, "things will recover, sit it out" but I think that's awful — the very worst — advice. I honestly think your best bet right now* is to find an accomplice, set aside 6 months and some of your savings, push everything off your desk, and do something totally audacious. 

Something you can't believe no-one has thought of doing yet.

Whatever it was you just thought of — that's the thing.

You might as well get started.

* Unless you have just retired, are very well connected in industry, have some free time, and want to start a new, non-commercial project that will profoundly benefit the subsurface community for the next several decades at least. Because I'd like to talk to you about another audacious plan...