The 5%

We recently published our 500th post on this blog. I made the first post on 11 November 2010, a week after quitting my job in Calgary (yes, there was a time when people used to quit jobs). So, 500 posts in a little over 2000 days — about a post every 4 days. About 300,000 words (still only about half of War and Peace). And I probably shouldn't think about this, but let's call it at least 1000 hours (it's probably double that). 

To celebrate the milestone, however arbitrary, I thought I'd spend an evening rounding up some of our favourite and most popular posts. If nothing else, it might serve as a place to start for any new readers.

Geoscience

Uncertainty (broadly speaking)

Tech and coding

Our culture

I did say this post was about the top 5%, so strictly I owe you one more post. If you'll indulge me, I'll hark right back to the start — this post on The integration gap from 5 January 2011 was one of my early favourites. It was one of those ideas I'd been carrying around for a while. Not profound or interesting enough for a talk or an article. Just a little idea. I doubt it's even original. I just thought it was interesting. It's exactly what blogs were made for.

It only remains to say Thank You for the support and attention over the years. We appreciate it hugely, and look forward to crafting the next 500 posts for lining the bottom of your digital cat litter box.

PRESS START

The dust has settled from the Subsurface Hackathon 2016 in Vienna, which coincided with EAGE's 78th Conference and Exhibition (some highlights). This post builds on last week's quick summary with more detailed descriptions of the teams and what they worked on. If you want to contact any of the teams, you should be able to track them down via the links to Twitter and/or GitHub.

A word before I launch into the projects. None of the participants had built a game before. Many were relatively new to programming — completely new in one or two cases. Most of the teams were made up of people who had never worked together on a project before; indeed, several team mates had never met before. So get ready to be impressed, maybe even amazed, at what members of our professional community can do in 2 days with only mild provocation and a few snacks.

Traptris

An 8-bit-style video game, complete with music, combining Tetris with basin modeling.

Team: Chris Hamer, Emma Blott, Natt Turner (all MSc students at the University of Leeds), Jesper Dramsch (PhD student, Technical University of Denmark, Copenhagen). GitHub repo.

Tech: Python, with PyGame.

Details: The game is just like Tetris, except that the blocks have lithologies: source, reservoir, and seal. As you complete a row, it disappears, as usual. But in this game, the row reappears on a geological cross-section beside the main game. By completing further rows with just-right combinations of lithologies, you build an earth model. When it's deep enough, and if you've placed source rocks in the model, the kitchen starts to produce hydrocarbons. These migrate if they can, and are eventually trapped — if you've managed to build a trap, that is. The team impressed the judges with their solid gameplay and boisterous team spirit. Just installing PyGame and building some working code was an impressive feat for the least experienced team of the hackathon.
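The core mechanic is easy to sketch. Here's a minimal, hypothetical Python version of the row-to-cross-section idea — every name below is invented for illustration, and the team's actual implementation (in their repo) surely differs:

```python
# Toy sketch of Traptris's central idea: cleared Tetris rows, tagged with
# lithologies, stack up into an earth model, which may or may not contain
# a trap. Purely illustrative; not the team's code.

SOURCE, RESERVOIR, SEAL = 'source', 'reservoir', 'seal'

def complete_row(model, row):
    """A cleared row is added at the top of the earth model,
    pushing earlier rows deeper."""
    model.insert(0, row)
    return model

def has_trap(column):
    """A column (read top to bottom) can trap hydrocarbons if a seal
    sits above a reservoir, which sits above a source rock."""
    try:
        s = column.index(SEAL)
        r = column.index(RESERVOIR, s + 1)
        column.index(SOURCE, r + 1)
        return True
    except ValueError:
        return False

model = []
for row in [[SOURCE] * 4, [RESERVOIR] * 4, [SEAL] * 4]:
    complete_row(model, row)

# Column 0, top to bottom: seal, reservoir, source — a working trap.
column = [row[0] for row in model]
print(has_trap(column))  # True
```

Reverse the order of play (seal first, source last) and `has_trap` returns False — no seal above the reservoir, so the hydrocarbons escape.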

Prize: We rewarded this rambunctious team for their creative idea, which it's hard to imagine any other set of human beings coming up with. They won Samsung Gear VR headsets, so I'm looking forward to the VR version of the game.

Flappy Trace

A ridiculously addictive seismic interpretation game. "So seismic, much geology".

Team: Håvard Bjerke (Roxar, Oslo), Dario Bendeck (MSc student, Leeds), and Lukas Mosser (PhD student, Imperial College London).

Tech: Python, with PyGame. GitHub repo.

Details: You start with a trace on the left of the screen. More traces arrive, slowly at first, from the right. The controls move the approaching trace up and down, and the pick point is set as it moves across the current trace and off the screen. Gradually, an interpretation is built up. It's like trying to fly along a seismic horizon, one trace at a time. The catch is that the better you get, the faster it goes. All the while, encouragements and admonishments flash up, with images of the doge meme. Just watching someone else play is weirdly mesmerizing.

Prize: The judges wanted to recognize this team for creating such a dynamic, addictive game with real personality. They won DIY Gamer kits and an awesome book on programming Minecraft with Python.

Guess What!

Human seismic inversion. The player must guess the geology that produces a given trace.

Team: Henrique Bueno dos Santos, Carlos Andre (both UNICAMP, São Paulo), and Steve Purves (Euclidity, Spain).

Tech: Python web application, on Flask. It even used Agile's nascent geo-plotting library, g3.js, which I am pretty excited about. GitHub repo. You can even play the game online!

Details: This project was on a list of ideas we crowdsourced from the Software Underground Slack, and I really hoped someone would give it a try. The team consisted of a postdoc, a PhD student, and a professional developer, so it's no surprise that they managed a nice implementation. The player is presented with a synthetic seismic trace and must place reflection coefficients that will, she hopes, forward model to match the trace. She may see how she's progressing only a limited number of times before submitting her final answer, which receives a score. There are so many ways to control the game play here, I think there's a lot of scope for this one.
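The physics under the hood is just convolutional forward modelling: the guessed reflection coefficients are convolved with a wavelet and compared with the target trace. Here's a minimal sketch of that loop — the Ricker wavelet and the scoring formula are my assumptions, not necessarily the team's choices:

```python
import numpy as np

# Hedged sketch of the Guess What! forward model: convolve the player's
# guessed reflection coefficients with a wavelet, then score the match
# against the target trace. Details are illustrative only.

def ricker(f=25, dt=0.002, n=64):
    """A Ricker wavelet of peak frequency f Hz, sampled at dt seconds."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

def forward(rc, wavelet):
    """Forward model: reflectivity * wavelet -> synthetic trace."""
    return np.convolve(rc, wavelet, mode='same')

def score(guess_rc, target_trace, wavelet):
    """Lower misfit means a higher score, from 0 to 100."""
    misfit = np.sum((forward(guess_rc, wavelet) - target_trace) ** 2)
    energy = np.sum(target_trace ** 2)
    return max(0.0, 100 * (1 - misfit / energy))

w = ricker()
true_rc = np.zeros(100)
true_rc[[30, 60]] = [0.5, -0.3]   # two reflectors
target = forward(true_rc, w)

print(score(true_rc, target, w))  # 100.0 for a perfect guess
```

A guess of all zeros scores 0, and anything in between scales with the residual energy — one of the many knobs available for tuning the game play.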

Prize: This team impressed everyone with the far-reaching implications of the game — and the rich possibilities for the future. They were rewarded with SparkFun Digital Sandboxes and a copy of The Thrilling Adventures of Lovelace and Babbage.

DiamondChaser

aka DiamonChaser (sic). A time- and budget-constrained drilling simulator aimed at younger players.

Team: Paul Gabriel, Björn Wieczoreck, Daniel Buse, Georg Semmler, and Jan Gietzel (all at GiGa infosystems, Freiberg).

Tech: TypeScript, which compiles to JS. BitBucket repo. You can play the game online too!

Details: This tight-knit group of colleagues — all professional developers, but using unfamiliar technology — produced an incredibly polished app for the demo. The player is presented with a blank cross section, and some money. After choosing what kind of drill bit to start with, the drilling begins and the subsurface is gradually revealed. The game is then a race against the clock and the ever-diminishing funds, as diamonds and other bonuses are picked up along the way. The team used geological models from various German geological surveys for the subsurface, adding a bit of realism.

Prize: Everyone was impressed with the careful design and polish of the app this team created, and the quiet industry they brought to the event. They each won a CellAssist OBD2 device and a copy of Charles Petzold's Code.

Some of the participants waiting for the judges to finish their deliberations. Standing, from left: Håvard Bjerke, Henrique Bueno dos Santos, Steve Purves. Seated: Jesper Dramsch, Lukas Mosser, Natt Turner, Emma Blott, Dario Bendeck, Carlos André, Björn Wieczoreck, Paul Gabriel.

Credits and acknowledgments

Thank you to all the hackers for stepping into the unknown and coming along to the event. I think it was everyone's first hackathon. It was an honour to meet everyone. Special thanks to Jesper Dramsch for all the help on the organizational side, and to Dragan Brankovic for taking care of the photography.

The Impact HUB Vienna was a terrific venue, providing us with multiple event spaces and plenty of room to spread out. HUB hosts Steliana and Laschandre were a great help. Der Mann produced the breakfasts. Il Mare pizzeria provided lunch on Saturday, and Maschu Maschu on Sunday.

Thank you to Kristofer Tingdahl, CEO of dGB Earth Sciences and a highly technical, as well as thoughtful, geoscientist. He graciously agreed to act as a judge for the demos, and I think he was most impressed with the quality of the teams' projects.

Last but far from least, a huge Thank You to the sponsor of the event, EMC, the cloud computing firm that was acquired by Dell late last year. David Holmes, the company's CTO (Energy), was also a judge, making it an amazing opportunity for the hackers to show off their skills, and sense of humour, to a progressive company with big plans for our industry.

Automated interpretation highlights

As you probably know by now, I've been at the EAGE conference in Vienna this week. I skipped out yesterday and flew over to the UK for a few days. I have already written plenty about the open source workshop, and I will write more soon about the hackathon. But I thought I'd pass on my highlights from the Automated Interpretation session on Tuesday. In light of Monday's discussion, I made a little bit of a nuisance of myself by asking the same post-paper question every time I got the chance:

Can I use your code, either commercially or for free?

I'll tell you how the authors responded.


The universal character of salt

I especially enjoyed the presentation by Anders Waldeland and Anne Solberg (University of Oslo) on automatically detecting salt in 3D seismic. (We've reported on Anne Solberg's work before.) Anders described training eight different classifiers, from a simple nearest mean to a neural network, a support vector machine, and a mixture of Gaussians classifier. Interestingly, but not surprisingly, the simplest model turned out to be the most effective at discrimination. He also tried a great many seismic attributes, letting the model choose the best ones. Three attributes consistently proved most useful: coherency, Haralick energy (a GLCM-based texture attribute), and the variance of the kurtosis of the amplitude distribution (how's that for meta?). What was especially interesting about his approach was that he was training the models on one dataset, and predicting on an entirely different 3D. The idea is that this might reveal the universal seismic characteristics of salt. When I asked the golden question, he said "Come and talk to me", which isn't a "yes", but it isn't a "no" either.
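For the curious, a 'nearest mean' classifier is just a nearest-centroid rule in attribute space. A toy version, with made-up 2D attribute vectors standing in for the real seismic attributes, looks something like this:

```python
import numpy as np

# Toy nearest-mean (nearest-centroid) classifier, the simplest of the
# eight models Waldeland described. The data here are synthetic stand-ins
# for attribute vectors (e.g. coherency, GLCM energy); not the real study.

def fit_centroids(X, y):
    """Mean attribute vector for each class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(X, centroids):
    """Assign each sample to the class with the nearest centroid."""
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

# Train on one 'survey', predict on another — the cross-dataset idea.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 0.5, (50, 2)),    # class 0: not salt
                     rng.normal(2, 0.5, (50, 2))])   # class 1: salt
y_train = np.array([0] * 50 + [1] * 50)
centroids = fit_centroids(X_train, y_train)

X_new = np.array([[0.1, -0.2], [2.1, 1.9]])
print(predict(X_new, centroids))  # [0 1]
```

With only a mean vector per class to estimate, it's easy to see why such a model might generalize from the Gulf of Mexico to the North Sea better than something with thousands of parameters.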

Waldeland and Solberg 2016. Salt probability in a North Sea dataset (left) and the fully tracked volume (right). The prediction model was trained on a Gulf of Mexico dataset. Copyright of the authors and EAGE, and used under a Fair Use claim.

Secret horizon tracker

Horizons tracked with Figueiredo et al's machine learning algorithm. The horizons correctly capture the discontinuities. Copyright of the authors and EAGE. Used under a Fair Use claim.

The most substantial piece of machine learning I saw was from Eduardo Figueiredo of the Pontifical Catholic University in Rio, in the same session as Waldeland. He's using a neural net called Growing Neural Gas to classify (or 'label') the input data in a number of different ways. This training step takes a little time. The label sets can then be compared to decide on the similarity between two samples, based on the number of labels the samples have in common, but also including a comparison to the original seed, which essentially acts as a brake to stop things running away. This progresses the pick. If a decision can't be reached, a new global seed is selected randomly. If that doesn't work, picking stops. Unfortunately he did not show a comparison to an ordinary autotracker, either in terms of time or quality, but the results did look quite good. The work was done 'in cooperation with Petrobras', so it's not surprising the code is not available. I was a bit surprised that Figueiredo was unable to share even basic details of the implementation.
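Reading between the lines, the similarity-and-brake logic might look something like the sketch below. The label-overlap rule and the threshold are pure guesswork on my part, since the implementation details weren't shared:

```python
# Speculative sketch of the pick-advancement rule Figueiredo described:
# each sample carries one label per classification, and a candidate pick
# is accepted only if it resembles both the current sample and the
# original seed (the 'brake'). All details here are invented.

def similarity(labels_a, labels_b):
    """Fraction of labels the two samples have in common."""
    shared = sum(1 for a, b in zip(labels_a, labels_b) if a == b)
    return shared / len(labels_a)

def accept_pick(candidate, current, seed, threshold=0.6):
    """Advance the pick only if the candidate is similar enough to
    both the current sample and the seed."""
    return (similarity(candidate, current) >= threshold and
            similarity(candidate, seed) >= threshold)

seed     = [3, 1, 7, 2, 5]   # labels from five different classifications
current  = [3, 1, 7, 2, 4]
good     = [3, 1, 7, 0, 4]   # similar to both current and seed
drifting = [3, 0, 2, 0, 4]   # wandering away from the horizon

print(accept_pick(good, current, seed))      # True
print(accept_pick(drifting, current, seed))  # False
```

The seed comparison is what keeps the tracker from gradually drifting onto a different event, one plausible-looking step at a time.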

More on interpretation

The other two interesting talks in the session were from Paul de Groot (dGB Earth Sciences) and Gaynor Paton (GeoTeric). Paul introduced the new Thalweg Tracker in OpendTect — the only piece of software from the session that you can actually run, albeit for a fee — which is a sort of conservative voxel tracker. Unsurprisingly, Paul was also very thorough with his examples, and his talk served as a tutorial in how to make use of, and give attribution to, open data. (I'm nearly done with the grumbling about openness for now, I promise, but I can't help mentioning that I find it a bit ironic that those scientists unwilling to share their work are also often a bit lax with giving credit to others whose work they depend on.)

Gaynor's talk was about colour, which you may know we enjoy thinking about. She had gathered 24 seismic interpreters, five of whom had some form of colour deficiency. She gave the group some interpretation tasks, and tried to gauge their performance in the tasks. It seemed interesting enough, and I immediately wondered if we could help out with Pick This, especially to help grow the sample size, and by blinding the study. But the conclusion seemed to be that, if there are ways in which colour blind interpreters are less capable at image interpretation, for example where hue is important, they compensate for it by interpreting other aspects, such as contrast and shape. 

Paton's research into how colour deficient people interpret attributes. There were 5 colour deficient subjects and 19 colour normal. The colour deficient subjects were more sensitive to subtle changes in saturation and to feature shapes. Image copyright Paton and EAGE, and used here under a fair use claim.

That's it for now. I have a few other highlights to share; I'll try to get to them next week. There was a bit of buzz around the Seismic Apparition talks from ETHZ and Statoil, for example. If you were at the conference, I'd love to hear your highlights too, please share them in the comments.

References

A.U. Waldeland* (University of Oslo) & A.H.S. Solberg (University of Oslo). 3D Attributes and Classification of Salt Bodies on Unlabelled Datasets. 78th EAGE Conference & Exhibition 2016. DOI 10.3997/2214-4609.201600880. Available in EarthDoc.

M. Pelissier (Dagang Zhaodong Oil Company), C. Yu (Dagang Zhaodong Oil Company), R. Singh (dGB Earth Sciences), F. Qayyum (dGB Earth Sciences), P. de Groot* (dGB Earth Sciences) & V. Romanova (dGB Earth Sciences). Thalweg Tracker - A Voxel-based Auto-tracker to Map Channels and Associated Margins. 78th EAGE Conference & Exhibition 2016. DOI 10.3997/2214-4609.201600879. Available in EarthDoc. 

G. Paton* (GeoTeric). The Effect of Colour Blindness on Seismic Interpretation. 78th EAGE Conference & Exhibition 2016. DOI 10.3997/2214-4609.201600883. Available in EarthDoc.

A.M. Figueiredo* (Tecgraf / PUC-Rio), J.P. Peçanha (Tecgraf / PUC-Rio), G.M. Faustino (Tecgraf / PUC-Rio), P.M. Silva (Tecgraf / PUC-Rio) & M. Gattass (Tecgraf / PUC-Rio). High Quality Horizon Mapping Using Clustering Algorithms. 78th EAGE Conference & Exhibition 2016. DOI 10.3997/2214-4609.201600878. Available in EarthDoc.

Open source geoscience is _________________

As I wrote yesterday, I was at the Open Source Geoscience workshop at EAGE Vienna 2016 on Monday. Happily, the organizers made time for discussion. However, what passes for discussion in the traditional conference setting is, as I've written before, stilted.

What follows is not an objective account of the proceedings. It's more of a poorly organized collection of soundbites and opinions with no real conclusion... so it's a bit like the actual discussion itself.

TL;DR The main take home of the discussion was that our community does not really know what to do with open source software. We find it difficult to see how we can give stuff away and still make loads of money. 

I'm not giving away my stuff

Paraphrasing a Schlumberger scientist:

Schlumberger sponsors a lot of consortiums, but the consortiums that will deliver closed source software are our favourites.

I suppose this is a way to grab competitive advantage, but of course there are always the other consortium members so it's hardly an exclusive. A cynic might see this position as a sort of reverse advantage — soak up the brightest academics you can find for 3 years, and make sure their work never sees the light of day. If you patent it, you can even make sure no-one else gets to use the ideas for 20 years. You don't even have to use the work! I really hope this is not what is going on.

I loved the quote Sergey Fomel shared; paraphrasing Matthias Schwab, his former advisor at Stanford: 

Never build things you can't take with you.

My feeling is that if a consortium only churns out closed source code, then it's not too far from being a consulting shop. Apart from the cheap labour, cheap resources, and no corporation tax.

Yesterday, in the talks in the main stream, I asked most of the presenters how people in the audience could go and reproduce, or simply use, their work. The only thing that was available was a commercial OpendTect plugin of dGB's, and one free-as-in-beer MATLAB utility. Everything else was unavailable for any kind of inspection, and in one case the individual would not even reveal the technology framework they were using.

Support and maintenance

Paraphrasing a Saudi Aramco scientist:

There are too many bugs in open source, and no support. 

The first point is, I think, a fallacy. It's like saying that Wikipedia contains inaccuracies. I'd suggest that open source code has about the same number of bugs as proprietary software. Software has bugs. Some people think open source is less buggy; as Linus Torvalds said: "Given enough eyeballs, all bugs are shallow." Kristofer Tingdahl (dGB) pointed out that the perceived lack of support is a business opportunity for the open source community. Another participant mentioned the importance of having really good documentation. That costs money of course, which means finding ways for industry to support open source software development.

The same person also said something like:

[Open source software] changes too quickly, with new versions all the time.

...which says a lot about the state of application management in many corporations and, again, may represent an opportunity rather than a threat to the open source movement.

Only in this industry (OK, maybe a couple of others) will you hear the impassioned cry, "Less change!" 

The fog of torpor

When a community is falling over itself to invent new ways to do things, create new value for people, and find new ways to get paid, few question the sharing and re-use of information. And by 'information' I mean code and data, not a few PowerPoint slides. Certainly not all information, but lots. I don't know which is the cause and which is the effect, but the correlation is there.

In a community where invention is slow, on the other hand, people are forced to be more circumspect, and what follows is a cynical suspicion of the motives of others. Here's my impression of the dynamic in the room during the discussion on Monday, and of course I'm generalizing horribly:

  • Operators won't say what they think in front of their competitors
  • Vendors won't say what they think in front of their customers and competitors
  • Academics won't say what they think in front of their consortium sponsors
  • Students won't say what they think in front of their advisors and potential employers

This all makes discussion a bit stilted. But it's not impossible to have group discussions in spite of these problems. I think we achieved a real, honest conversation in the two Unsessions we've done in Calgary, and I think the model we used would work perfectly in all manner of technical and non-technical settings. We just have to start doing it. Why our convention organizers feel unable to try new things at conferences is beyond me.

I can't resist finishing on something a person at Chevron said at the workshop:

I'm from Chevron. I was going to say something earlier, but I thought maybe I shouldn't.

This just sums our industry up.

Open source FWI, I mean geoscience

I'm being a little cheeky. Yesterday's Open Source Geoscience workshop at EAGE was not really only about full waveform inversion (FWI). True, it was mostly about geophysics, but there was quite a bit of new stuff too.

But there was quite a bit on FWI.

The session echoed previous EAGE sessions on the same subject in 2006 and 2012, and was chaired by Filippo Broggini (of ETH Zürich), Sergey Fomel (University of Texas), Thomas Günther (LIAG Hannover), and Russell Hewett (Total, unfortunately not present). It started with a look at core projects like Madagascar and OpendTect. There were some (for me) pretty hard core, mathematics-heavy contributions. And we got a tour of some new and newish projects that are seeking users and/or contributors. Rather than attempting to cover everything, I'm going to exercise my (biased and ill-gotten) judgment and focus on some highlights from the day.

Filippo Broggini started by reminding us of why Joe Dellinger (BP) started this recurrent workshop a decade ago. Here's how Joe framed the value of open source to our community:

The economic benefits of a collaborative open-source exploration and production processing and research software environment would be enormous. Skilled geophysicists could spend more of their time doing innovative geophysics instead of mediocre computer science. Technical advances could be quickly shared and reproduced instead of laboriously re-invented and reverse-engineered. Oil companies, contractors, academics, and individuals would all benefit.

Did I mention that he wrote that 10 years ago?

Lessons learned from the core projects

Kristofer Tingdahl (dGB) then gave the view from his role as CEO of dGB Earth Sciences, the company behind OpendTect, the free and open source geoscience interpretation tool. He did a great job of balancing the good (their thousands of users, and their SEG Distinguished Achievement Award 2016) with the less good (the difficulty of building a developer community, and the struggle to get beyond only hundreds of paying users). His great optimism and natural business instinct filled us all with hope.

The irrepressible Sergey Fomel summed up 10 years of Madagascar's rise. In the journey from v0.9 to v2.0, the project has moved from SourceForge to GitHub, gone from 6 to 72 developers, jumped from 30 to 260 reproducible papers, and been downloaded over 40 000 times. He also shared the story of his graduate experience at Stanford, where he was involved in building the first 'reproducible science' system with Jon Claerbout in the early 1990s. Un/fortunately, it turned out to be unreproducible, so he had to build Madagascar.

It's not (yet?) a core project, but John Stockwell (Colorado School of Mines) talked about OpenSeaSeis and barely mentioned SeismicUnix. This excellent little seismic processing project is now owned by CSM, after its creator, Bjoern Olofsson, had to give it up when he went to work for a corporation (makes sense, right? o_O ). The tool includes SeaView, a standalone SEGY viewer, as well as a graphical processing flow composer called XSeaSeis. It prides itself on its uber-simple architecture (below). Want a gain step? Write gain.so and you're done. Perfect for beginners.

Jeffrey Shragge (UWA), Bob Clapp (SEP), and Bill Symes (Rice) provided some perspective from groups solving big math problems with big computers. Jeff talked about coaxing Madagascar — or M8R as the cool kids apparently refer to it — into the cloud, where it can chomp through 100 million core hours without setting things on fire. This is a way for small enterprises and small (underfunded) research teams to get big things done. Bob told us about a nifty-looking HTML5 viewer for subsurface data... which I can't find anywhere. And Bill talked about 'mathematical fidelity' and its application to solving large, expensive problems without creating a lot of intermediate data. His message: the mathematics should provide the API.

New open source tools in geoscience

The standout of the afternoon for me was University of Vienna post-doc Eun Young Lee's talk about BasinVis. The only MATLAB code we saw — so not truly open source, though it might be adapted to GNU Octave — and the only strictly geological package of the day. To support her research, Eun Young has built a MATLAB application for basin analysis, complete with a GUI and some nice visuals. This one shows a geological surface, gridded in the tool, with a thickness map projected onto the 'floor' of the scene:

I'm poorly equipped to write much about the other projects we heard about. For the record and to save you a click, here's the list [with notes] from my 'look ahead' post:

  • SES3D [presented by Alexey Gokhberg], a package from ETHZ for seismic modeling and inversion.
  • OpenFOAM [Gérald Debenest], a new open source toolbox for fluid mechanics.
  • PyGIMLi [Carsten Rücker], a geophysical modeling and inversion package.
  • PySIT [Laurent Demanet], the Python seismic imaging toolbox that Russell Hewett started while at MIT.
  • Seismic.jl [Nasser Kazemi] and jInv [Eldad Haber], two [modeling and inversion] Julia packages.

My perception is that there is a substantial amount of overlap between all of these packages except OpenFOAM. If you're into FWI you're spoilt for choice. Several of these projects are at the heart of industry consortiums, so it's a way for corporations to sponsor open source projects, which is awesome. However, most of them said they have closed-source components which only the consortium members get access to, so clearly the messaging around open source — the point being to accelerate innovation, reduce bugs, and increase value for everyone — is missing somewhere. There's still this idea that secrecy begets advantage begets profit, but this idea is wrong. Hopefully the other stuff, which may or may not be awesome, gets out eventually.


I gave a talk at the end of the day, about ways I think we can get better at this 'openness' thing, whatever it is. I will write about that some time soon, but in the meantime you're welcome to see my slides here.

Finally, a little time — two half-hour slots — was set aside for discussion. I'll have a go at summing that up in another post. Stay tuned!

BasinVis image © 2016 Eun Young Lee, used with permission. OpenSeaSeis image © 2016 Center for Wave Phenomena

READY PLAYER 1

The Subsurface Hackathon 2016 is over! Seventeen hackers gathered for the weekend at Impact HUB Vienna — an awesome venue and coworking space — and built geoscience-based games. I think it was the first geoscience hackathon in Europe, and I know it was the first time a bunch of geoscientists have tried to build games for each other in a weekend.

What went on 

The format of the event was the same as previous events: gather on Saturday, imagine up some projects, start building them by about 11 am, and work on them until Sunday at 4. Then some demos and a celebration of how amazingly well things worked out. All interspersed with coffee, food, and some socializing. And a few involuntary whoops of success.

What we made

The projects were all wonderful, but in different ways. Here's a quick look at what people built:

  • Trap-tris — a group of lively students from the University of Leeds and the Technical University of Denmark built a version of Tetris that creates a dynamic basin model. 
  • Flappy Seismic — another University of Leeds student, one from Imperial College, and a developer from Roxar, built a Flappy Bird inspired seismic interpretation game.
  • DiamonChaser (sic) — a team of devs from Giga Infosystems in Freiberg built a very cool drilling simulation game (from a real geomodel) aimed at young people.
  • Guess What — a developer from Spain and two students from UNICAMP in Brazil built a 'guess the reflection coefficient' game for inverting seismic.

I will write up the projects properly in a week or two (this time I promise :) so you can see some screenshots and links to repos and so on... but for now here are some more pictures of the event.

The fun this year was generously sponsored by EMC. David Holmes, the company's CTO (Energy), spent his weekend hanging out at the venue, graciously mentoring the teams, helping to provide some perspective or context, and even carrying pizza boxes through the streets of Vienna when it was needed.


Click on the hackathon tag below to read about previous hackathons

Look ahead to EAGE 2016

I'm in Vienna for the 78th EAGE Conference and Exhibition, at Wien Messe, starting on Sunday. And, of course, for the Subsurface Hackathon, which I've already mentioned a few times. 

The hackathon is, as usual, over the weekend. It starts tomorrow, in this amazing coworking space. That's @JesperDramsch there, getting ready for the hackathon!

I know this doesn't suit everyone, but weekdays don't suit everyone either. I've also always wanted to get people out of 'work mode', into the idea that they can create whatever they want. Maybe we'll try one during the week some time; do let me know what you think about it in the comments. Feedback helps.

Don't worry, you will hear more about the hackathon. Stay tuned.

Open source software in applied geosciences

The first conference event I'll be at is the workshop on open source software. This follows up on similar get-togethers in Copenhagen in 2012 and in Vienna in 2006. I hope the fact that the inter-workshop interval is getting shorter is a sign that open source geoscience software is gaining traction!

The workshop is being organized by Filippo Broggini (of ETH Zürich), Sergey Fomel (University of Texas), Thomas Günther (LIAG Hannover), and Russell Hewett (Total). They have put together a great-looking program. In the morning, Kristofer Tingdahl (CEO of dGB Earth Sciences) will talk about business models for open source. Then Sergey Fomel will update us on the Madagascar seismic processing toolbox. Finally, in a series of talks, Jeff Shragge (Univ. Western Australia), Bob Clapp (Stanford), and Bill Symes (Rice) will talk about using Madagascar and other geophysical imaging and inversion tools at a large scale and in parallel.

After lunch, there's a veritable parade of updates and new stuff, with all of these projects checking in:

  • OpenSeaSeis, which raised a lot of eyebrows in 2012 for its general awesomeness. Now a project at Colorado School of Mines.
  • SES3D, a package from ETHZ for seismic waveform modeling and inversion.
  • BasinVis, a MATLAB program for modeling basin fill and subsidence (woo! Open source geology!!)
  • OpenFOAM, a new open source toolbox for fluid mechanics.
  • PyGIMLi, a geophysical modeling and inversion package.
  • PySIT, the Python seismic imaging toolbox that Russell Hewett started while at MIT.
  • Seismic.jl and jInv (that's j-inv), two Julia packages you need to know about.

Aaaand at the very end of the day is a talk from yours truly on 'stuff we can do to get more open source goodness in geoscience'. I'll post some version of the talk here when I can.

Talks and stuff

I don't have any plans for Tuesday and Wednesday, other than taking in some talks and posters. I'm missing Thursday. Picking talks is hard, especially when there are 15 (yup) parallel sessions,... and that's just the oral presentations. (Hey! Conference organizer people! That's crazy!) These conference apps that get ever-so-slightly-better each year won't be really useful until they include a recommendation engine of some sort. I'd like two kinds of recommendation: "stuff that's aligned with my interests but you will disagree with everyone in there", and "stuff that doesn't seem to be aligned with my interests, but maybe it really is".

Oh and also "stuff that isn't too far away from the room I'm in right now because I only have 80 seconds to get there".

Anyway, I haven't chosen my sessions yet, let alone started to trawl through the talk titles. You can probably guess the session titles — Carbonate Petrophysics, Multiple Attenuation, Optimizing Full Waveform Marine Acquisition for Quantitative Exploration II (just kidding).

There are some special sessions I may seek out. There's one for professional women in geoscience and engineering, and two for young professionals, one of which is a panel discussion. Then there are two 'dedicated sessions': Integrated Data for Geological and Reservoir Models, and Towards Exascale Geophysical Applications, which sounds intriguing... but their programmes look like the usual strings of talks, so I'm not sure why they're singled out. There's also something called EAGE Forum, but I can't tell what that is.


Arbitrary base 10 milestone!

I don't pay as much attention to blog stats as I used to, but there is one number that I've been keeping an eye on lately: the number of posts. This humble little post is the 500th on this blog! Kind of amazing, though I'm not sure what it says about me and Evan, in terms of making sound decisions about how we spend our evenings. I mean, if each post is 600 words,... that's two good-sized novels!

I'm not saying they're good novels...

The images of the Impact HUB Vienna that don't have Jesper in them are CC-BY-SA by the HUB.

Poisson's controversial stretch-squeeze ratio

Before reading this, you might want to check out the previous post about Siméon Denis Poisson's life and career. Then come back here...


Physicists and mathematicians knew about Poisson's ratio well before Poisson got involved with it. Thomas Young described it in his 1807 Lectures on Natural Philosophy and the Mechanical Arts:

We may easily observe that if we compress a piece of elastic gum in any direction, it extends itself in other directions: if we extend it in length, its breadth and thickness are diminished.

Young didn't venture into a rigorous formal definition, and it was referred to simply as the 'stretch-squeeze ratio'.

A new elastic constant?

Twenty years later, at a time when France's scientific muscle was fading along with the reign of Napoleon, Poisson published a paper attempting to restore his slightly bruised (by his standards) reputation in the mechanics of physical materials. In it, he stated that for a solid composed of molecules tightly held together by central forces on a crystalline lattice, the stretch-squeeze ratio should equal 1/2 (which is equivalent to what we now call a Poisson's ratio of 1/4). In other words, Poisson regarded the stretch-squeeze ratio as a physical constant: the same value for all solids. He claimed that 'this result agrees perfectly' with an experiment that one of his colleagues, Charles Cagniard de la Tour, had recently performed on brass.

Poisson's whole-hearted subscription to the corpuscular school certainly prejudiced his work. But the notion of discovering a new physical constant, as Newton did for gravity, or as Einstein would eventually do for light, must have been a powerful driving force. A would-be singular elastic constant could unify calculations for materials soft or stiff — in contrast to the elastic moduli, which vary over several orders of magnitude.

Poisson's (silly) ratio

Later, between 1850 and 1870, the physics community gathered more evidence that the stretch-squeeze ratio differs from material to material, as more substances were deformed under increasingly reliable measurements. Worse still, de la Tour's experiments on the elasticity of brass, upon which Poisson had hung his hat, turned out to be flawed. The stretch-squeeze ratio became known as Poisson's ratio not as a tribute to Poisson, but as a way of labeling a flawed theory. Indeed, the falsehood became so apparent that it drove the scientific community towards treating elastic materials as continuous media, as opposed to ensembles of particles.

Today we define Poisson's ratio \(\nu\) in terms of strain (deformation), or Lamé's parameters \(\lambda\) and \(\mu\), or the speeds \(V_\mathrm{P}\) and \(V_\mathrm{S}\) of P- and S-waves:

\[ \nu \;=\; -\frac{\varepsilon_\mathrm{trans}}{\varepsilon_\mathrm{axial}} \;=\; \frac{\lambda}{2(\lambda + \mu)} \;=\; \frac{V_\mathrm{P}^2 - 2V_\mathrm{S}^2}{2\left(V_\mathrm{P}^2 - V_\mathrm{S}^2\right)} \]

Interestingly, if Poisson had turned out to be correct, and Poisson's ratio were in fact a constant, then only one elastic constant would be needed to describe an isotropic material, instead of two. It wasn't until Augustin Louis Cauchy used the notion of a stress tensor to describe the state of stress at a point within a material, with its three normal stresses and three shear stresses, that the need for two elastic constants became apparent. Tensors gave the mathematical framework to define Hooke's law in three dimensions. Continuum mechanics, found in the opening chapter of any modern textbook on seismology or mechanical engineering, stands as an advancement in science that set out to undo Poisson's famously false deduction, backed as it was by insufficient data.
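The velocity and Lamé forms of the definition are easy to check numerically. Here's a minimal Python sketch (the function names are mine) confirming that Poisson's proposed 'constant' of 1/4 corresponds to \(\lambda = \mu\), or equivalently a \(V_\mathrm{P}/V_\mathrm{S}\) ratio of \(\sqrt{3}\):

```python
import math

def nu_from_velocities(vp, vs):
    """Poisson's ratio from P- and S-wave velocities."""
    return (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))

def nu_from_lame(lam, mu):
    """Poisson's ratio from Lame's parameters lambda and mu."""
    return lam / (2 * (lam + mu))

# Poisson's 'universal' value of 1/4 falls out when lambda = mu,
# which is the same thing as Vp/Vs = sqrt(3):
print(nu_from_lame(1.0, 1.0))                           # 0.25
print(round(nu_from_velocities(math.sqrt(3), 1.0), 6))  # 0.25
```

For most real rocks \(V_\mathrm{P}/V_\mathrm{S}\) is nearer 2, giving \(\nu \approx 1/3\) — one more way to see that the ratio is no constant.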

References

Greaves, G N (2013). Poisson's ratio over two centuries: challenging hypotheses. Notes & Records of the Royal Society 67, 37-58. DOI: 10.1098/rsnr.2012.0021

Editorial (2011). Poisson's ratio at 200. Nature Materials 10 (11). Available online.

 

Great geophysicists #13: Poisson

Siméon Denis Poisson was born in Pithiviers, France, on 21 June 1781. While still a teenager, Poisson entered the prestigious École Polytechnique in Paris, and published his first papers in 1800. He was immediately befriended — or adopted, really — by Lagrange and Laplace. So it's safe to say that he got off to a pretty good start as a mathematician. The meteoric trajectory continued throughout his career, as Poisson received more or less every honour a French scientist could accumulate. Along with Laplace and Lagrange — as well as Fresnel, Coulomb, Lamé, and Fourier — his is one of the 72 names on the Eiffel Tower.

Wrong Poisson

In the first few decades following the French Revolution, which ended in 1799, France enjoyed a golden age of science. The Société d'Arcueil was a regular meeting of savants, hosted by Laplace and the chemist Claude Louis Berthollet, and dedicated to the exposition of physical phenomena. The group worked on problems like the behaviour of gases, the physics of sound and light, and the mechanics of deformable materials. Using Newton's then 120-year-old law of gravitation as an analogy, the prevailing school of thought accounted for all physical phenomena in terms of forces acting between particles.

Poisson was not flawless. As one of the members of this intellectual inner circle, Poisson was devoted to the corpuscular theory of light. Indeed, he dismissed the wave theory of light completely, until proven wrong by Thomas Young and, most conspicuously, Augustin-Jean Fresnel. Even Poisson's ratio, the eponymous elastic modulus, wasn't the result of his dogged search for truth, but instead represents a controversy that drove the development of the three-dimensional theory of elasticity. More on this next time.

The workaholic

Although he did make time for his wife and four children (but only after 6 pm), Poisson apparently had little time for much besides mathematics. His catchphrase was

Life is only good for two things: doing mathematics and teaching it.

In the summer of 1838, he learned he had a form of tuberculosis. According to James (2002), he was unable to take time away from work for long enough to recuperate. Eventually, insisting on conducting the final exams at the Polytechnique for the 23rd year in a row, he took on more than he could handle. He died on 20 April 1840. 


References

Grattan-Guinness, I. (1990). Convolutions in French Mathematics, 1800-1840: From the Calculus and Mechanics to Mathematical Analysis and Mathematical Physics. Vol.1: The Setting. Springer Science & Business Media. 549 pages.

James, I (2002). Remarkable Mathematicians: From Euler to Von Neumann. Cambridge University Press, 433 pages.

The University of St Andrews MacTutor archive article on Poisson.

ORCL vs GOOG: the $9 billion API

What's this? Two posts about the legal intricacies of copyright in the space of a fortnight? Before you unsubscribe from this definitely-not-a-law-blog, please read on because the case of Oracle America, Inc vs Google, Inc is no ordinary copyright fight. For a start, the damages sought by Oracle in this case [edit: could] exceed $9 billion. And if they win, all hell is going to break loose.

The case is interesting for some other reasons besides the money and the hell breaking loose thing. I'm mostly interested in it because it's about open source software. Specifically, it's about Android, Google's open source mobile operating system. The claim is that the developers of Android copied 37 application programming interfaces, or APIs, from the Java software environment that Sun released in 1995 and Oracle acquired in its $7.4 billion acquisition of Sun in 2010. There were also claims that they copied specific code, not just the interface the code presents to the user, but it's the API bit that's interesting.

What's an API then?

You might think of software in terms of applications like the browser you're reading this in, or the seismic interpretation package you use. But this is just one, very high-level, type of software. Other, much lower-level software runs your microwave. In between, developers use software to build software: these middle levels contain FORTRAN libraries for tasks like signal processing, tools for making windows and menus appear, and still others for drawing shapes or checking spelling. You can think of an API as a user interface for programmers. Where the user interface in a desktop application might have menus and dialog boxes, the interface for a library has classes and methods — pieces of code that hold data or perform tasks. A good API can be a pleasure to use. A bad API can make grown programmers cry. Or at least make them write a new library.

The Android developers didn't think the Java API was bad. In fact, they loved it. They tried to license it from Sun in 2007 and, when Sun was bought by Oracle, from Oracle. When this didn't work out, they locked themselves in a 'cleanroom' and wrote a runtime environment called Dalvik. It implemented the same API as the Java Virtual Machine, but with new code. The question is: does Oracle own the interface — the method names and syntaxes? Are APIs copyrightable?
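To make the distinction concrete, here is a toy sketch in Python (the class and method names are invented, and have nothing to do with the real Java libraries) showing two independent implementations of one and the same interface:

```python
# Two implementations exposing the same 'API': identical class and
# method names and signatures, but entirely different code behind them.

class OriginalStrings:
    """The 'original' library."""
    def join(self, sep, parts):
        return sep.join(parts)

class CleanroomStrings:
    """A cleanroom reimplementation of the same interface."""
    def join(self, sep, parts):
        out = ""
        for i, part in enumerate(parts):
            if i > 0:
                out += sep
            out += part
        return out

# Callers can't tell the two apart; that interchangeability is
# exactly what makes the interface itself so valuable.
for lib in (OriginalStrings(), CleanroomStrings()):
    print(lib.join(", ", ["a", "b", "c"]))  # a, b, c (twice)
```

The lawsuit is, in effect, about whether the method names and signatures in a sketch like this can be owned, quite apart from the code inside them.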

I thought this case ended years ago?

It did. Google already won the argument once, on 31 May 2012, when the court held that APIs are "a system or method of operations" and therefore not copyrightable. Here's the conclusion of that ruling:

The original 2012 holding that Google did not violate the Copyright Act by copying 37 of Java's interfaces. Click for the full PDF.


But it went to the Federal Circuit Court of Appeals, Google's petition for 'fair use' was denied, and the decision was sent back to the district court for a jury trial to decide on Google's defence. So now the decision will be made by 10 ordinary citizens... none of whom know anything about programming. (There was a computer scientist in the pool, but Oracle sent him home. It's okay --- Google sent a free-software hater packing.)

This snippet from one of my favourite podcasts, Leo Laporte's Triangulation, is worth watching. Leo is interviewing James Gosling, the creator of Java, who was involved in some of the early legal discovery process...

Why do we care about this?

The problem with all this is that, when it comes to open source software and the Internet, APIs make the world go round. As the Electronic Frontier Foundation argued on behalf of 77 computer scientists (including Alan Kay, Vint Cerf, Hal Abelson, Ray Kurzweil, Guido van Rossum, and Peter Norvig) in its amicus brief for the Supreme Court, we need uncopyrightable interfaces to get computers to cooperate. This is what drove the personal computer explosion of the 1980s, the Internet explosion of the 1990s, and the cloud computing explosion of the 2000s, and most people seem to think those were awesome. The current bot explosion also depends on APIs, but the jury is out (lol) on how awesome that one is.

The trial continues. Google concluded its case yesterday, and Oracle called its first witness, co-CEO Safra Catz. "We did not buy Sun to file this lawsuit," she said. Reassuring, but if they win there's going to be a lot of that going around. A lot.

For a much more in-depth look at the story behind the trial, this epic article by Sarah Jeong is awesome. Follow the rest of the events over the next few days on Ars Technica, Twitter, or wherever you get your news. Meanwhile on Agile*, we will return to normal geophysical programming, I promise :)


ADDENDUM on 26 May 2016... Google won the case with the "fair use" argument. So the appeal court's decision that APIs are copyrightable stands, but the jury were persuaded that this particular instance qualified as fair use. Oracle will appeal.