Shared posts

04 Sep 04:38

My Favorite Charts

by Robert Kosara

There are many charts I hate, because they’re badly done, sloppy, meaningless, deceiving, ugly, or for any number of other reasons. But then there are the ones I keep coming back to because they’re just so clear, well-designed, and effective.

All of these are a few years old. Insert a fine wine analogy here: it probably takes a few years of a chart coming up again and again in conversation, and of reaching for it when looking for examples, to realize just how good it is.

Scatterplot

My favorite scatterplot, and perhaps my favorite chart ever, is Why Is Her Paycheck Smaller? by Hannah Fairfield and Graham Roberts. It shows men’s versus women’s weekly earnings, with men on the horizontal axis and women on the vertical. A heavy black diagonal line shows equal wages, three additional lines show where women make 10%, 20%, and 30% less. Any point to the bottom right of the line means that women make less money than men.

Why Is Her Paycheck Smaller?

The diagonal lines are a stroke of genius (pun fully intended). When you see a line in a scatterplot, it’s usually a regression line that models the data; i.e., a line that follows the points. But such a line only helps reinforce the difficulty of judging the differences between the two axes, which is something we’re not good at, and which is not typically something you do in a scatterplot anyway.

But the diagonal line, as simple as it is, makes it not just possible, but effortless. It’s such a simple device and yet so clear and effective. All the points on the line indicate occupations where men and women make the same amount of money. To the top left of the line is the area where women make more money than men, and to the bottom right where women make less.

The additional lines show 10%, 20%, and 30% less for women. If it’s hard to tell if a point is lying on the main diagonal of a scatterplot, it becomes impossible to guess the percentage it is off. The additional lines make it possible to guesstimate that number to within a few percent. That is a remarkable level of precision, and it is achieved with three simple lines.
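The reading those lines make possible can be sketched numerically: classify each point by which 10% band below the parity diagonal it falls into. A minimal sketch, with invented occupations and wages (not the chart's actual data):

```python
def pay_gap_band(men_weekly, women_weekly):
    """Gap as a fraction of men's pay, plus the 10%-wide band it falls in.

    Positive gap: women earn less. Band 0 is [0%, 10%), band 1 is
    [10%, 20%), and so on; band -1 means women earn more.
    """
    gap = 1 - women_weekly / men_weekly
    band = -1 if gap < 0 else int(gap * 10)
    return gap, band

# Hypothetical occupations and weekly wages, for illustration only
occupations = {
    "pharmacists": (1900, 1800),
    "teachers": (1000, 850),
    "personal financial advisors": (1500, 1000),
}
for job, (m, w) in occupations.items():
    gap, band = pay_gap_band(m, w)
    print(f"{job}: women earn {gap:.0%} less (band {band})")
```

The bands are exactly what the three extra diagonals give the eye: a precision of about ten percentage points without reading any numbers.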

There is some interactivity: mousing over points brings up a tooltip that shows the occupation the point represents and how much more one gender makes than the other. Filters in the top left corner let you focus on just a small number of occupations, which include annotations for a few select jobs.

But the key element is the inclusion of the reference lines that help people make sense of the scatterplot and read it with a high level of precision. Simple but effective, and powerful.

Line Chart

My favorite line chart is The Jobless Rate for People Like You by Shan Carter, Amanda Cox, and Kevin Quealy. This chart is somewhat ancient, having been created in Flash and showing unemployment data from January 2007 to September 2009. But its brilliant design and interaction make it timeless.

The Jobless Rate for People Like You

It’s a line chart, but with a twist. The first thing you see is the heavy blue line, showing the overall unemployment rate. But there are more lines in the background – what are those? So you mouse over and they respond: they light up and there’s a tooltip telling you what they represent. Each is the unemployment rate for a subset of the population, defined as the combination of race, gender, age group, and education. How are Hispanic men over 45 with only a high school diploma doing compared to the overall rate? What about black women 15–24? Or white college grads of any age and gender?

Clicking on a line moves the blue line there so it’s easier to see, but the overall rate stays easily visible. The y axis also rescales nicely when the values go above what it can currently display.

In addition, the filters at the top also respond to the selection to show who is selected. Clicking around inside the chart updates them. Hm, so maybe I can use those to explore too? And of course you can, broadening or narrowing your selection, or clicking through different age groups of the same subset of the population, etc.

The Human-Computer Interaction field has a nice term for an indication of more data and interaction: information scent. This is usually used with widgets, which indicate where more information can be found (like the little tick marks on the scrollbar in Chrome when you search within the page). What makes this chart so good is its clever use of information scent to entice viewers to dig deeper, explore, and ask questions.

It also brilliantly and clearly demonstrates the fact that the overall unemployment rate is a rather meaningless number. The actual rate in your demographic is likely to look very different, and the range is huge. This was the inspiration for my What Means Mean piece, though I don’t think that was nearly as clear as this.

The chart shows interesting data, explains a somewhat advanced concept, and invites people to interact with it. This comes in a package that is understated and elegant in its design. Best line chart ever.

Bar Chart

I have already written about the Bikini Chart, and it remains my favorite bar chart. It’s an incredibly effective piece of communication, and it’s all just based on a simple time series. The fact that the bars point down clearly communicates how it is supposed to be read: down is bad, less down is better than more down.

obama job loss chart

Bar charts are not exactly a common medium for artistic expression, but the designers of this chart managed to subtly but clearly get a message across.

Bubble Chart/Animated Scatterplot

Animated scatterplots may not have been invented by Hans Rosling and gapminder, but they certainly were not a common thing until his TED talk in 2007. And while it may seem a bit facetious to point to the only reasonably well-known example of a particular chart type as my favorite one, this is clearly one of my favorite charts, no matter what type.

gapminder

The animation may seem like a bit of a gimmick (and it has been criticized as not being terribly effective), but it works to communicate a number of important pieces of information.

The main piece of information, of course, is change over time. How have different countries changed in terms of their wealth, healthcare, etc.? This is reasonably effective, because there are trends, and many countries follow them. The outliers are reasonably easy to spot, especially when you can turn on trails and replay the animation. It’s not always immediately possible to see everything, but it does invite people to play and explore.

But then, there are the explanations. There is the clever animation that constructs the two-dimensional scatterplot from a one-dimensional distribution. There is the clever drill-down animation that breaks continents down into countries, and countries down into quintiles, to show the enormous range of values covered by each. This is not just a simple data display, but a way to introduce people to statistical concepts and data operations they may have heard of but don’t understand (drill-down), or never have heard of in the first place (quintiles).
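The quintile breakdown Gapminder animates is a simple data operation: split a country's values into five equal-count groups. A minimal sketch, with made-up income values:

```python
def quintiles(values):
    """Split the sorted values into five roughly equal-count groups."""
    xs = sorted(values)
    n = len(xs)
    return [xs[i * n // 5:(i + 1) * n // 5] for i in range(5)]

# Invented per-capita incomes for illustration
incomes = [500, 700, 900, 1200, 1800, 2500, 4000, 7000, 12000, 30000]
for i, group in enumerate(quintiles(incomes), start=1):
    print(f"quintile {i}: {group}")
```

The drill-down in the talk does exactly this, visually: the single bubble for a country splits into five, one per group, exposing the spread a single average hides.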

Rosling’s video, and the gapminder software, not only introduced millions of people to data they knew nothing about (the video has over 8.5 million views!), it also demonstrated how a compelling story can be told without a single photograph or other image, just with data. That is an incredible achievement that opened our eyes to the possibilities of data visualization for communication.

Appreciating Good Work

It’s easy to find, and make fun of, bad charts. But between all the pie chart bashing and general criticism of bad charts, it is equally important to find the good examples and try to figure out what makes them work so well. Even if it may be more fun to beat up the bad examples, we will ultimately learn more from understanding the design choices and ideas that went into the good ones.

05 Sep 21:09

Oil, Gas Companies Laying Groundwork for the Internet of Things


At this stage, companies are building programs internally to address how they will adopt Internet of Things technologies and how to get the most benefits in terms of managing risks, reducing costs and bringing in new revenues, said Serhii Konovalov, who leads Cisco’s O&G Global Industry Solutions Practice and O&G Innovations, in an interview with Rigzone.

Officials with Software AG are also seeing oil and gas companies starting to lay the framework and mapping their business processes to prepare their organizations for Internet of Things monitoring solutions. Software AG, which focuses on helping companies, including oil and gas companies, become digital enterprises, is seeing oil and gas companies pursue Internet of Things solutions to enhance their cash flow, while others are focused on ways to use sensor technology to achieve greater safety records.

To prepare for the Internet of Things, companies are first seeking to map their business processes to understand how work is done as a way to foster change in their corporate culture. Many firms believe that if they first focus on change management, the tools they use to facilitate operations, including technology, will fall into place.

This strategy of change includes using technology to enable processes as simple as attaching a document that outlines the methodology for a specific process. For example, a document that specifies the requirements for the type of conduit or drilling pipe a company needs can greatly enhance a buyer’s ability to procure the right supply.

“This is a long way from the Internet of Things, but is the top layer of the process automation piece that allows for sensor data to be fed into processes that will give a company enhanced visibility over its operations,” said Donald Fisher, senior director of Software AG Energy Solutions, in an interview with Rigzone.

“Rising CAPEX [capital expenditures] are increasing dramatically in the industry, so firms are very interested in controlling capital costs and being able to prioritize how they spend money,” said Fisher.

Another driver that Software AG officials are seeing in the move towards Internet of Things technology includes operational efficiency, such as being able to secure operational permits for exploration.



Karen Boman has more than 10 years of experience covering the upstream oil and gas sector. Email Karen at kboman@rigzone.com.



05 Sep 21:09

IoT-enabled devices to top 40 billion by 2020, say researchers

The number of active wireless connected devices will exceed 40.9 billion by 2020 – more than double the current total, according to ABI Research. The explosion in connected devices will be driven by the internet of things (IoT).

ABI’s estimate of 40 billion devices is significantly more than the figure suggested by Gartner in 2013, when it forecast that the IoT will see 26 billion units installed by 2020.


ABI also predicted that the installed base of active wireless connected devices will exceed 16 billion this year – 20% more than in 2013.

“The driving force behind the surge in connections is that usual buzzword suspect, the internet of things,” said ABI Research principal analyst Aapo Markkanen.

IoT is a scenario in which objects, animals or people are able to transfer data over a network and talk to each other without requiring human-to-human or human-to-computer interaction.

“If we look at this year’s installed base, smartphones, PCs and other ‘hub’ devices represent 44% of the active total, but by the end of 2020 their share is set to drop to 32%,” said Markkanen. “In other words, 75% of the growth between today and the end of the decade will come from non-hub devices – sensor nodes and accessories.”
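Markkanen's 75% figure follows from the numbers quoted above; a quick check of the arithmetic:

```python
# Hubs are 44% of a ~16 billion installed base this year, falling to
# 32% of ~40.9 billion by 2020. What share of the growth is non-hub?

base_2014, base_2020 = 16.0, 40.9          # billions, figures from the article
hub_2014 = 0.44 * base_2014
hub_2020 = 0.32 * base_2020

nonhub_growth = (base_2020 - hub_2020) - (base_2014 - hub_2014)
total_growth = base_2020 - base_2014
share = nonhub_growth / total_growth

print(f"non-hub share of growth: {share:.0%}")   # close to the quoted 75%
```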

From every technology supplier’s strategic point of view, the critical question is how this plethora of IoT devices will ultimately be connected, said ABI. Until recently, the choices that OEMs have faced have been fairly straightforward, with cellular, Wi-Fi, Bluetooth and others all generally addressing their relative comfort zones.

But going forward, they will be in increasing competition with each other, so for the suppliers, the strategic stakes are getting much higher, said Markkanen.

Practice director Dan Shey added: “The recently introduced Thread protocol, spearheaded by Nest Labs, is the clearest example of this convergence.”

Thread is a new IP-based wireless networking protocol initiated by IoT players such as Google Nest to find a new and better way to connect products in the home. It is thought that existing wireless networking approaches, devised and executed before the rise of IoT, are not capable of connecting large numbers of devices.

Thread is an IPv6 networking protocol built on open standards, designed for low-power 802.15.4 mesh networks. But with billions of devices powered by IoT, IPv4 will not be good enough, said Ian McDonald, IT director at Swiftkey.

Shey said it is not only setting the bar higher for ZigBee in the 802.15.4 space, but is also piling pressure on Bluetooth suppliers to enable mesh networking.

ZigBee is a specification for wireless personal area networks (WPANs) operating at 868MHz, 902-928MHz and 2.4GHz. Devices can communicate at speeds of up to 250kbps, and can be physically separated by up to 50 metres in typical applications, more in an ideal environment. ZigBee is based on the 802.15 specification approved by the Institute of Electrical and Electronics Engineers Standards Association (IEEE-SA).

Shey added: “In the meantime, the LTE-MTC and LTE-M initiatives may well expand the market for cellular M2M, while startups like Electric Imp and Spark could do the same for Wi-Fi. And finally, we also should not ignore what is going on with passive, proximity-based connectivity offered by RFID and NFC.”


07 Aug 15:57

The Blangy equation

by Matt Hall

After reading Chris Liner's recent writings on attenuation and negative Q — both in The Leading Edge and on his blog — I've been reading up a bit on anisotropy. The idea was to stumble a little closer to writing the long-awaited Q is for Q post in our A to Z series. As usual, I got distracted...

In his 1994 paper AVO in transversely isotropic media—An overview, Blangy (now the chief geophysicist at Hess) answered a simple question: How does anisotropy affect AVO? Stigler's law notwithstanding, I'm calling his solution the Blangy equation. The answer turns out to be: quite a bit, especially if impedance contrasts are low. In particular, Thomsen's parameter δ affects the AVO response at all offsets (except zero of course), while ε is relatively negligible up to about 30°.

The key figure is Figure 2. Part (a) shows isotropic vs anisotropic Type I, Type II, and Type III responses:

Unpeeling the equation

Converting the published equation to Python was straightforward (well, once Evan pointed out a typo — yay editors!). Here's a snippet, with the output (here's all of it):

For the plot below, I computed the terms of the equation separately for the Type II case. This way we can see the relative contributions of the terms. Note that the 3-term solution is equivalent to the Aki–Richards equation.
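The post's own snippet isn't reproduced here, but the isotropic 3-term part can be sketched as follows. The interface properties are invented, and the anisotropic add-on shown is the weak-anisotropy (VTI) form commonly quoted alongside this result, not Blangy's exact published expression; check it against the paper before relying on it.

```python
from math import radians, sin, tan

def aki_richards(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """3-term Aki-Richards P-wave reflectivity,
    R(theta) = A + B sin^2(theta) + C sin^2(theta) tan^2(theta)."""
    theta = radians(theta_deg)
    vp, dvp = (vp1 + vp2) / 2, vp2 - vp1
    vs, dvs = (vs1 + vs2) / 2, vs2 - vs1
    rho, drho = (rho1 + rho2) / 2, rho2 - rho1
    k = (vs / vp) ** 2
    a = 0.5 * (dvp / vp + drho / rho)                          # intercept
    b = 0.5 * dvp / vp - 2 * k * (drho / rho + 2 * dvs / vs)   # gradient
    c = 0.5 * dvp / vp                                         # far-offset term
    s2 = sin(theta) ** 2
    return a + b * s2 + c * s2 * tan(theta) ** 2

def vti_correction(d1, e1, d2, e2, theta_deg):
    """Weak-anisotropy (VTI) add-on, often quoted with Blangy's result:
    (delta2-delta1)/2 * sin^2 + (eps2-eps1)/2 * sin^2 * tan^2."""
    theta = radians(theta_deg)
    s2 = sin(theta) ** 2
    return 0.5 * (d2 - d1) * s2 + 0.5 * (e2 - e1) * s2 * tan(theta) ** 2

# Invented shale-over-sand interface: Vp (m/s), Vs (m/s), density (g/cc)
r0 = aki_richards(2768, 1350, 2.45, 2925, 1450, 2.11, 0)
r30 = aki_richards(2768, 1350, 2.45, 2925, 1450, 2.11, 30)
print(f"R(0) = {r0:.3f}, R(30) = {r30:.3f}")
```

Note how δ enters through the sin²θ term (all non-zero offsets) while ε only appears with tan²θ, which is why ε barely matters below about 30°.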

Interestingly, the 5-term result is almost the same as the 2-term approximation.

Reproducible results

One of the other nice features of this paper — and the thing that makes it reproducible — is the unambiguous display of the data used in the models. Often, this sort of thing is buried in the text, or not available at all. A table makes it clear:

Last thought: is it just me, or is it mind-blowing that this paper is now over 20 years old?

Reference

Blangy, J.P. (1994). AVO in transversely isotropic media—An overview. Geophysics 59 (5), 775–781.

Don't miss the IPython Notebook that goes with this post.

28 Aug 11:24

Burrowing by burning

by Matt Hall

Most kinds of mining are low-yield games. For example, the world's annual gold production would fit in a 55 m² room. But few mining operations I'm aware of are as low-yield as the one that ran in Melle, France, from about 500 till 950 CE, producing silver for the Carolingian empire and Charlemagne's coins. I visited the site on Saturday.

The tour made it clear just how hard humans had to work to bring about commerce and industry in the Middle Ages. For a start, of course they had no machines, just picks and shovels. But the Middle Jurassic limestone is silicic and very hard, so to weaken the rock they set fires against the face and thermally shocked the rock to bits. The technique, called fire-setting, was common in the Middle Ages, and was described in detail by Georgius Agricola in his book De Re Metallica (aside: the best translation of this book is by Herbert Hoover!). Apart from being stupefyingly dangerous, the method is slow: each fire got the miners about 4 cm further into the earth. Incredibly, they excavated about 20 km of galleries this way, all within a few metres of the surface.

The fires were set against the walls and fuelled with wood, mostly beech. Recent experiments have found that one tonne of wood yielded about one tonne of rock. Since a tonne of rock yields 5 kg of galena, and this in turn yields 10 g of silver, we see that producing 1.1 tonnes of silver per year — enough for 640,000 deniers — was quite a feat!
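Chaining the post's numbers together makes the scale of the operation concrete:

```python
# 10 g of silver per tonne of rock (via 5 kg of galena), one tonne of
# wood to break one tonne of rock, 1.1 tonnes of silver per year.

silver_per_tonne_rock = 10 / 1e6      # tonnes of silver per tonne of rock
rock_per_tonne_wood = 1.0             # tonnes of rock broken per tonne of wood

silver_per_year = 1.1                 # tonnes
rock_per_year = silver_per_year / silver_per_tonne_rock
wood_per_year = rock_per_year / rock_per_tonne_wood

print(f"{rock_per_year:,.0f} t of rock, and as much wood, per year")

# The coins: 1.1 t of silver struck into 640,000 deniers
grams_per_denier = 1.1e6 / 640_000
print(f"{grams_per_denier:.1f} g of silver per denier")
```

On these figures the mine consumed on the order of a hundred thousand tonnes of rock and wood a year, which makes the eventual wood shortage easy to believe.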

There are several limits to such a resource intensive operation: wood, distance from face to works, maintenance, and willing human labour, not to mention the usual geological constraints. It is thought that, in the end, operations ended due to a shortage of wood.

Several archaeologists visit the site regularly (here's one geospatial paper I found mentioning the site: Arles et al. 2013), and the evidence of their attempts to reproduce the primitive metallurgical methods was on display. Here's my attempt to label everything, based on what I could glean from the tour guide's rapid French:

The image of the denier coin is licensed CC-BY-SA by Wikipedia user Lequenne Gwendoline

25 Aug 20:06

Swipe-ables

by jeffrey
It's Katrina Week so get ready for the run of obligatory retrospectives.  Not that there's anything wrong with that.  I don't count myself among the number who would tell you this is old irrelevant news.  On the contrary, there's almost nothing going on in New Orleans today that can be understood without first understanding the effect of the flood.  So it's worth it for everyone to take time for reflection and examination. 

On the other hand, this is the week where national news outlets and producers of commentary feel obliged to weigh in with gross distortions and harmful agenda-laced false narratives.  For example, here is an article suggesting that the forced upheaval of so many lives actually held a "silver lining" for displaced residents. Get ready to read more of this as we perfect our rationalizations for the subsequent gentrification of New Orleans.  Ask around the (relatively) more affluent corners of New Orleans these days and you might be surprised to hear how many people seem to think Barbara Bush was right all along. Not that anyone you ask will remember that quote... especially if they've only recently moved here.

Anyway, what else is there?  Oh yeah, clickbait. Professional media runs more on clickbait now than it did nine years ago, so there will be plenty of that.  Keep an eye out for Buzzfeed's Katrina Remembered In GIFs of Beyonce Making Funny Faces... or maybe they've got another slideshow of Six Flags photos to share.

If you're looking for a less cynical presentation of images, though, this T-P Then and Now gallery is worth a moment of your time.
13 Aug 14:57

Wikipedia: Mass Collaboration or Mass Amateurism?

Because Wikipedia is a process, not a product, it replaces guarantees
offered by institutions with probabilities supported by process: if enough
people care enough about an article to read it, then enough people will care
enough to improve it, and over time this will lead to a large enough body
of good enough work to begin to take both availability and quality of articles
for granted, and to integrate Wikipedia into daily use by millions.
Clay Shirky (2008), Here Comes Everybody


This study intends to contribute to a better understanding of the Wiki phenomenon as a knowledge management system which aggregates private knowledge, and to check to what extent information generated through anonymous and freely bestowed mass collaboration is reliable compared with the traditional approach. To achieve that goal, we developed a comparative study between the Wikipedia and Britannica encyclopedias, in order to confront the quality of the knowledge repositories they produce. That will allow us to reach a conclusion about the efficacy of the business models behind them.
So, we intend to find out which of the scenarios represented above most accurately describes mass collaboration: the infinite monkeys theorem used by Keen in The Cult of the Amateur, or the ode to the “power of the masses” of Tapscott & Williams in Wikinomics.
We used a representative random sample composed of articles that appear in both encyclopedias. Each pair of articles was first reformatted to hide its source and then graded by an expert in its subject area using a five-point scale. We asked the experts to concentrate only on intrinsic aspects of the articles’ quality, namely accuracy and objectivity, and to set aside the contextual, representational and accessibility aspects. Whenever possible, the experts invited to participate in the study were university teachers, because they are used to grading students’ work without relying on the reputation of the source.
The articles were divided into four main categories: Arts & Entertainment, History & Society, Science & Technology and Travel & Geography. Each main category was further subdivided in order to find the most suitable expert to evaluate it. The average results obtained are presented below:

To estimate the impact of having only one evaluator per article (a constraint imposed by the sample size and the articles’ length), we collected a small convenience sample consisting only of Management articles. Each pair of articles was graded by several experts in order to determine the uncertainty associated with having diverse gradings of the same article. The uncertainty indicators obtained are α = 0.9 and MAD = 0.6.
To normalize the results further, we transformed the {1, 2, 3, 4, 5} scale used by each evaluator into a nine-point scale {-4, -3, -2, -1, 0, 1, 2, 3, 4}, taking for each article the difference between the Wikipedia grade and the corresponding Britannica grade. This step allows us to concentrate on the difference in quality and mitigate the noise induced by any differences among evaluators in how they interpret the absolute scale.
To deal with the lack of Geography grades, we found a significant nonlinear correlation between the average difference in the number of words per article (grouped by grade difference) and the difference in grades, shown in the following chart:
Using that relation, we predicted that the global average grade difference of the Geography articles would be 1.7. The function cannot be used to predict the difference for an individual pair of articles, but in terms of the global result, and despite all the assumptions and weaknesses of this approach, it is one indicator that a future assessment of these articles may lead to a global value of the same order of magnitude as those obtained in the assessed areas.
In global terms, and setting aside the Geography articles, the conclusion was that the average quality of the Wikipedia articles analyzed was superior to that of their Britannica counterparts, and that this difference was statistically significant. The graphic below shows that 90% of the Wikipedia articles were judged to be of equivalent or better quality than their Britannica counterparts.

The global value of the difference among the pairs of articles assessed has an average of 1.4 ± 0.9 (average absolute deviation) and a median of 1.0 ± 0.6 (median absolute deviation). Those uncertainty values were calculated using the uncertainty of the Management article pairs’ grades, assessed by multiple evaluators as mentioned above.
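The difference-scale transformation and the MAD-style uncertainty can be sketched with made-up grade pairs. The sign convention is an assumption here (positive meaning the Wikipedia article was graded higher), chosen to be consistent with the results reported:

```python
from statistics import median

def diff_scale(wiki_grade, britannica_grade):
    """Map a pair of 1-5 grades onto the study's nine-point {-4..4}
    difference scale. Assumed convention: positive means the
    Wikipedia article was graded higher."""
    return wiki_grade - britannica_grade

# Made-up (Wikipedia, Britannica) grade pairs for illustration
pairs = [(4, 3), (5, 3), (3, 3), (4, 2), (2, 3)]
diffs = [diff_scale(w, b) for w, b in pairs]

med = median(diffs)
mad = median(abs(d - med) for d in diffs)    # median absolute deviation
print("differences:", diffs, "median:", med, "MAD:", mad)
```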
In parallel with this study, a survey of university professors, used to characterize the universe of evaluators, showed that only a minority (25%) used traditional information sources as their first approach when seeking information. Nevertheless, the survey also made clear that reliance on those sources was considerably greater than reliance on information obtained through Wikipedia or other nontraditional sources.
This perception of quality, set against the diametrically opposed results of the blind assessment, reinforces the impartiality of the evaluating panel. The following graph shows the reliance on information obtained through alternative sources compared with classical ones:
However representative the chosen sample is of the universe under study, the results depend on the evaluators’ personal opinions and chosen criteria. This means that the reproducibility of this study’s conclusions with a different grading panel cannot be guaranteed. Nevertheless, this is not enough reason to reject results obtained through more than five hundred evaluations.
One explanation for the success of Wikipedia can be found in the altruism of individuals who come together around themes they share a passion for, and who can be expected to have remarkable knowledge of them. In other words, in the case studied, mass collaboration seems self-organized, leading to self-assessment and self-correction among peers, which produces impressive results against all odds.

What if information skills become a mass amateur activity?



12 Aug 20:27

Congressmen Make Transparent Attempt to Bog Down SpaceX With Transparency Request - "You can't take the sky from me!" -Elon Musk, probably.

by Dan Van Winkle


We’ve been pretty excited about SpaceX lately with their Falcon 9 rocket launches and landings, the Dragon V2 crew module, and their new spaceport in Texas, but some members of congress don’t share our enthusiasm. Three members of congress in particular are trying to put unnecessary road blocks in front of SpaceX for their own interests. Maybe no one told them that where SpaceX is going, they don’t need roads?

Metaphorical roadblocks could still cause the company problems, though, as House of Representatives members Mike Coffman (R-Colo.), Mo Brooks (R-Ala.), and Cory Gardner (R-Colo.) have sent a memo to NASA asking that they investigate an “epidemic of anomalies” in SpaceX missions and make a full report to congress. They say they’re concerned about SpaceX wasting taxpayer money on developing technology that keeps having problems.

However, as Phil Plait and SpaceNews have pointed out, Elon Musk has already said that SpaceX didn’t use any of its NASA funding dollars for the Falcon 9. Any funding it did receive on other projects was only supplemental and came through the Commercial Orbital Transportation Services program, created under the Space Act to coordinate private companies delivering cargo and crew to the ISS.

SpaceX has successfully delivered cargo to the ISS several times now, their Falcon 9 rockets don’t have any more technical problems than any comparably sophisticated technology, and their Dragon V2 crew module looks nothing short of amazing, so what gives? Unfortunately, it seems that the House members are trying to cause trouble with SpaceX to defend the interests of competitor United Launch Alliance, which has its HQ in the home state of Gardner and Coffman—right in Coffman’s own district.

Add to that ULA and Space Launch System partner Boeing’s plant in Alabama, and this call for transparency starts to look a lot more like a defense of individual politicians’ interests. There’s nothing inherently wrong with transparency, but in this case, I won’t be a bit surprised when these same representatives pick over every little detail to bog down SpaceX and make them look bad, as they already have by calling what are, in the grand scheme of the company’s success, minor and expected kinks an “epidemic of anomalies.”

At least they’re honest about their love of transparency—it’s pretty easy to see right through them.

(via Slate, image via SpaceX)


12 Aug 20:30

The downside of high school science requirements: More dropouts

by John Timmer

Language and math have always been part of the core public school experience in the US; science, by contrast, has often been considered an optional topic. But the combination of a push for greater standards and a recognition of science's increasing role in our high-tech economy has resulted in the adoption of science requirements by many states. Now, an analysis of US census data suggests that the increased push for science may have a negative effect: an increase in the dropout rate in states that have adopted science requirements.

This isn't to say that science is bad for students. "That there is positive impact of rigorous coursework when chosen by students is not controversial," researchers based at the Washington University School of Medicine wrote in a recent study, "but there has been ongoing debate over the effects of requiring a more difficult high school curriculum for everyone."

The authors relied on data obtained by the US Census Bureau, through the actual census and annual surveys the Bureau performs. (For data junkies, it's worth noting that all of the data is publicly available from the Integrated Public Use Microdata Series website.)

12 Aug 18:55

Calgary seismic firm alleges its IP rights were violated over improper data release


Lauren Krugel, The Canadian Press
Published Sunday, January 12, 2014 7:49AM EST

CALGARY -- A Calgary-based seismic company claims its business is being undercut by regulatory agencies that have been publicly disclosing data it says it spent hundreds of millions of dollars to collect in Canada's offshore.

Geophysical Service Inc. makes -- or rather made -- its money by mapping the earth's layers beneath the ocean floor using sonogram-like technology, amassing a trove of valuable information that it would sell to oil and gas companies hunting for their next big find.

In a bevy of lawsuits, GSI says would-be customers have been getting hold of that data for free after files it submitted as part of the regulatory process -- a required step in obtaining a permit to do the mapping -- were improperly released.

GSI claims its intellectual property rights have been violated and its assets have been expropriated without compensation, rendering its data all but worthless.

The allegations have not been proven in court.

In its heyday, GSI was a 250-person operation that had a hand in some of Atlantic Canada's biggest offshore oil and gas discoveries. Through its predecessor companies, its history dates back to the 1930s.

Today its workforce can be counted on one hand. GSI doesn't do any seismic work anymore; it was forced to sell its two ships and hasn't booked revenues since 2009.

Instead of collecting and processing data, chief operating officer Paul Einarsson figures GSI spends 95 per cent of its time fighting its long-running and multi-fronted legal war against regulatory agencies, federal departments and its own one-time customers.

"I do this 10 hours a day," said Einarsson. "This is all I do."

Einarsson pegs the replacement cost of GSI data at $800 million to $900 million.

GSI has more than 30 lawsuits on the go, with more expected to be filed in the coming weeks. Some of the most high-profile cases are expected to be heard in court throughout 2014.

A decision in a case against the Canada-Nova Scotia Offshore Petroleum Board is expected soon and a case against the Canada-Newfoundland and Labrador Offshore Petroleum Board is expected to go to court later this year.

Other defendants include the National Energy Board, which oversees oil and gas activity in the North, various federal government departments and a slew of oil and gas explorers, large and small, that GSI says have accessed its data without paying for it.

Late last year, GSI filed a more than $170-million suit against U.S. energy heavyweight ExxonMobil and its Canadian subsidiary, Imperial Oil Ltd. (TSX:IMO), alleging "breaches of contract" and copyright infringement. Imperial spokesman Pius Rolheiser confirmed both his company and Exxon had been served, but declined to further comment as the matter is before the courts.

In an effort to modernize in the early 2000s, regulators began asking for digital versions of the data, rather than hard copies. One of their aims, Einarsson said, was to attract more oil and gas investment to the offshore -- something he describes as tantamount to "welfare" for big oil and gas explorers.

"It really meant a huge expansion of the disclosure of the data," said Einarsson.

GSI has been forced to rely on Access-to-Information requests to find out if data is being improperly disclosed by the regulators -- a slow and frustrating process for Einarsson and the company's remaining staff.

The boards are to keep non-exclusive, or speculative, data, which GSI collects in the hopes it can license it out, confidential for 10 years after it's submitted. But GSI argues its data should never be given away.

The guidelines do not set out what is supposed to happen once the confidentiality period is up, Einarsson said.

"It doesn't mandate disclosure. It doesn't override copyright. It doesn't override federal laws. It doesn't override our ownership laws. It just says nothing," said Einarsson.

The first major case began in the Supreme Court of Nova Scotia in November and Einarsson is expecting a decision this month or next.

In that case, GSI was not seeking damages against the Canada-Nova Scotia Offshore Petroleum Board and provincial government, but instead asked for a declaratory judgment that the law doesn't authorize the regulator to demand the data in the first place.

It also sought a permanent injunction "enjoining it from disclosing or otherwise making available to any person whatsoever and in any manner or format whatsoever the complete records, or alternatively the electronic records."

Kathleen Funke, a spokeswoman for the Canada-Nova Scotia Offshore Petroleum Board, declined to comment, as the matter is before the courts, "other than to say that we believe we have followed proper guidelines as set forth by the Accord Acts and our own geological and geophysical guidelines."

The National Energy Board, Canada-Newfoundland and Labrador Offshore Petroleum Board and the federal department of Natural Resources declined to comment as the matter is before the courts.

In a statement of defence filed in the Alberta Court of Queen's Bench this summer by the NEB, the federal regulator argues that the data is not protected by copyright, denying the material was an "original work" and that "skill, judgement or labour" were applied. The NEB also argues it did not benefit financially from the data and "acted in the public interest."

Teresa Scassa, Canada research chair in information law at the University of Ottawa, said copyright can be a "fussy thing" when it comes to data. The information itself may not be original, but the way in which it's compiled or arranged may be.

She said governments are not supposed to make public confidential information that's been provided by a private-sector company as part of a regulatory process. But the boards could argue there's an overriding public interest in making GSI's data public or that its data isn't sufficiently original to be protected under copyright.

"I wouldn't say either argument is a slam dunk," she said.

The GSI case, she said "will raise some very interesting issues that will have implications far outside of that context."

With revenues having dried up and legal bills mounting, Einarsson says he'll keep up the fight as long as he possibly can.

It's not about saving his company; "I'm done," he says.

"It's the principle," he said. "I want to see it through because I want justice to be done. I don't want somebody else to have to do this. I want to teach these guys a lesson that they can't do this to Canadians."

12 Aug 16:06

The Next Big Thing You Missed: A Startup’s Plan to End Health Insurance Tyranny, With Slick Tools

Insurance companies are notoriously unloved. But few of us hold as much contempt for the industry as Ali Diab.






12 Aug 16:35

UC Riverside's world-class science fiction library under threat

by Cory Doctorow

Science fiction author Nalo Hopkinson, a professor at UC Riverside, sounds the alarm about a change in management at the Eaton Science Fiction Collection, the largest publicly accessible science fiction and fantasy collection in the world.

05 Aug 19:31

'Dead Zone' in the Gulf of Mexico is the Size of Connecticut

Scientists say a man-made "dead zone" in the Gulf of Mexico is as big as the state of Connecticut. The zone, which at about 5,000 square miles (13,000 sq km) is the second largest in the...

-- Read more on ScientificAmerican.com
04 Aug 08:00

Ray Bradbury: “I Am Not Afraid of Robots. I Am Afraid of People” (1974)

by Josh Jones

BradburyRobotLetter

Anyone remember Michael Crichton’s Westworld (or the Simpsons parody)? In this dystopian 1973 sci-fi film, tourists visit a triumvirate of fantasy theme parks staffed by robotic historical re-enactors: Roman World, Medieval World, and the titular West World, with its “lawless violence on the American Frontier.” When a virus infects the parks’ androids, Richard Benjamin must fight a ruthless robot gunslinger—played by a stone-faced Yul Brynner—to the death. The film may look laughably dated, but the fears it taps into are anything but: 2001, Terminator, Battlestar Galactica, I, Robot, and even a Westworld remake in the works—the perennial theme of man vs. machine, as old in film at least as Fritz Lang’s silent Metropolis, becomes ever more relevant in our drone-haunted world.

But are evil—or at least dangerously malfunctioning—robots something we should legitimately fear? Not according to visionary sci-fi author and Disney enthusiast Ray Bradbury in a letter to English writer Brian Sibley, penned in 1974, one year after the release of theme-park horror Westworld. The main body of Bradbury’s letter consists of a vigorous defense of Walt Disney and Disneyland, against whom “most of the other architects of the modern world were asses and fools.” Sibley recalls that his initial letter “expressed doubts about Disney’s use of Audio-Animatronic creations in Disneyland.” “At the time,” he explains, “I… had probably read too many sci-fi stories about the danger of robots taking over our human world—including, of course, some by Ray—and so saw it as a sinister rather than benign experiment.”

After his praise of Disney, Bradbury writes two agitated postscripts exploding what Sibley calls “ill-informed and prejudiced views” on robots.  He classes automated entities with benign “extensions of people” like books, film projectors, cars, and presumably all other forms of technology. Notwithstanding the fact that books cannot actually wield weapons and kill people, Bradbury makes an interesting argument about fears of robots as akin to those that lead to censorship and enforced ignorance. But Bradbury’s counterclaim sounds a misanthropic note that nonetheless rings true given the salient examples he offers: “I am not afraid of robots,” he states, emphatically, “I am afraid of people, people, people.” He goes on to list just a few of the conflicts in which humans kill humans, religious, racial, nationalist, etc.: “Catholics killing Protestants… whites killing blacks… English killing Irish….” It’s a short sampling that could go on indefinitely. Bradbury strongly implies that the fears we project onto robotic bogeymen are in reality well-grounded fears of each other. People, he suggests, can be monstrous when they don’t “remain human,” and technology—including robots—only assists with the necessary task of “humanizing” us. “Robots?” Bradbury writes, “God, I love them. And I will use them humanely to teach all of the above.” 

Read a transcript of the letter below, courtesy of Letters of Note, and be sure to check out that site’s new book-length collection of fascinating historical correspondence.

June 10, 1974

Dear Brian Sibley:

This will have to be short. Sorry. But I am deep into my screenplay on SOMETHING WICKED THIS WAY COMES and have no secretary, never have had one..so must write all my own letters..200 a week!!!

Disney was a dreamer and a doer..while the rest of us were talking about the future, he built it. The things he taught us at Disneyland about street planning, crowd movement, comfort, humanity, etc, will influence builders, architects, urban planners for the next century. Because of him we will humanize our cities, plan small towns again where we can get in touch with one another again and make democracy work creatively because we will KNOW the people we vote for. He was so far ahead of his time it will take us the next 50 years to catch up. You MUST come to Disneyland and eat your words, swallow your doubts. Most of the other architects of the modern world were asses and fools who talked against Big Brother and then built prisons to put us all up in..our modern environments which stifle and destroy us. Disney the so-called conservative turns out to be Disney the great man of foresight and construction.

Enough. Come here soon. I’ll toss you in the Jungle Ride River and ride you on the train into tomorrow, yesterday, and beyond.

Good luck, and stop judging at such a great distance. You are simply not qualified. Disney was full of errors, paradoxes, mistakes. He was also full of life, beauty, insight. Which speaks for all of us, eh? We are all mysteries of light and dark. There are no true conservatives, liberals, etc, in the world. Only people.

Best,

(Signed, ‘Ray B.’)

P.S. I can’t find that issue of THE NATION, of the NEW REPUBLIC, which ever it was, with my letter in it on Disney. Mainly I said that if Disneyland was good enough for Captain Bligh it was good enough for me. Charles Laughton and his wife took me to Disneyland for my very first visit and our first ride was the Jungle Boat Ride, which Laughton immediately commandeered, jeering at customers going by in other boats! A fantastic romp for me and a hilarious day. What a way to start my association with Disneyland! R.B.

P.S. Can’t resist commenting on your fears of the Disney robots. Why aren’t you afraid of books, then? The fact is, of course, that people have been afraid of books, down through history. They are extensions of people, not people themselves. Any machine, any robot, is the sum total of the ways we use it. Why not knock down all robot camera devices and the means for reproducing the stuff that goes into such devices, things called projectors in theatres? A motion picture projector is a non-humanoid robot which repeats truths which we inject into it. Is it inhuman? Yes. Does it project human truths to humanize us more often than not? Yes.

The excuse could be made that we should burn all books because some books are dreadful.

We should mash all cars because some cars get in accidents because of the people driving them.

We should burn down all the theatres in the world because some films are trash, drivel.

So it is finally with the robots you say you fear. Why fear something? Why not create with it? Why not build robot teachers to help out in schools where teaching certain subjects is a bore for EVERYONE? Why not have Plato sitting in your Greek Class answering jolly questions about his Republic? I would love to experiment with that. I am not afraid of robots. I am afraid of people, people, people. I want them to remain human. I can help keep them human with the wise and lovely use of books, films, robots, and my own mind, hands, and heart.

I am afraid of Catholics killing Protestants and vice versa.

I am afraid of whites killing blacks and vice versa.

I am afraid of English killing Irish and vice versa.

I am afraid of young killing old and vice versa.

I am afraid of Communists killing Capitalists and vice versa.

But…robots? God, I love them. I will use them humanely to teach all of the above. My voice will speak out of them, and it will be a damned nice voice.

Best, R.B.

via Letters of Note

Related Content:

Ray Bradbury: Literature is the Safety Valve of Civilization

The Secret of Life and Love, According to Ray Bradbury (1968)

Isaac Asimov Explains His Three Laws of Robots

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Ray Bradbury: “I Am Not Afraid of Robots. I Am Afraid of People” (1974) is a post from: Open Culture. Follow us on Facebook, Twitter, and Google Plus, or get our Daily Email. And don't miss our big collections of Free Online Courses, Free Online Movies, Free eBooks, Free Audio Books, Free Foreign Language Lessons, and MOOCs.


04 Aug 16:51

Maze exhibition in D.C.

by David Pescovitz

Over at Smithsonian, a short piece about the National Building Museum's giant maze installation (construction video above) with a brief history of labyrinths.

04 Aug 18:45

SpaceX chooses Texas for first commercial launch facility

by Paul O'Donnell
SpaceX, the space exploration company founded by billionaire Elon Musk, will build the world's first private commercial rocket launch facility on a remote beach in South Texas. Gov. Rick Perry announced an incentive package totaling more than $15 million Monday to close the deal that will pump $85 million in capital investment to the Brownsville area. It'll also add 300 jobs in one of the nation's poorest metropolitan areas. "Texas has been on the forefront of our nation's space exploration efforts…
30 Jul 16:30

Too little too late: maintaining the value of geographic information

by Geoff

Sanjay Kumar of Geospatial Media, in a recent blog, highlights a problem that has been a festering sore in many sectors for years: maintaining geographic data. Sanjay was visiting the newly built headquarters of a large company (I assume Trimble) in Denver last week. Trimble has just completed a new headquarters building using its own geospatially enabled design and construction technology (aka eating its own dog food), ranging from surveying, laser scanning and machine control to building information modelling (BIM). This resulted not only in an efficient construction process, saving time and cost, but also provided enterprise-wide information about the building's assets and facilities for maintenance and safety functions.

Sanjay contrasted this forward-looking use of geospatial data and technology with the experience of a person invited to visit the new HQ building who was denied a visa because the US immigration agency's GIS was out of date and indicated there was no building at the address of Trimble's new HQ.

Based on his recent experience, Sanjay goes on to generalize about the 'too little too late' approach to harnessing and utilising the value of geospatial information and tools that characterizes too many geospatially enabled programs. Despite having gone through commoditization and industrialization via initiatives like Google Earth and Bing Maps, and despite spatially enabling industrial workflows in engineering and construction, mechanized agriculture, emergency services management, and critical defence and national security applications, geospatial technology gets attention too late in enterprises and governments, and too little effort is made to create high-quality, frequently updated geospatial information.

As I have blogged on numerous occasions, primarily in the context of utility and communications infrastructure, geospatial data quality, including currency, is a huge problem in many sectors of the world economy.

At last year's ESRI User Conference, Sam Pitroda, adviser to the previous Prime Minister of India, offered a vision of IT enabling a better future for India's 400 million people living below the poverty line.  A key part of that vision is creating "a nationwide platform for GIS to [geo]tag every physical asset.  With this, we have platform for cyber security, lots of government and public service applications."

That is an incredible vision, but if we aren't able to maintain the currency of the information including location of our infrastructure, geotagging all of our assets will simply exacerbate the existing problem of maintaining geographic data. 

Data is perishable, just like meat, vegetables or fruit.  The business problem is finding the equivalent of refrigeration, so that geographic data maintains its value.  A solution is to treat information about assets as at least as valuable as the assets themselves.  For example, I remember visiting a Telefonica Sao Paulo engineering facility several years ago.  All the design and construction work at TelefonicaSP is done by outside contractors.  My first question was: how do you ensure that as-builts are reported accurately and get into the asset database in a timely fashion?  The answer is very simple: contractors don't get paid until the as-builts are in the database and have been verified.  Heathrow is implementing a software system right now based on a similar business process, but one aimed at automatically ensuring as-builts are reported accurately, consistently and in a timely fashion by contractors.  As we move toward a new way of planning, designing, building, operating and maintaining infrastructure, this type of solution for maintaining the currency of geolocation data is going to have to become standard in the construction industry.

If we don't treat geospatial data as a valuable but perishable commodity then, as Sanjay concludes, geospatial will remain a highly under-valued and under-utilized technology relative to its potential and offerings.

05 Aug 19:38

Syfy & Wolfgang Petersen Bringing John Scalzi’s Old Man’s War To TV

by David Wharton
Continuing the trend of new Syfy shows we actually want to watch.
04 Aug 22:29

Newswire: Max Landis is turning Douglas Adams’ Dirk Gently books into a TV series

by William Hughes

Chronicle scribe Max Landis is teaming up with IDW Publishing to create a TV series based on Douglas Adams’ Dirk Gently series of humorous sci-fi novels—all as part of comics publisher IDW’s plans, announced last year, to follow Marvel’s lead into TV production. As revealed by The Hollywood Reporter, Landis is attached to the project as an executive producer and is also set to write the pilot. He expressed enthusiasm for the character, originated by Adams in 1987’s Dirk Gently’s Holistic Detective Agency, saying, “This is a dream project of mine.”

Gently (real name Svlad Cjelli, aka Dirk Cjelli) is a sort of affable con man who styles himself as a holistic detective, i.e., one who solves cases by investigating not just the mystery, but the society that surrounds the mystery. He usually accomplishes this by frittering the day away doing nothing much at all ...

04 Aug 04:45

BEHOLDING DÜRER

by Brooks Riley


Personal experiences of art should not be foisted on others except in small doses, given that words can only provide semantic guideposts to such an experience. That's why I never wanted to write a companion piece to my earlier one‚ Holding Albrecht. But recently I found myself longing to see Albrecht Dürer's Paumgartner Altar again, which was nearly destroyed by an acid attack in 1988, removing it from view for over twenty years. After my earlier epiphany at the Staatliche Graphische Sammlung in Munich, holding and beholding the Dürer engravings up close in an empty room, with all the time in the world to delight in their intricate wit and daunting craftsmanship, I felt uneasy as I slouched over to the grandiose Alte Pinakothek, shouldering a dread of crowds, dread of the official museum-going experience, dread that my memory of Dürer's paintings might have let me down.

It was one of those cold spells in May, some of which have names. Not the Eisheiligen of mid-May (five saint days of chill), and too soon for the Schafskälte of early June, this was just a no-name dreary day. I would be visiting old friends, not just the paintings themselves, but also the faces in those paintings. If you live in Germany, you see Dürer's faces everywhere, the genetic variances of a Volk, still in circulation 500 years later. Just look at Oswolt Krel, a young businessman from Lindau. His eyes have darted to the left, his face a mask of worry over some transaction gone wrong. Is it 1499 or 2008?

The Alte Pinakothek is not my favorite museum. It exudes all the charm of a mausoleum, with its hundreds of cubic meters of useless space. In spite of being closed for renovation for four years in the Nineties, the museum failed to replace the tacky plastic strips glued to the bottom of every picture frame, identifying the painting only for those limber enough to bend over the rope barrier to read them. Such cheap labeling is unworthy of a museum which is otherwise world-class. But okay, I'm not here for the labels, I'm here to see the paintings. That too seems to be a problem on a cloudy day. The room featuring the two paintings I most want to see is lit by a skylight far too high to provide enough natural light to see them properly. Sadder but wiser, the museum has also placed ropes in front of every painting, not that a determined demented destroyer would ever be deterred by a textile barrier.

Now for the good news. The Paumgartner Altar has been beautifully restored, well worth the 20-year wait to see it again. My fear that it might be too colorful (a problem with many restorations) was laid to rest. As with so much of Dürer, this triptych is full of oddities, especially the center panel. Dürer seems to be working with his own brand of magic realism, deliberately defying the laws of perspective he knows so well, pumping the religious myth by dwarfing the mortals: not the ones who are naturally diminished by distance, but the ones in the foreground who are on a level with the Madonna and child—a depiction that suggests ‘toys-R-us'. (They so offended an Elector of Bavaria that he had them deleted in the 17th century.)

At the back of the courtyard, a wooden cross haphazardly balanced on the ruin of an arch eerily foretells the holy infant's eventual fate. Under the arch, two men converse, ignoring the nativity in the foreground. Behind them, the homey hillsides in the distance, framed by the arch, remind me of Jakob Wassermann's biography of another famous Nuremberger, Kaspar Hauser. Dürer would have loved Kaspar's reaching out to ‘touch' the landscape outside his window in the tower where he was first housed, driven by his fundamental ignorance of distance and perspective, the side-effects of formative years of incarceration in the dark.

This nativity scene seems almost post-modern: Deconstructed elements of the story are present but do not add up to one—it's more Neo Rauch than Northern Renaissance. No one is looking at the baby except its mother (with what looks like horror). The wee pious group below (all Paumgartners) seem to be looking in its direction, but it's doubtful they can see anything with all the cherubs flocking around the baby. Whatever Dürer was after, the center panel cares more about geometry and solids, location and dislocation, distraction and indifference, distance and proximity, incident and discourse than it does about its holy subject: It's an intriguing mess that raises more questions than it answers.

As odd as it is, nothing about this triptych's center panel can compete with the two Paumgartner brothers left and right of it, about whom I wrote in my earlier piece. In spite of their sartorial splendor and a dead dragonette, they look like who they are, wealthy burghers of Nuremberg with other things on their minds. They are the cinema verité for whom Dürer cares the most. They are bigger than life but not idealized. No fantasy landscapes inflate their origins (although said Elector had them added in the 17th century). Without a context, the two come across as recognizable human beings, with clear fraternal differences that could tell tales across the centuries.

It's this absence of context that makes Dürer's self-portrait from 1500 so powerful. It seems to be saying: ‘Do not look at where I am, look at who I am.' He painted it at the age of 28, the year he officially achieved manhood (a late Bar Mitzvah by today's standards). His maturity is evident, especially compared to the self-portrait two years earlier, in which a foppish young narcissist dresses up as a wannabe Casanova complete with Italian landscape outside the window.

When I was young, I thought the Dürer of 1500 was looking at me. Now I know he's asking me to look at him (You talkin' to me?). ‘I am a man now. See me as I really am.' Dürer was already famous and successful. Until 1500 that fame had probably gone to his head. But now he's moved on, he's no longer the rock star. The Dürer of 1500 is the one who will write 4 books on geometry, another 4 books on the human figure, the one who will correspond with Erasmus, Raphael and probably Piero della Francesca, the one who will lean in to Luther, the one who will write a thesis on aesthetics glorifying the genius of the moment over the painstaking failure of effort.

The man who invented the logo will make fun of his own invention. The man who loves geometry and math (his books were the first books on mathematics to be written in German instead of Latin) will lampoon his obsessions in the bewildered gaze of a young Melencolia. The man who looks like a wise young hippie of my generation will document his travels in breathtaking watercolors no patron would deign to commission. The man with the acid pen will draw a rhinoceros from someone else's description of it.

It is dark in the Alte Pinakothek on this dreary day, dimming the radiance I know is there. A painting can convey so much more than its visual vocabulary might suggest. As I strain to hold the gaze directed at me from Dürer's self-portrait, I know what he's thinking. I nod back.

05 Aug 06:00

Hellboy Hamlet's Not the Weirdest Take, But It Is the Most Awesome

by Katharine Trendacosta


"Suffer the slings and arrows of outrageous fortune"? Yeah, that sounds about right.









05 Aug 09:51

NASA's MESSENGER spacecraft's 10 years in space

Ten years ago, on August 3, 2004, NASA's MESSENGER (MErcury Surface, Space ENvironment, GEochemistry, and Ranging) spacecraft blasted off from Cape Canaveral, Florida, for a risky mission that would take the small satellite dangerously close to Mercury's surface, paving the way for an ambitious study of the planet closest to the Sun.
23 Jul 16:13

Whither technical books?

by Matt Hall

Leafing through our pile of new books on seismic analysis got me thinking about technical books and the future of technical publishing. In particular:

  • Why are these books so expensive? 
  • When will we start to see reproducibility?
  • Does all this stuff just belong on the web?

Why so expensive?

Should technical books really cost several times what ordinary books cost? Professors often ask us for discounts for modelr, our $9/mo seismic modeling tool. Students pay 10% of what pros pay in our geocomputing course. Yet academic books cost three times what consumer books cost. I know it's a volume game — but you're not going to sell many books at $100 a go! And unlike consumer books, technical authors usually don't make any money — a star writer may score 6% of net sales... once 500 books have been sold (see Handbook for Academic Authors).
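To make the economics concrete, here is a sketch of one plausible reading of that deal. Only the 6% rate and the 500-copy threshold come from the Handbook for Academic Authors figure quoted above; the `royalty` function, the price, and the sales number are invented for illustration, and real contracts vary:

```python
def royalty(copies_sold, net_price, rate=0.06, threshold=500):
    """Illustrative academic-press deal: the author earns nothing until
    `threshold` copies have sold, then `rate` of net receipts on the rest."""
    return rate * net_price * max(0, copies_sold - threshold)

# A $100 technical book that sells a healthy 1,200 copies:
earnings = royalty(1200, 100.0)  # 0.06 * 100 * 700 = 4200.0
```

Even on these optimistic assumptions, that's a few thousand dollars for a year or more of writing, which is rather the point.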

Where's the reproducibility?

Compared to the amazing level of reproducibility we saw at SciPy — where the code to reproduce virtually every tutorial, talk, and poster was downloadable — books are still rather black box. For example, the figures are often drafted, not generated. A notable (but incomplete) exception is Chris Liner's fantastic (but ridiculously expensive) volume, Elements of 3D Seismology, in which most of the figures seem to have been generated by Mathematica. The crucial final step is to share the code that generated them, and he's exploring this in recent blog posts.
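What "generated, not drafted" means in practice is that the figure is the output of a short script you can ship with the book. A minimal sketch, using a Ricker wavelet as stand-in content (the parameters are arbitrary, and the plotting step is only indicated in comments):

```python
import numpy as np

def ricker(f=25.0, length=0.128, dt=0.001):
    """Ricker wavelet with peak frequency f (Hz), sampled every dt seconds."""
    t = np.arange(-length / 2, length / 2 + dt, dt)
    a = (np.pi * f * t) ** 2
    return t, (1.0 - 2.0 * a) * np.exp(-a)

t, w = ricker()
# Shipping this script alongside the figure makes it reproducible, e.g.:
#   import matplotlib.pyplot as plt
#   plt.plot(t, w); plt.savefig("ricker_25hz.png")
```

Change `f` and re-run, and the published figure changes with it; that is the whole argument for generated figures.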

I can think of three examples of more reproducible geophysics in print:

  1. Gary Mavko has shared a lot of MATLAB code associated with Quantitative Seismic Interpretation and The Rock Physics Handbook. The code to reproduce the figures is not provided, and MATLAB is not really open, but it's a start.
  2. William Ashcroft's excellent book, A Petroleum Geologist's Guide to Seismic Reflection contains (proprietary, Windows only) code on a CD, so you could in theory make some of the figures yourself. But it wouldn't be easy.
  3. The series of tutorials I'm coordinating for The Leading Edge has, so far, included all code to reproduce figures, exclusively written in open languages and using open or synthetic data. Kudos to SEG!

Will the web win?

None of this comes close to Sergey Fomel's brand of fully reproducible geophysics. He is a true pioneer in this space, up there with Jon Claerbout. (You should definitely read his blog!). One thing he's been experimenting with is 'live' reproducible documents in the cloud. If we don't see an easy way to publish live, interactive notebooks in the cloud this year, we'll see them next year for sure.

So imagine being able to read a technical document, a textbook say, with all the usual features you get online — links, hover-over, clickable images, etc. But then add the ability to not only see the code that produced each figure, but to edit and re-run that code. Or add slider widgets for parameters — "What happens to the gather if I change Poisson's ratio?" Now, since you're on the web, you can share your modification with your colleagues, or the world.
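That Poisson's ratio experiment is easy to sketch even without widgets. Here is the two-term Shuey (1985) approximation in plain Python; the rock properties are invented for illustration, and in a live document theta and the Poisson's ratios would be the slider parameters:

```python
import math

def shuey_two_term(vp1, vp2, rho1, rho2, sigma1, sigma2, theta_deg):
    """Two-term Shuey (1985) approximation: R(theta) ~ R0 + G sin^2(theta)."""
    dvp, vp = vp2 - vp1, 0.5 * (vp1 + vp2)
    drho, rho = rho2 - rho1, 0.5 * (rho1 + rho2)
    dsig, sig = sigma2 - sigma1, 0.5 * (sigma1 + sigma2)
    r0 = 0.5 * (dvp / vp + drho / rho)            # normal-incidence reflectivity
    b = (dvp / vp) / (dvp / vp + drho / rho)
    a0 = b - 2.0 * (1.0 + b) * (1.0 - 2.0 * sig) / (1.0 - sig)
    g = a0 * r0 + dsig / (1.0 - sig) ** 2         # AVO gradient
    return r0 + g * math.sin(math.radians(theta_deg)) ** 2

# Shale over gas sand, with made-up but plausible properties:
base = shuey_two_term(2800, 3000, 2.40, 2.20, 0.30, 0.20, 30)
# Same interface, but raise the lower layer's Poisson's ratio to 0.28;
# the gradient term responds immediately, lifting the 30-degree amplitude:
alt = shuey_two_term(2800, 3000, 2.40, 2.20, 0.30, 0.28, 30)
```

Re-running with a different sigma2 is exactly the kind of "what if" a live notebook would make a one-slider operation.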

Now that's a book I'd be glad to pay double for.

Some questions for you

We'd love to know what you think of technical books. Leave a comment below, or get in touch.

  • Do you purchase technical books regularly? What prompts you to buy a book?
  • What book keeps getting pulled off your shelf, and which ones collect dust?
  • What's missing from the current offerings? Workflows, regional studies, atlases,...?
  • Would you rather just consume everything online? Do you care about reproducibility?

400 posts

The last post was our 400th on this blog. At an average of 500 words, that's about 200,000 words since we started at the end of 2010. Enough for a decent-sized novel, but slightly less likely to win a Pulitzer. In that time, according to Google, almost exactly 100,000 individuals have stopped by agilegeoscience.com — most of them lots of times — thank you readers for keeping us going! The most popular posts: Shale vs tight, Rock physics cheatsheet, and Well tie workflow. We hope you enjoy reading at least half as much as we enjoy writing.

23 Jul 14:02

Updating how we teach the process of science

by Laura Guertin
This week, I’m with a group of in-service middle school teachers for a week-long workshop on climate science.  It is always inspiring to connect with K-12 teachers to see and hear about their passion for their classrooms and for teaching – and always frustrating to hear that they have to do so with so few resources that include outdated textbooks.  And in those outdated textbooks we will find the “scientific …
23 Jul 12:30

Fieldwork should be safe and welcoming for all. Currently, it’s not.

by Chris Rowan

How prevalent is sexual harassment and assault during fieldwork? A paper in PLOS One that grapples with this question is getting some justified attention in the press and online at the moment, and the answer is of concern to anyone who works in a field-based science like geology. In the paper, Kate Clancy and coauthors Katie Hinde, Robin Nelson and Julienne Rutherford present ‘the first systematic investigation of field site work environment and experiences, particularly as they relate to sexual harassment and assault’. The results, from interviews of more than 600 people working in 32 different field-based disciplines[1], are not pretty:

  • 64% of all survey respondents stated that they had personally experienced sexual harassment.
  • Over 20% of respondents reported that they had personally experienced sexual assault.
  • The primary targets of harassment and assault were women trainees[2], and they were predominantly targeted by their male superiors.
  • Few respondents were aware of mechanisms to report incidents, and most who did report were unsatisfied with the outcome.

This is a difficult subject to get totally accurate data on, and the authors acknowledge there are risks of some self-selection in a study based on a voluntary survey (although this cuts both ways: there could be as many or more people choosing not to participate because their experiences were too traumatic as were motivated to participate because of their experiences). But really, the absolute frequencies are unimportant. They are clearly a long way from being zero, and even one incident of harassment or assault is one too many. Also, whilst this might be the first attempt to systematically study this problem, there is a certain lack of surprise at the outcome which is almost as terrible as the numbers themselves.

@Allochthonous Disturbing, but completely unsurprising.

— Kim Hannula (@stressrelated) July 17, 2014

This is a serious issue. The very nature of the geosciences, where time in the field is still an essential part of earning your undergraduate degree, and many graduate degrees and later research are fundamentally tied to field work, means that career progression requires women to enter and persevere in environments where they are at measurable risk of experiencing harassment – or worse. If we wish to undo the highly-skewed-to-white-male post-PhD gender ratio in the geosciences (which is measurably worse than in many other sciences), we are not exactly presenting the safe and welcoming culture that will make it happen.

So what to do? Kate Clancy has previously listed some suggested actions, which boil down to:

  • Making sure that you know your institution’s sexual harassment policy and reporting mechanism, and ensuring that your colleagues and students are also aware of it.
  • Investigating the issue of pre-fieldwork training for researchers[3].

This is a good start, but I don’t think it can end there. If our aim is to truly make the climate within our field more friendly to women and other minorities, I don’t think we can sit back and say that because there is a policy and a process for reporting and resolving complaints, we have no other responsibility. Our frontline defence against sexual harassment and assault should not be forcing people to compound the trauma and emotional damage that they have already suffered to bring people to account for their behaviour – with the additional risk of having their reputations and their future prospects in their field damaged, with no guarantee of a good outcome[4]. We can, we must, be more proactive than that. We need to attack this behaviour at its root, by making it clear upfront that it is not acceptable, that a complaint will be taken seriously, and that people who do break the code of conduct will be dealt with harshly. The latter two may require taking an uncomfortable look at our own behaviour and actions: how often are we inclined to turn a blind eye, or brush off troubling behaviour as nothing of consequence? This is a particular risk in the peculiarly intense environment of field trips, which can skew perceptions and standards of acceptable behaviour, whilst simultaneously isolating the vulnerable from their normal support network, and increasing the perceived power differential between students and mentors. Our oversight is dulled, precisely when we should be more vigilant. Perhaps that is something training can effectively address.

I don’t have any easy answers to all this, but what I do know is that, just as anywhere else, women working in the field have a right to feel – and be – safe and secure. Currently they are not, and the onus is on us to change that. A discussion of how to do so is already taking place within anthropology, as evidenced by this study, and in other field-based sciences like ecology. Geoscientists should be part of this important conversation too.


  1. 4% of the respondents identified as geologists. Overall, the survey was dominated by anthropologists, who were the original targets of the survey, and were more likely to have been reached by the authors’ recruitment efforts.

  2. Harassment and assault were documented for both men and women, but women were 3.5 times more likely to report having experienced sexual harassment and more than 5 times as likely to report having been assaulted.

  3. At least in geology, as far as I know, fieldwork risk assessments do not routinely address this particular risk, but there’s no reason it could not be added.

  4. For example. It’s no surprise that many people do not get to the stage of making a complaint in the first place.

24 Jul 20:53

Dropped cell phone calls become rain gauges in West Africa

by abranscombe
A shaky cell phone connection during a rainstorm can be an annoying nuisance. But now scientists are showing that these weakened signals can be used to monitor rainfall in West Africa, a technique that could help cities in the region better prepare for floods and combat weather-related diseases.
21 Jul 06:21

I'm Sorry, This Trench is Full; Those Rocks Will Have to Go Elsewhere

by Garry Hayes
Looking south from Hurricane Ridge into the heart of Olympic National Park
There will be few detailed blogs these next few weeks; I'm on the road leading our Canada/Pacific Northwest field class, and I will be just a bit busy. But I can't help putting up a few photos here and there. In today's pictures we see what happens when subduction zones get out of control, so to speak.

Subduction zones are places where oceanic crust sinks back into the Earth's mantle to be recycled at some future time as magma and lava. The mud and sand that blankets the coast and seafloor often will be scraped off against the edge of the continent to form a highly deformed and sheared deposit called an accretionary wedge. Much of the time, wedge deposits remain underwater or show as low-lying islands, but sometimes the rock gets pushed up into mountain ranges parallel to the coast and subduction zone. California's Coast Ranges resulted in part from such activity, but at Olympic National Park in Washington State, the results are nothing short of spectacular. The mountains have been pushed up into a series of peaks exceeding 7,000 feet in elevation, and with the intense amounts of snowfall, there are a surprising number of active glaciers.
Looking north from Hurricane Ridge across the Juan de Fuca Strait to Vancouver Island
Our trip reconnaissance this week took us to Hurricane Ridge, which is a newcomer to my list of the most incredible places I have ever stood. The view is astounding (when conditions are clear). We could look deep into the heart of the park at Mount Olympus, and could see north across the Strait of Juan de Fuca to Vancouver Island. A marvelous place!
21 Jul 04:40

The War on Terroir

by Dwight Furrow

Few terms in the wine world are more controversial than "terroir", the French word meaning "of the soil". "Terroir" refers to the influence of soil and climate on the wine in your glass. But the meaning of "terroir" is not restricted to a technical discussion of soil structure or the influence of climate. Part of the romance of wine is that it (allegedly) expresses the particular character of a region and perhaps its people as well.

According to some "terroirists", when we drink wine that expresses terroir, we feel connected to a particular plot of land and its unique characteristics, and by extension, its inhabitants, their struggles, achievements, and sensibility. Can't you just feel their spirit coursing through your veins on a wild alcohol ride? The most extreme terroirists claim that the influence of soil and climate can be quite literally tasted in the wine. If this strikes you as a bit of, well, the digested plant food of bovines, to put it politely, you are not alone. Many in the wine business are skeptical about the existence of terroir, claiming that winemakers should make the best wine they can without trying to preserve some mystical connection with the soil. But the issue is an important one, because the reputation of entire wine regions rests on the alleged unique characteristics of their terroir, not to mention the fact that the skill and discernment of wine tasters often involves recognizing these characteristics.

There is confusion, however, regarding what this concept of terroir conveys. Some uses of the term simply imply that wine grapes are influenced by climate and soil so that wines from a region with broadly similar soil types and macroclimates have common characteristics discernable in the wine. This is obviously true and unobjectionable. Factors such as the ability of soil to drain or absorb water, the presence of stones that radiate heat into the vineyard, and the effects of nutrients on plant metabolism are among the important known effects of soil on vineyards. The soil and climate in Bordeaux differs from the soil and climate in Burgundy and thus they grow different grapes and make wines of quite contrasting styles that are apparent in the glass.

But the vociferous defenders of terroir have something more controversial in mind. They claim that the distinctive character of the soil is directly transferred to the glass. As Matt Kramer writes in his book Making Sense of Wine, some wines "... allow us to eavesdrop on the murmurings of the earth." One chardonnay is described as exhibiting "a powerful flavor of the soil: the limestone speaks." This is a much more controversial claim, because Kramer and others seem to be suggesting that there is some discernable likeness between the taste of the wine and the taste of the soil. If this strikes you as disgusting, you are not alone, especially when a wine is reputed to exhibit barnyard notes. They ought to be honest and label their wines Domaine de Poop. In fact, it is usually flavors of chalk, flint, or slate that are reputed to be in the wine, and the view that such a flavor likeness exists is widely held, especially in France, Italy, and Germany.

Science has weighed in on the controversy, and the science seems to be on the side of the anti-terroirists. Science writer and wine expert Jamie Goode, in a series of articles on this topic, writes:

"But I'd exclude from my definition of terroir the concept that soils can directly influence the character of a wine, for instance, by flavour compounds in the soil being directly translocated to the grapes. This sort of mechanism is not impossible, but it does seem to be hugely implausible. I'm not a root physiologist, but I do have a PhD in plant biology, and I've yet to hear a convincing explanation for how soil components can directly alter the flavour of grapes and hence the finished wine... Suffice to say, stony, earthy or mineral flavours in a wine are not necessarily 'terroir' notes. If a wine grown on chalky soil tastes chalky, it's an unjustified leap of faith to say that this is the 'terroir' speaking. The minerals in the soil may be fortuitously imparting chalky notes to the wine indirectly by altering the vine's metabolism, but you'd only be able to tell this by comparing this wine with one made from the same grapes grown on different soils and vinified in a similar way."

Food scientist Harold McGee, writing with Daniel Patterson in the New York Times, agrees:

"What about the flavors of soil and granite and limestone that wine experts describe as minerality - a term oddly missing from most formal treatises on wine flavor? Do they really go straight from the earth to the wine to the discerning palate?

No.

Consider the grapevine growing in the earth. It takes in elemental, inert materials from the planet - air and water and minerals - and, using energy captured from sunlight, turns them into a living, growing organism. It doesn't just accumulate the earth's materials. It transforms them into the sugars, acids, aromas, tannins, pigments and dozens of other molecules that make grapes and wine delicious."

The science suggests that minerals are dissolved in soil moisture which is absorbed by the plant and influences the development of the grape vines. The minerals have no direct influence on flavor.

But here is where this debate gets muddled. Nothing in these referenced articles on the science of terroir denies that the unique characteristics of a region's soil can, with the proper vinification, influence the taste of the wine. In fact, they repeatedly assert the plausibility of an indirect connection between soil and flavor, although the mechanisms have not yet been discovered. As McGee and Patterson write:

"It's possible, then, that soil minerals may affect wine flavor indirectly, by reacting with other grape and yeast substances that produce flavor and tactile sensations, or by altering the production of flavor compounds as the grape matures on the vine."

It seems to me this concedes everything the terroirists want. If the soil uniquely alters the grapes' metabolism in a way that influences flavor in the glass, then the flavor is a result of the soil, and the wine is an expression of those unique properties. The upshot of the science is that flavors in the wine are not copies, imitations, or likenesses of the way these minerals taste in the ground. But why anyone should care about that is beyond me. (I have yet to talk to anyone who makes a habit of tasting rocks. But if you are so inclined, let me know the results.) What should matter to terroirists are the causal influences of unique factors in the soil of particular vineyards on the discernable flavors of wines from those vineyards. If such a connection exists, regardless of how direct it is, their claims to the uniqueness and special value of place ("somewhereness," as Matt Kramer puts it) will have been vindicated. Furthermore, the vindication suffers no loss of romance. For it is still true that centuries of careful husbandry and the matching of grapes and soil by dedicated viticulturalists have produced unique wines of extraordinary beauty that cannot be replicated elsewhere.

On the other hand, why should skeptics deny the possibility of such an indirect connection? Since we are talking about the causal influence of soil constituents on the metabolism of grapes, there is nothing mystical here. The terroirists seem to want more than they need, and the skeptics (at least those whose skepticism is grounded in science) deny more than they must. There is plenty of room for agreement here with no compromise of basic principles.

So I will unilaterally declare a truce in the "terroir wars". Who says philosophy has no influence on worldly events?

For more ruminations on the philosophy of food and wine visit Edible Arts.

21 Jul 19:01

Your TSA Security Fees Just Doubled, But Security Won't Improve

by Zenon Evans

As if flying weren't costly enough, your next plane ticket is going to be more expensive, thanks to the federal government. Today the Transportation Security Administration's (TSA) security fee rose by more than double.

CNN reports:

Until Monday, a passenger was charged $2.50 for each leg of a journey. For a nonstop round trip, the cost was $5. For a round trip with a connection each way, the cost was $10.

The fee was capped at two flights each way. That means you couldn't get charged more than $5 each way or $10 round trip, even if you took three flights to get to your destination.

Now, passengers must pay a flat fee of $5.60 in each direction, no matter how many plane transfers are made to get from one city to another.

For passengers flying a nonstop round trip, that means the fee will increase from $5 to $11.20.

Passengers flying round-trip with a connection each way will see their fees increase $1.20 to $11.20 per round trip, versus $10 before the fee increase.

Domestic flyers will also get hit with an additional $5.60 if you have a layover that's four hours or longer.
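The rule change reported above reduces to a few lines of arithmetic. A sketch, with function names and the encoding of the rules my own (one "long layover" meaning a domestic stop of four hours or more, which triggers an extra charge):

```python
OLD_PER_LEG = 2.50        # pre-increase: $2.50 per flight leg...
OLD_CAP_LEGS = 2          # ...capped at two legs per direction ($5 max each way)
NEW_PER_DIRECTION = 5.60  # flat fee per direction, regardless of connections

def old_fee(legs_per_direction):
    """Fee under the old rules; one entry per direction, giving the leg count."""
    return sum(OLD_PER_LEG * min(legs, OLD_CAP_LEGS) for legs in legs_per_direction)

def new_fee(directions, long_layovers=0):
    """Flat fee per direction; each 4+ hour domestic layover adds another $5.60."""
    return NEW_PER_DIRECTION * (directions + long_layovers)

# Nonstop round trip: $5.00 before, $11.20 now
print(old_fee([1, 1]), new_fee(2))   # 5.0 11.2
# Round trip with one connection each way: $10.00 before, still $11.20 now
print(old_fee([2, 2]), new_fee(2))   # 10.0 11.2
```

This reproduces the article's figures: the biggest proportional hit lands on nonstop round trips (more than doubled), while connecting itineraries rise only $1.20.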

"Business travelers who fly non-stop routes, and travelers in secondary markets requiring connections," suggests Fox News, "will see the biggest impact."

"Due to new TSA fee hike, travelers will pay a billion dollars more per year in added taxes/fees," tweeted Nick Calio, the president and CEO of Airlines for America, an industry advocacy group.

The security agency, which operates with an annual budget of over $7 billion, gets a lot of flak. More than half of Americans believe all those pat-downs and invasive body scans are mere security theater with no real deterrent effect on hijackings. And those skeptics are right: research on the 13-year-old agency shows it so far hasn't had a measurable effect on air travel safety.

The "TSA estimates the hike will generate $16.9 billion more than current collections," explains USA Today. The heftier fee won't actually do much (if anything) to improve security, though. "Congress agreed to the increase in December to raise $12.6 billion to cut the deficit," and nothing will go to security improvement until that's paid. 

Airlines for America criticizes the government for treating "airlines and their passengers as its own personal ATM," though some members of Congress say they never intended for the TSA to charge this much. Senate Budget Committee chair Patty Murray (D-Wash.) has contested that the TSA changed how it defines a "round trip" flight in order to work around the cap Congress placed on the agency's fees.

21 Jul 19:46

Scopes Monkey Trial: Guilty

July 21 is verdict day in the infamous Scopes "Monkey" Trial of 1925. The verdict came in from a jury in Dayton, Tenn., that John Thomas Scopes had committed the crime of teaching evolution to...

-- Read more on ScientificAmerican.com