John Bell, Senior Vice President, Arria NLG, on taming Big Data

3rd November 2014

John Bell has more than 15 years' experience working with oil and gas infrastructure software solutions, with companies including Oracle, Inovageo, EMC, Aspen Technology and SAP Portals UK.

Oil & Gas Technology spoke to John Bell at Arria NLG to discover how advanced Big Data analytics systems can streamline processes and reduce downtime at both upstream and downstream oil and gas operations.

Where is Big Data being produced in the oil and gas industry?

Big Data is being produced everywhere in oil and gas.  For example, we have 70 years of seismic acquisition data, from 2D through to 3D, and are at the beginning of the 4D period. 

Asset data is also increasing exponentially as more and more sensors are added, and when you add regulatory and compliance data into the mix, you already have huge amounts of data.

This will only increase as we move into more complex operating environments, including shale and deep water, not to mention the need for real-time data.  We want information and we want it now, and in order to get the right information, we are acquiring more and more data to give us the answers we want, at the time we want them.

 

What problems does this cause?

This vast increase in the amount of available data can cause more problems than it solves, especially in two areas.  First is the speed of response: we are simply unable to consume that much data without developing the tools to identify, interpret and draw out the information we actually need. 

Second, the sheer volume of data puts an enormous amount of pressure on infrastructure, including bandwidth and cloud storage.

We want faster processors and bigger storage and we need them now to manage the data we already have.

 

Why is there a need for more efficient data analytics?

Traditional analytic methods produce output that requires an army of analysts to make sense of it and turn it into decision support. The data collected from the hundreds of thousands of sensors you might find on a rig may provide you with the information you need to take the next step, but it might still take hours to reach a decision and the resulting instructions.

With what we call articulate analytics, text can be integrated with graphs, tables and any number of output styles, to give a decision based on the data available, with the subsequent instructions required to carry out the process, all within 60 seconds. 

Traditional methods require a minimum of 3–4 hours: valuable time that could be used far more productively.

 

How does NLG work and how is it applied to the oil and gas industry?

NLG – Natural Language Generation – is a sophisticated technology that can take a diverse set of data sources, analyse that data using processes that emulate human reasoning, and then – and this is the really important step – express the results of that analysis and reasoning in natural language, whether that be English, Spanish or Chinese.

It’s a technology that takes data and turns it into actionable plain-language reports providing timely and easily understood decision support. In the oil and gas world, we have proven this technology as applied to exception-based surveillance.  Existing alerting systems trigger alarms that require the engineer to do a deep-dive into the data.  This technology provides a short-cut by automating all the detailed analysis and producing a recommendation that the engineer can choose to accept or refine.
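
To make that concrete, here is a minimal sketch in Python of how an exception-based surveillance shortcut of this kind could look: the alarm's background data is analysed automatically and the result is expressed as a plain-language recommendation. The alert fields, thresholds and wording are invented for illustration; they are not Arria NLG's actual rules, engine or API.

    # Illustrative sketch only: exception-based surveillance producing a
    # generated plain-language recommendation. All names, thresholds and
    # phrasing are hypothetical, not Arria NLG's product.
    from dataclasses import dataclass

    @dataclass
    class Alert:
        asset: str      # the asset that raised the alarm
        tag: str        # the sensor that tripped it
        value: float    # current reading
        limit: float    # configured alarm limit

    def analyse(alert, recent_readings):
        """Emulate the engineer's deep dive: how far over the limit is the
        reading, and is it trending up?"""
        excess = alert.value - alert.limit
        rising = bool(recent_readings) and alert.value > recent_readings[-1]
        severity = "red" if excess > 0.1 * alert.limit else "amber"
        return {"excess": excess, "rising": rising, "severity": severity}

    def narrate(alert, findings):
        """Express the analysis as a recommendation the engineer can accept
        or refine."""
        trend = "rising" if findings["rising"] else "steady or falling"
        action = ("Shut in and inspect immediately."
                  if findings["severity"] == "red"
                  else "Schedule an inspection within the next shift.")
        return (f"{alert.asset}: {alert.tag} is {findings['excess']:.1f} above its "
                f"limit and {trend}; status {findings['severity']}. {action}")

    alert = Alert("Well A-12", "annulus pressure (bar)", 228.0, 200.0)
    print(narrate(alert, analyse(alert, [210.0, 219.0])))

Run as written, this prints a single-sentence recommendation; the point is simply that the detailed analysis and its plain-language expression happen in the same automated step.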

 

Does NLG improve process-system safety?

Yes, in many applications. When dealing with the data generated on, for example, a rig in the Gulf of Mexico, what one mostly finds is standard data types arriving in very large volumes.

Most existing dashboards show current status, from which the user can deduce only the most obvious actions required.  With NLG, on the other hand, it is possible to concurrently monitor and compare all the other assets, and therefore all the potential amber or red light situations.  This accelerated and greatly enhanced analytic capability produces immediate, concise and accurate instructions for acting upon processes and systems as well as for assessing their potential for failure.
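
As a rough sketch of what concurrently monitoring and comparing all the other assets can mean in practice, the loop below classifies every reading against green/amber/red thresholds and prints one concise instruction per exception. The assets, readings, limits and wording are assumptions made for illustration only.

    # Hypothetical traffic-light screening across many assets at once.
    # Asset names, readings and limits are invented for illustration.
    THRESHOLDS = {                      # (amber, red) limits per tag
        "pressure_bar":  (180.0, 200.0),
        "temperature_c": (85.0, 95.0),
    }

    latest = {                          # most recent reading per asset
        "Separator 1": {"pressure_bar": 172.0, "temperature_c": 88.0},
        "Separator 2": {"pressure_bar": 205.0, "temperature_c": 70.0},
        "Export pump": {"pressure_bar": 150.0, "temperature_c": 61.0},
    }

    def status(value, amber, red):
        return "red" if value >= red else "amber" if value >= amber else "green"

    for asset, readings in latest.items():
        for tag, value in readings.items():
            amber, red = THRESHOLDS[tag]
            level = status(value, amber, red)
            if level != "green":
                print(f"{asset}: {tag} at {value} is {level} "
                      f"(amber {amber}, red {red}); act per procedure.")

Every asset is screened on every pass, so amber situations surface before they become red ones, which is the preventative angle described below.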

NLG opens the realm of preventative operations, not just reactive actions to fix something that has already gone wrong.  Not only will this pro-active approach improve process-system safety; it will significantly reduce downtime and therefore increase productivity onsite.

 

Is there a risk that NLG will omit vital data from its analysis?

The simple answer is no.  First, it needs to be remembered that there is a human element to NLG: we absorb into NLG software the engineer’s ability to articulate the problem and provide an answer.  NLG captures in software the subject matter expertise that guides standard, high-consensus procedures for investigating and analysing operator alerts and their background data, resulting in articulate, repeatable and reliable output.

If you consider the high volume of automated data collected on any given day, it is far more likely that relevant information will be missed if you are using only human analysis.  However, combining that automated data with algorithms that implement the knowledge and experience of your subject matter experts means that you are far less likely to miss the vital data required for rapid decision-making in the field.

 

Is it putting too much trust in (artificially intelligent) third-party analysis?

It’s easy to get side-tracked into the question of whether the software really is artificially intelligent. There are tools we use every day that 20 years ago would have been considered artificially intelligent, but today are considered unremarkable. The key to NLG technology is its ability to capture SME knowledge and apply it to data from many and varied sources. The result is expert output that is both immediate and easy to understand, and that carries as much credibility as the expertise it embodies.

Let others debate whether that’s artificial intelligence. In some contexts of use you might want to retain a human in the loop, as with any critical technology chain, and especially in applications lacking a high degree of consensus among engineers about the steps of analysis required to produce actionable results; but there are many places where consensus is high and actionable results are readily obtainable. Take, for example, our NLG work for the UK Met Office. Detailed data and information on weather conditions across the UK have been available to meteorologists for years. The problem with so much data is that there are simply neither enough meteorologists nor enough hours in the day to produce localised, articulately written reports that use all that data. Using NLG, the Met Office is now able to produce 5,000 location-specific, articulately written reports for sites across the country, instead of the 60 per day it produced before.
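
A toy example of that scaling point: once the reasoning and wording are captured in rules, the number of written reports is limited by the data available, not by the number of writers. The forecast values and phrasing below are invented and are not Met Office output.

    # Toy data-to-text forecast generator; data and wording are invented.
    forecasts = {
        "Aberdeen":  {"temp_c": 9,  "wind_mph": 28, "rain_mm": 6.0},
        "Cardiff":   {"temp_c": 13, "wind_mph": 12, "rain_mm": 0.2},
        "Sheffield": {"temp_c": 11, "wind_mph": 18, "rain_mm": 2.5},
    }

    def describe(site, f):
        wind = ("windy" if f["wind_mph"] >= 25
                else "breezy" if f["wind_mph"] >= 15 else "calm")
        rain = ("heavy rain" if f["rain_mm"] >= 5
                else "light showers" if f["rain_mm"] >= 1 else "mostly dry")
        return (f"{site}: {wind} with {rain}, around {f['temp_c']} degrees C "
                f"and winds near {f['wind_mph']} mph.")

    # The same rules run unchanged whether the dictionary holds three sites
    # or five thousand.
    for site, forecast in forecasts.items():
        print(describe(site, forecast))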

 

There’s a need to ‘humanise’ big data analysis for a non-technical audience, but is there a need when the audience is already highly versed in detailed analytics – such as oil and gas engineers?

NLG is not there to ‘dumb down’ data analysis either for a non-technical audience or for experienced engineers, although it can of course produce different styles of reports for different audiences using the same input data.

NLG makes sense of the huge volume of data available and cuts down the time engineers in the field need to spend analysing what needs doing. As a result, engineers find they have more time and information available to them to carry out needed maintenance and repairs, and to stay focused on higher-level or system-wide concerns. Our application of NLG in the oil and gas field takes available data and, rather than just outputting that data in graphs or tables, analyses it and issues recommendations to speed up the actions that flow from that analysis.

For example, most analytic tools could tell me my hard drive is broken, but an NLG application would be able to articulate not only why it is broken but also how to fix it, in terms that I personally can understand. This is where NLG comes into its own. It’s not just automated data analysis: it’s the application of knowledge and experience, articulated in a style that operators can understand at their own level, whether they are non-technical or have years of engineering experience.
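
The same-data, different-audiences point can be sketched with the hard-drive example above. The fault details and both phrasings are hypothetical, invented only to show one finding rendered in two styles.

    # Hypothetical example: one diagnostic finding, two report styles.
    # The fault, cause and fix steps are invented for illustration.
    finding = {
        "component": "drive 3",
        "cause": "repeated write timeouts after a firmware fault",
        "fix": "power-cycle the enclosure, then reflash the drive firmware",
    }

    def report(finding, audience):
        if audience == "engineer":
            return (f"{finding['component']} failed: {finding['cause']}. "
                    f"Recommended action: {finding['fix']}.")
        # Non-technical style: same facts, plainer wording.
        return (f"A storage unit ({finding['component']}) has stopped working "
                f"because of a software problem; restarting it and updating "
                f"its software should fix it.")

    print(report(finding, "engineer"))
    print(report(finding, "non-technical"))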

 

What frontiers of oil and gas exploration and production are opened up by this technology, and how will it save money for operators?

NLG is especially useful in difficult-to-operate environments, for example deep water or the Arctic, or anywhere that might be considered harsh, where minimum manning is an attractive option and rapid decision-making is necessary.

Additionally, moving into frontier exploration, particularly in the emerging nations, means that much of the data already available is in less straightforward formats – documents, spreadsheets, CAD drawings and so on.  NLG can use all of that data, collecting everything that is available and producing output in articulate text with appropriately annotated illustrations.  Any company involved in oil and gas, whatever the region, is already overwhelmed by the amount of data it has. NLG offers a significantly faster and more powerful capability of analysis and communication when it is needed, drawing upon all available data.

NLG also has the ability to synthesise voice output, giving engineers the opportunity for hands-free input while working.

The possibilities for saving money are numerous. Because of the speed of processing and analysing the available data, maintenance can be done before a situation becomes critical. The more sensors a rig has for pressure, temperature, motion and so on, the more awareness can be brought to what can go wrong, and the more NLG can help. Reducing the risk of error and the need for repair in turn reduces downtime and increases productivity.

 

How can NLG be applied to downstream processes?

A refinery is every bit as complex as a rig, and potentially has even more elements with sensors to capture data.  Refineries run to much tighter economic margins, and any gain in productivity could make the difference between profit and loss. 

NLG works in exactly the same way as in the case we talked about earlier, capturing all of the automated data and applying the subject matter expertise of the engineers, yielding reports specifying maintenance and repair needs, as well as variations in volume, pressure and anything else that’s relevant.  Shortening the decision-making time and significantly reducing downtime associated with maintenance and repair will ultimately impact the bottom line, increasing profitability and reliability of supply.

 

To find out more about Arria NLG, please visit: www.arria.com
