
Analytics Magazine

Analytics in the oilfield

March/April 2014

Warren Wilson

Properly deployed, predictive and other forms of advanced analytics can yield crucial insights for exploration and production companies.

By Warren Wilson

Predictive analytics software has become increasingly attractive as it has increased in capability and fallen in price. It is becoming so powerful that many enterprises consider it a must-have technology. As with any investment decision, however, mounting enthusiasm tends to mute the critics and skeptics who may have valid questions about the rationale.

The importance of predictive and other forms of advanced analytics is self-evident. Properly deployed, they can yield crucial insights obtainable in no other way. Accordingly, exploration and production (E&P) companies should develop an analytics strategy as quickly as possible, if they have not already done so.

E&P companies should also be mindful that investments made without clear goals and implementation plans risk wasting critical resources and not achieving the desired results. As E&P companies formulate their plans, they can boost the likelihood of success by taking care to address three basic questions: What business value do you expect to gain, what data is required to realize that value, and which analytics tools are best suited to your goals? Only when those questions have satisfactory answers can an enterprise move forward with confidence.

Many E&P companies have already embraced business intelligence and other analytics tools in their back offices, particularly for financial management and enterprise resource planning. But they are significantly further behind in operations technology. Many are using only rudimentary IT in the operations that define their industry: exploring for oil and gas, developing reserves and managing production for maximum lifetime value.


Drilling data, for example, is routinely gathered in real time so that the rig can be shut down if key measurements such as torque on the drill pipe and bit, or pressure in the mud circulation system, move outside of established limits. But this drilling data often isn’t saved. It is simply discarded, foreclosing any opportunity to look for patterns that could enable earlier problem detection and point the way toward better practices. So the starting point must be to identify gaps in data capture and plug as many of them as possible.
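The limit check described above can be sketched in a few lines. The sketch below is illustrative only: the field names, units and threshold values are assumptions, not taken from any actual rig system. The key point is the last step, which persists every sample rather than discarding it, so patterns can be mined later.

```python
# Hypothetical sketch of a real-time drilling limit check.
# Field names, units and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DrillingSample:
    torque_kNm: float        # torque on the drill pipe and bit
    mud_pressure_kPa: float  # pressure in the mud circulation system

# (low, high) bounds per measurement -- illustrative values only
LIMITS = {
    "torque_kNm": (5.0, 30.0),
    "mud_pressure_kPa": (10000.0, 25000.0),
}

def out_of_limits(sample: DrillingSample) -> list[str]:
    """Return the names of any measurements outside their limits."""
    alarms = []
    for field, (lo, hi) in LIMITS.items():
        if not lo <= getattr(sample, field) <= hi:
            alarms.append(field)
    return alarms

def process(sample: DrillingSample, history: list[DrillingSample]) -> list[str]:
    # Saving every sample, instead of discarding it once the limit check
    # is done, is what makes later pattern analysis possible.
    history.append(sample)
    return out_of_limits(sample)
```

A real system would stream these samples into a historian or time-series store; a plain list stands in for that here.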

The next challenge is to minimize, or at least significantly reduce, data fragmentation. Typically, exploration, development and production departments have maintained separate data repositories, each with its own data types. These repositories may be further fragmented geographically, for example, if companies organize and store data on the basis of national boundaries or oil-producing regions.

Such fragmentation reflects the way IT has typically been adopted: in piecemeal fashion, by local or departmental managers, to address a specific problem. Today, however, the E&P industry (like many others) has become so data-driven that the limitations of piecemeal adoption are all too evident. For one thing, the resulting data fragmentation makes data management less efficient and more expensive than necessary. In addition, it prevents the company from analyzing its data in a comprehensive, unified and forward-looking manner. This, in turn, poses two main problems. One is that fragmentation reduces the value of each type of data – exploration, development and production – individually. The other is that fragmentation makes it impossible to analyze the three types holistically, denying the company the insights that can come only from an integrated approach.

The best path forward can vary considerably from one enterprise to another, because piecemeal adoption means that no two companies start from the same place. One E&P company might be getting suboptimal results from its seismic tests and exploration drilling, but not realize it because it lacks the tools to analyze historical data and identify the factors degrading its results (to say nothing of predictive tools). Another company might be drilling too many or too few development wells, or not siting them correctly – again, due to lack of analytical insight. Yet another company might be deferring too much or too little production because it lacks the predictive analytics capabilities needed to make better deferment decisions. Or it might not have adequate insight into why production in a given well, field or region is declining, and how future production might be optimized using various intervention methods.

In addition to different starting points, E&P companies have unique assets. As a result, the key problem for one company may lie in its seismic exploration methods, while for another the main challenge might be production management. Such differences dictate different strategies with regard to data integration and analytics, leading individual E&P companies toward different vendors and applications.

Furthermore, E&P companies’ exploration, development and production operations historically have operated as separate departments. They support each other, of course, and information is routinely shared among departments. But integrated approaches have been difficult, if not impossible, because the necessary information has typically been fragmented, housed in separate databases that are isolated from one another.

This isolation takes two basic forms. First, similar types of data may be isolated geographically: production data, for example, may be stored in regional databases that cannot talk to each other. Second, data may be stored in different formats, or with different technologies, that prevent holistic analysis.

Still, regardless of different starting points and unique challenges, E&P companies share a common goal: reducing the drag on business performance that stems from the lack of unified data and analytics capabilities. The solution starts with creating a platform for analytics by consolidating and integrating data by type (exploration, development or production) over as broad a geographic area or portion of the company as possible.

The next step is to deploy analytics tools that can optimize the value of historical data in each of the three main categories, while laying the groundwork for real-time and predictive tools that can analyze all three categories holistically.

Ovum primary research shows that E&P and oilfield services companies are taking up this challenge. In its 2013 ICT Enterprise Insights survey, Ovum interviewed more than 400 IT decision-makers in E&P (among more than 6,500 across 17 industries). Asked about their priorities in information management, the respondents said data warehousing and data management/integration technologies are among their top priorities for investment this year. Both technologies are important steps in laying the groundwork for broader use of analytics tools.

Important though it is, data integration should not be undertaken all at once. Depending on the degree of fragmentation of its existing data, an E&P company may face a complex challenge extracting all of this data, transforming it into a consistent format, and loading it into a new, unified database. Most enterprises will want to rely on an outside company – the analytics software vendor, a systems integrator or both – to do that, rather than build or hire for such skills internally.
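The extract-transform-load work described above can be illustrated with a minimal sketch. The two regional schemas and all field names below are hypothetical; the point is simply that each source needs its own transform into one unified record format before the data can be analyzed together.

```python
# Illustrative ETL step: normalize production records from two regional
# stores with different schemas into one unified format. All schemas and
# field names are hypothetical.

def transform_region_a(record: dict) -> dict:
    # Region A reports daily oil production in barrels.
    return {"well_id": record["well"],
            "date": record["date"],
            "oil_bbl": float(record["oil_bbl"])}

def transform_region_b(record: dict) -> dict:
    # Region B reports in cubic metres; convert (1 m3 ~= 6.2898 bbl).
    return {"well_id": record["well_id"],
            "date": record["prod_date"],
            "oil_bbl": float(record["oil_m3"]) * 6.2898}

def load(records_by_source: dict, transforms: dict) -> list[dict]:
    """Apply the right transform per source and merge into one dataset."""
    unified = []
    for source, transform in transforms.items():
        unified.extend(transform(r) for r in records_by_source[source])
    return unified
```

In practice the target would be a unified database rather than an in-memory list, and the transforms would be built and maintained by the vendor or systems integrator mentioned above.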

Pragmatism dictates tackling the problem in smaller bites – for example, focusing on just one data type across a small to medium-sized geographic area. The approach will depend on what challenges the company is trying to address.

If offshore seismic data is a key problem, the company might focus first on analytics that allow it to monitor the data coming back from the seismic vessel in real time. That will allow it to identify poor-quality data immediately and have the contractor correct it under the current contract, rather than waiting weeks to discover the problem and having to engage the contractor again under a new contract.

If the company’s main challenges involve development drilling, analytics tools can help determine the number and spacing of wells that will optimize yield and production costs. Similarly, predictive and prescriptive analytics tools can help an E&P company maximize the value of a well field’s lifetime production. Such tools also can help to minimize the cost of replacing submersible pumps (which fail with some regularity, bringing production to a halt), or to choose the best procedures to “work over” a well whose production has fallen due to causes such as sand accumulation or casing deterioration.
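For the submersible-pump example, even a crude run-life heuristic shows the shape of the problem. The sketch below is an assumption-laden stand-in for a real predictive model, which would draw on sensor data such as vibration, temperature and current draw; here, risk is flagged purely from historical run lives.

```python
# Hypothetical failure-risk flag for electric submersible pumps, based
# only on historical run-life statistics. A real predictive model would
# use live sensor data (vibration, temperature, current draw).
from statistics import mean, stdev

def runlife_risk(historical_run_days: list[float],
                 current_run_days: float) -> str:
    """Flag a pump whose current run approaches its typical run life."""
    mu = mean(historical_run_days)     # typical run life
    sigma = stdev(historical_run_days) # spread of past run lives
    if current_run_days >= mu:
        return "high"      # already past the typical run life
    if current_run_days >= mu - sigma:
        return "elevated"  # within one standard deviation of typical
    return "normal"
```

Flagging pumps early lets the operator schedule a replacement during planned downtime rather than after an unplanned production halt.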

Still, while analytics software can deliver significant value in each of these cases, it is important to keep in mind that these examples address the three domains – exploration, development and production – separately. Companies that unify their data to enable holistic analysis of all three domains will understand each of the three much more deeply than they do today. In addition, they will likely find hidden interrelationships that can only be guessed at today. Ultimately, these new and deeper understandings will improve exploration success, bring new efficiencies to the development phase, and increase the lifetime return on their assets and investments.

Warren Wilson leads Ovum’s energy team, focusing primarily on IT for upstream oil & gas. His research focuses on the ways in which leading-edge IT such as analytics, information management and mobile/wireless technologies can enable better practices and results throughout the upstream industry. Wilson brings a unique combination of skills to his oil & gas research. He holds a degree in geology, has direct experience working in the oilfield, and spent several years as a journalist covering the exploration and production industry. An IT analyst for the past 15 years, he has focused his research on mobile business applications and enterprise applications including ERP, CRM, supply chain management and analytics.

Wilson joined Ovum in 2006 when Ovum acquired his former employer, Summit Strategies, where he had worked for the previous eight years. Before becoming an IT analyst, he was a reporter and editor for U.S. newspapers including the Seattle Post-Intelligencer and The Miami Herald. He majored in geology at Carleton College in Northfield, Minn., and later worked in the oilfield as a roughneck and in well logging.
