With Lower Prices, Operators Need to Reoptimize Well Spacing and Frac Design

On April 20, WTI traded as low as -\$37/bbl, the lowest ‘price’ ever. Massive demand reduction caused by COVID-19 and the collapse of OPEC+ supply restraint have conspired to bring our market the equivalent of a 100-year flood. Rigs are being laid down at a shocking pace, and the industry is experiencing a massive brain drain as talent is laid off or retired early. Many of these people are never coming back, even if prices recover.

Personally, I do not believe that the current price environment is a new long-term normal. When a vaccine is available, demand will recover (though perhaps not to levels previously seen). The vast reduction in capital investment and loss of capacity will impact supply for years. Who knows? Maybe there will be a bounce to relatively high oil prices in 2021 or 2022.

Right now, companies are focused on survival. While many companies have stopped drilling altogether, most are continuing operations at a reduced pace. What changes do they need to make in response to this changed price and cost environment?

The simplest response would be to drill fewer wells, but to drill and complete them the same way as before. If you do that, you’re leaving money on the table.

Here is an ultra-simplified example to illustrate. Let’s say you could save \$400,000 by pumping less proppant, but doing so would reduce production by 10,000 bbl. Should you do it? At \$60/bbl, you’re sacrificing \$600,000 in revenue, so the answer is no. (I am neglecting the time-value of money to keep this example very simple.) At \$30/bbl, you’re only sacrificing \$300,000 in revenue, and so yes, you should cut back on the proppant.
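To make the arithmetic explicit, here is a minimal sketch in Python using the numbers above. The function name is made up for illustration, and the discount factor is a hypothetical placeholder for the time-value of money that the example deliberately ignores.

```python
# Minimal sketch of the break-even logic above, using the illustrative
# numbers from the text. "discount_factor" is a hypothetical knob for the
# time-value of money, which the example neglects (so it defaults to 1.0).

def proppant_cut_adds_value(cost_savings, production_loss_bbl, oil_price,
                            discount_factor=1.0):
    """Return True if the cost saved exceeds the revenue sacrificed."""
    revenue_sacrificed = production_loss_bbl * oil_price * discount_factor
    return cost_savings > revenue_sacrificed

print(proppant_cut_adds_value(400_000, 10_000, 60))  # False: keep pumping the proppant
print(proppant_cut_adds_value(400_000, 10_000, 30))  # True: cut back
```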

Of course, a real design optimization is more complicated. You need to consider price, the changing prices of inputs (has proppant gotten cheaper?), the time-value of money, and the specifics of your company’s business.

It is critical to be able to predict the production response to design changes. This is doable, but it takes work. A ‘data-driven’ approach would be to look back at past well performance and use it to predict responses to changes. If you have a lot of wells, are able to normalize for geologic and operationally-driven variability across your asset, and are only considering designs that you’ve used in the past, this approach may work. But a drawback to a purely data-driven approach is that the ‘optimal design’ may be something you’ve never done before. Data-driven approaches struggle to extrapolate outside the dataset. In contrast, physics-based models answer the question ‘why’ and so help drive innovation and improved engineering practices. They are less vulnerable to a data ‘overfit’ and are less affected by correlated inputs and the noise created by heterogeneity and small sample size.
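As a toy illustration of the extrapolation problem, here is a short Python sketch on synthetic data. The proppant loadings and cumulative oil numbers are invented purely for illustration.

```python
# Toy illustration of why data-driven fits struggle to extrapolate.
# A flexible polynomial fit to synthetic 'historical' designs behaves
# reasonably inside the observed range but is unreliable outside it.
import numpy as np

proppant_lb_per_ft = np.array([1000, 1200, 1400, 1600, 1800, 2000])        # synthetic
cum_oil_bbl        = np.array([180e3, 200e3, 215e3, 225e3, 232e3, 236e3])  # synthetic

model = np.poly1d(np.polyfit(proppant_lb_per_ft, cum_oil_bbl, deg=3))

print(model(1500))   # interpolation: inside the data, plausible
print(model(3000))   # extrapolation: far outside the data, not trustworthy
```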

ResFrac is an example of a physics-based analysis tool. We simulate the physics of hydraulic fracturing, proppant placement, and production. We draw on our own collective experience and on the collective experience of the scientific literature, and encapsulate these learnings in a physics-based engine that explains why things happen and how.

There is not an ‘either-or’ decision between being ‘physics-based’ and ‘data-driven.’ ResFrac is physics-based, but is also deeply informed by ‘data.’ In each project, we build the model and perform a history match based on all available data. The physics that go into the simulation are based on the collective experience of the scientific literature, as well as our experience applying the simulator to practical problems, and fine-tuning in response to that experience. Similarly, data-driven models may be designed to incorporate elements of physics.

A key advantage of physics-based modeling is that it is inherently interpretable. By looking at the output of a ResFrac simulation, you can see the mechanism linking a change in a design parameter to increased production. Or you may identify a key uncertainty that impacts results – whether it is inaccuracy in well drilling surveys or inconsistency in perforation hole size – and focus resources on resolving it.

Many or most companies pursue both approaches, data-driven and physics-based, and then compare and contrast the results and think critically. These findings can then be fed into an iterative loop with economic optimization to drive decision-making.

With ResFrac, our workflow is: (1) set up an initial model based on geologic data and completions information, (2) history match to available data, and then (3) apply the history matched model to predict response to design changes. We help companies make decisions on topics such as well landing depth, well spacing, stage length, cluster spacing, and proppant injection schedule.
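The design-optimization step (3) can be pictured as a search over candidate designs scored by economics. Here is a deliberately simplified Python sketch of that idea; the proxy production model, cost numbers, and candidate spacings are all hypothetical placeholders, not ResFrac’s actual interface or any real correlation.

```python
# Highly simplified sketch of step (3): sweep candidate designs through a
# history-matched production forecast and rank them by section-level NPV.
# Every function and constant below is a hypothetical placeholder.
import math
from itertools import product

def per_well_cum_oil(well_spacing_ft, cluster_spacing_ft):
    """Placeholder for a history-matched forecast (bbl); purely illustrative."""
    interference = 1.0 - math.exp(-well_spacing_ft / 500.0)   # tighter wells interfere more
    cluster_gain = (30.0 / cluster_spacing_ft) ** 0.1         # modest benefit of tighter clusters
    return 300e3 * interference * cluster_gain

def section_npv(well_spacing_ft, cluster_spacing_ft, oil_price=30.0,
                section_width_ft=5280, lateral_ft=10_000,
                base_capex_per_well=6.0e6, cost_per_cluster=2_500):
    n_wells = max(1, int(section_width_ft // well_spacing_ft))
    n_clusters = lateral_ft / cluster_spacing_ft
    capex = n_wells * (base_capex_per_well + n_clusters * cost_per_cluster)
    revenue = n_wells * per_well_cum_oil(well_spacing_ft, cluster_spacing_ft) * oil_price
    return revenue - capex

candidates = list(product([440, 660, 880, 1320], [15, 25, 40]))  # (well spacing, cluster spacing), ft
best = max(candidates, key=lambda d: section_npv(*d))
print("Best design (well spacing, cluster spacing):", best)
```

In a real study, the placeholder forecast would be replaced by simulation runs from the history-matched model, and the economics would include the time-value of money, type-curve uncertainty, and the company’s own cost structure.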

Our ‘killer app’ is that we integrate the capabilities of a hydraulic fracturing simulator and a reservoir simulator. In shale, hydraulic fracturing creates the reservoir. The conventional approach of running hydraulic fracturing and reservoir simulation separately leads to an awkward workflow, a loss of physics and information, and an inability to simulate inherently coupled processes like frac hits, parent/child interaction, refracs, reinjection, and DFITs. Our integrated solution handles these issues seamlessly.

A good published example is given in our paper “A Utica Case Study: The Impact of Permeability Estimates on History Matching, Fracture Length, and Well Spacing,” by Fowler, McClure, and Cipolla, presented at the 2019 ATCE. We performed a history match to Utica data, including both fracturing and production data, and then used the results to optimize well spacing and cluster spacing. Based on the results from our 2018 DFIT Industry Study, we showed how to use a DFIT to properly assess permeability, and how to feed this into the analysis. The difference between an optimal and a suboptimal design was an impact on NPV of 30% or more.

Interactions between design parameters show why a physics-based approach is so important. For example, if you decrease cluster spacing but don’t simultaneously increase perforation pressure drop to overcome stress shadowing, then you may not see a benefit.
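To see why, consider the standard perforation friction relationship, ΔP = 0.2369 ρ q² / (n² d⁴ C_d²) in oilfield units. The short sketch below uses assumed rates, perf counts, hole sizes, and discharge coefficient; it simply shows that packing more clusters into a stage at a fixed pump rate collapses the per-cluster perforation pressure drop unless the perf design is adjusted.

```python
# Perforation friction in oilfield units: dP [psi] = 0.2369 * rho * q^2 / (n^2 * d^4 * Cd^2)
#   rho = fluid density (lb/gal), q = rate through the cluster (bpm),
#   n = perforations per cluster, d = hole diameter (in), Cd = discharge coefficient.
# All input values below are assumed for illustration.

def perf_friction_psi(rate_bpm, n_perfs, diameter_in, cd=0.85, rho_ppg=8.6):
    return 0.2369 * rho_ppg * rate_bpm**2 / (n_perfs**2 * diameter_in**4 * cd**2)

stage_rate_bpm = 90.0            # total pump rate for the stage (assumed)
for n_clusters in (5, 10, 15):   # tighter cluster spacing -> more clusters per stage
    q_per_cluster = stage_rate_bpm / n_clusters        # assumes an even split
    dp = perf_friction_psi(q_per_cluster, n_perfs=6, diameter_in=0.36)
    print(f"{n_clusters} clusters: ~{dp:.0f} psi of perf friction per cluster")
```

With the same perf design, going from 5 to 15 clusters per stage cuts the perforation pressure drop by a factor of nine, which is why tighter cluster spacing usually has to be paired with fewer or smaller perforations to maintain effective limited entry.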

If you haven’t already tried a particular design, a data-driven approach probably won’t be able to tell you to use it. Also, a data-driven approach may not capture this level of design detail. If you rely solely on publicly reported data (well spacing, proppant loading, etc.), you’re missing out on a huge amount of granular engineering inputs that are critical for optimizing design.

Both physics-based and data-driven approaches need to be integrated with basic engineering and in-situ diagnostics. For example, engineers use DTS/DAS to diagnose perforation efficiency and to detect that flow can occur behind pipe for some distance away from the perfs. They’re using downhole imaging to diagnose problems that limit the effectiveness of limited entry, such as perforation erosion, uneven hole sizes, and the dependence of hole diameter on perf orientation. And finally, engineers figure out solutions to these challenges – working with service companies to improve uniformity of hole size, experimenting with shot phasing, and even reconsidering the grade of steel used in the casing.

The last downturn in prices, in 2015, spurred a wave of innovation in frac design. Fracturing has become dramatically more effective and economically efficient since then. I am hopeful the same thing will happen in this downturn. The risk is that companies react by cutting spending so much that they lay off the technical teams they need to drive this innovation. If they are truly constrained on capital, they may have no choice. But the companies that are able to hang in and continue to invest in innovation will see major benefit in both the short run and the long run. The key is to integrate approaches – physics-based modeling to optimize and understand ‘why,’ data to learn from past experience, and ‘bread and butter’ engineering and field diagnostics to diagnose and solve inefficiencies.

It seems to me that operators really do understand the need to optimize design in order to maximize shareholder value. Interestingly, our company, ResFrac, hasn’t seen a slowdown in business over the past several months. We’ve continued to start new projects and make new license sales, at a rate much higher than one year ago. I think this reflects operators’ understanding that they need to continue to optimize and improve in order to maximize economic performance.

Thank you to Peter Reiss, Mark Pearson, Joe Frantz, Charles Kang, and Garrett Fowler for their helpful feedback on earlier drafts of this post.
