
Big Oil Goes Mining for Big Data

As petroleum production gets trickier, digital innovation becomes more crucial.

The world isn’t running out of oil and natural gas. It is running out of easy oil and gas. And as energy companies drill deeper and hunt in more remote regions and difficult deposits, they’re banking on information technology to boost production.

Data, in this case, really is the new oil. “It’s pretty sweeping,” says Paul Siegele, president of the Energy Technology Company at Chevron. “Information technology is enabling us to get more barrels of each asset.”

Oil companies are using distributed sensors, high-speed communications, and data-mining techniques to monitor and fine-tune remote drilling operations. The aim is to use real-time data to make better decisions and predict glitches.

The companies began to employ such technologies more than a decade ago, partly to help their aging workforces multitask remotely. But adoption has accelerated along with the underlying trends: cheaper computing and communications, and a proliferation of data sensors and analytical software.

The industry term is the “digital oil field,” though the biggest companies have trademarked their own versions. At Chevron, it’s the “i-field.” BP has the “Field of the Future,” and Royal Dutch Shell likes “Smart Fields.”

Whatever these programs are called, they’ll play a huge role in the future of energy companies. The ones that are most successful at operating remotely and using data wisely will claim big rewards. Chevron cites industrywide estimates suggesting 8 percent higher production rates and 6 percent higher overall recovery from a “fully optimized” digital oil field.

That’s significant, says Siegele. Despite advancing renewable technologies, the International Energy Agency projects that global oil demand will still be growing by 2035 as more people use cars. And, as extraction becomes more difficult, almost $20 trillion in investment will be needed to meet that demand.

Chevron is currently deploying up to eight global “mission control” centers as part of its digital program. Each is focused on a particular goal, such as using real-time data to make collaborative decisions in drilling operations, or managing wells and imaging reservoirs for higher production yields. The purpose is to improve performance at more than 40 of its biggest energy developments. The company estimates that these centers will help it save $1 billion a year.

At one machinery support center, opened in Houston in 2010 and expanded last year, shift engineers monitor visualizations and analytics from operations in Kazakhstan and Colombia. At Chevron’s Sanha Field off the coast of southern Africa, the center’s staff diagnosed a gas-injection compressor that showed subtle signs of overloading. Operators there fixed the problem and avoided a potential loss of millions of dollars in downtime. Now there’s an automated early detection system based on the symptoms observed at that site.
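The article doesn’t describe the detection logic Chevron actually uses. Purely as illustration, here is a minimal sketch of one common approach to this kind of early warning: flag a sensor reading when it drifts several standard deviations away from its recent rolling baseline. The variable names, sampling rate, and temperature values below are hypothetical.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=60, threshold=3.0):
    """Flag readings that sit more than `threshold` standard deviations
    away from the rolling mean of the previous `window` samples."""
    history = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                flagged.append((i, value))
        history.append(value)
    return flagged

# Hypothetical stream: compressor discharge temperature (deg C), one sample
# per minute. Normal operation hovers around 82 C; the tail of the series
# rises above that band, which the rolling check flags early.
baseline = [82.0 + 0.05 * (i % 10) for i in range(300)]
rising = [84.0 + 0.04 * i for i in range(60)]
print(detect_anomalies(baseline + rising))
```

A fixed alarm setpoint would catch the same fault eventually; the point of comparing against a rolling baseline is to surface “subtle signs” of trouble before a hard limit is crossed.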

Chevron first tested the i-field program in its century-old fields in California’s San Joaquin Valley, where it is using advanced thermal technologies to squeeze heavy oil from what might have once been considered a depleted reservoir. In the past, workers would drive around inspecting thousands of wells a day, says David Dawson, general manager of Chevron’s upstream workflow transformation organization. Now they use sensors and remote monitoring, and visit a well only when repairs are needed.

Since this early trial, real-time data analysis, imaging, and remote collaboration have become key to the setup at some of Chevron’s newest and most complex projects. These include projects in the deep waters of the Gulf of Mexico, off the coast of Nigeria, and 130 kilometers off the coast of Australia—the controversial $37 billion Gorgon Project, the single largest natural gas project in Australia’s history.

Real-time safety backups are also crucial as production gets more complicated, says Morningstar oil services equity analyst Stephen Ellis. Today, for example, Chevron is under fire in Brazil, where the company took responsibility for a 3,000-barrel offshore oil spill in November caused by an unanticipated pressure spike in a well. Siegele says Chevron’s i-field program will help prevent accidents and improve safety.

Much of the software innovation that’s key to the digitization of big oil is happening at oil service contracting companies, such as Halliburton and Schlumberger, and big IT providers including Microsoft and IBM.

Not every problem has been solved, however. It’s still tough to ensure reliable communications from the Arctic’s outer continental shelf, whether via fiber-optic lines or satellite. Another limitation is the speed at which pressure and temperature readings can be relayed from thousands of feet below the surface, although in recent years electrically “wired” drill pipes have been able to relay this data an order of magnitude faster than before, at one megabit per second.

Already, Chevron’s internal IT traffic alone exceeds 1.5 terabytes a day. “The fire hose of data that comes up every minute and every hour is incredible,” says Jerry Hubbard, president of Energistics, a global nonprofit consortium working to standardize data-exchange formats within the energy industry.
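Energistics’ data-exchange standards, such as WITSML for wellsite data, are XML-based. As a rough illustration only, the sketch below parses a small, simplified WITSML-like log snippet with Python’s standard library; the element names and values are invented stand-ins, not the actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified WITSML-like snippet; real Energistics
# schemas are far richer and use XML namespaces.
SAMPLE = """
<log well="EXAMPLE-01">
  <curve mnemonic="DEPTH" unit="m"/>
  <curve mnemonic="TEMP" unit="degC"/>
  <data>
    <row>3120.5,96.4</row>
    <row>3121.0,96.7</row>
  </data>
</log>
"""

root = ET.fromstring(SAMPLE)
mnemonics = [c.get("mnemonic") for c in root.findall("curve")]
rows = [
    dict(zip(mnemonics, (float(v) for v in row.text.split(","))))
    for row in root.find("data").findall("row")
]
print(rows)  # [{'DEPTH': 3120.5, 'TEMP': 96.4}, {'DEPTH': 3121.0, 'TEMP': 96.7}]
```

The appeal of a shared format is exactly what Hubbard describes: when every vendor’s “fire hose” of sensor data arrives in a common structure, it can be merged and analyzed without custom translation for each tool.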

Even startups are exploring the digital oil field. “The code in the old software platforms being used today, a lot of it is 20 years old,” says Kirk Coburn, who started Surge, a new Houston-based energy software startup accelerator with a digital oil section. “This technology can still be massively modernized.”

