
MIT Technology Review


On September 5, a consortium of three oil-industry giants, Chevron, Devon, and Statoil, announced the results of a well test in the Jack Field, located about 270 miles south of New Orleans and 175 miles offshore. It may be the largest discovery since Alaska's Prudhoe Bay in 1968, and it is almost entirely due to recent advances in exploration technology.

The test was conducted in water more than 7,000 feet deep, with the bit going to a total of 28,175 feet, breaking the record set in Chevron's deepwater Tahiti Field (see "The Oil Frontier"). The test also set records for operating conditions: tools and fittings worked under 15,000 to 20,000 pounds per square inch of pressure, according to Stephen J. Hadden, senior vice president for exploration and production at Devon Energy Corporation. Perforating guns, which are used to poke additional holes in well pipe within pay sands in order to increase the flow of oil, were also successfully used at record depths.

The well sustained a flow rate of about 6,000 barrels a day, strong enough to encourage analysts to predict that the field may contain anywhere from three billion to fifteen billion barrels of oil, although the results of a second well test scheduled for 2007 will sharpen the accuracy of those figures considerably. If the higher-end estimate is correct, though, the discovery would approach Prudhoe Bay in size, and possibly increase total U.S. reserves by some 50 percent.

The most fundamental change in the technology of oil exploration today is in the collection of seismic data from sea level and in the computers that build models from that information.

To gather seismic data, ships fire charges from sea level and record how long it takes the ensuing vibrations to return. Whereas ships used to trail sensors on one cable perhaps 3,000 meters long, says Hadden, they’re now trailing sensors on more than 10,000 meters of line, and often dragging as many as nine cables, vastly expanding the scope of information returned.
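The basic arithmetic behind those timed returns can be illustrated with a minimal sketch. This is not the companies' actual processing code, and the function name and default velocity are illustrative assumptions (sound travels at roughly 1,500 meters per second in seawater, faster in rock); it shows only how a reflector's depth follows from a recorded two-way travel time:

```python
def reflector_depth_m(two_way_time_s: float, velocity_m_per_s: float = 1500.0) -> float:
    """Estimate the depth of a reflecting rock layer from the
    two-way travel time of a seismic pulse, given an assumed
    average propagation velocity."""
    # The pulse travels down to the reflector and back up,
    # so the one-way distance is half the round trip.
    return two_way_time_s * velocity_m_per_s / 2.0

# A reflection arriving 4 seconds after the shot, at an assumed
# 1,500 m/s, implies a reflector about 3,000 meters down.
print(reflector_depth_m(4.0))  # 3000.0
```

Real surveys record thousands of such returns across every sensor on every cable, which is why the move from a single 3,000-meter cable to nine 10,000-meter cables so dramatically expands the data available for modeling.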

Meanwhile, back in Houston, where major oil companies and their subcontractors crunch numbers, "the computers are evolving," says Hadden. "It's not one breakthrough, it's a steady march. And we're getting a clearer and clearer picture of the structures below."

Like many deepwater Gulf fields, the Jack Field is "sub-salt," meaning it lies beneath a protective layer of signal-scrambling salt. The advances in computing center on improved algorithms and faster processors, which together are better able to convert seismic results into a useful picture of the relative depths of different geologic structures, making it easier for companies to gauge what lies below. That knowledge makes it worthwhile to drill test wells, which often cost as much as $100 million apiece.
