Economists aren’t generally known for consensus, but there’s one point on which almost all of them agree: regulated markets are less efficient, and end up costing consumers more, than unregulated ones. It is this conviction that has driven the deregulation of the electric utility industry over the past decade. In a free electricity marketplace, the theory goes, the forces of supply and demand would drive the industry to become more efficient and reliable. Electricity prices would drop, supplies would increase and companies would make rational investment decisions based on expected returns. Because one unit of power is indistinguishable from any other, electricity would ultimately become a commodity, bought and sold on the basis of price with no concern over whether it was produced a few kilometers away or halfway across the country.
Time has shown, however, that the road to commoditization will not be without its bumps. As the California energy crisis of 2000 has demonstrated, partial deregulation (in this case, letting wholesale electricity prices float while keeping the retail price capped) can make matters worse. And as long as public opposition to new generating plants remains high, the supply of electric power will have difficulty keeping up with demand, no matter how enlightened the deregulatory policies.
But as those who study the electric industry point out, there’s an even more fundamental obstacle to realizing the promise of deregulation: the technology itself. Nearly everything in the current power system, from the generating plants and the transmission grid that distributes electricity throughout the country to the devices that run on that power and the meters that keep track of power usage, is designed for use in a centralized system of regulated, monopolistic utilities that produce power at a few locations and ship it out to local customers at a fixed price. While the regulatory policies have begun to change, the technology, for the most part, has not kept up.
Take the power grid, for example. Utilities began building this network of transmission lines over 100 years ago to bring power from their generating plants straight to their customers. In the early 20th century, they began to interconnect their transmission systems so that a utility that needed extra power could buy it from a nearby firm, but these exchanges remained a small part of the grid’s traffic. Thomas Edison himself, who came up with the grid’s original hub-and-spoke design, would likely have no difficulty recognizing today’s transmission system. But while Edison’s design has sufficed for a century, it doesn’t offer the flexibility required to turn electricity into a commodity.
“People are trying to operate the grid in a way it wasn’t designed for,” says Thomas Overbye, a power systems expert at the University of Illinois at Urbana-Champaign. “When you try to treat electricity as a commodity, you change the whole flow pattern.” A customer in Pennsylvania, for example, might now contract for power from a supplier in Illinois, through an electricity reseller who would pay for transmission rights on the lines between the two places. As a result, the electricity may be traveling a thousand kilometers instead of a few hundred, and because electrons follow the path of least resistance, the current will distribute itself over a variety of routes between source and destination, not just the single transmission path that has been paid for. As more and more customers (mainly large industrial concerns and utilities themselves) have gone farther afield to find the least expensive power, traffic on transmission lines has increased to the point that a growing number of them are bumping up against their maximum capacity. The problem will only get worse in the future, as competition and choice increase, prompting more users to look beyond local utilities.
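The way current spreads over parallel routes can be sketched with a simplified DC power-flow approximation: a transfer between two points divides across the available paths in inverse proportion to each path’s impedance, regardless of which single path the contract paid for. The numbers below are hypothetical, purely for illustration.

```python
# Illustrative sketch: in a meshed grid, power between two points divides
# across parallel paths in inverse proportion to each path's impedance
# (a simplified DC power-flow view). All figures here are hypothetical.

def split_flow(total_mw, path_impedances):
    """Divide a scheduled transfer across parallel paths.

    Each path carries a share proportional to the inverse of its
    impedance, so low-impedance routes carry most of the current.
    """
    inverse = [1.0 / z for z in path_impedances]
    total_inverse = sum(inverse)
    return [total_mw * inv / total_inverse for inv in inverse]

# A 300 MW Illinois-to-Pennsylvania contract over three parallel routes
# with hypothetical per-unit reactances:
flows = split_flow(300.0, [0.1, 0.2, 0.4])
print([round(f, 1) for f in flows])  # -> [171.4, 85.7, 42.9]
```

Note that the lowest-impedance route carries more than half the transfer even though the contract names only one path, which is exactly why long-distance trades congest lines their buyers never paid for.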
Help could be on the way. A new generation of technologies, some already in existence and some under development, could allow the power system to operate with unprecedented flexibility, efficiency and stability. Consumers could control the timing of their electricity purchases, in order to get the best prices. Homeowners and companies could operate their own generators, selling excess power back to the grid and helping to drive prices down further (see “Power to the People,” TR May 2001). Electricity would finally become a true commodity, freely traded in an open market. One caveat: many of the systems that would enable such practices are still hypothetical, and realizing them could require a fundamental rethinking of how the grid is owned and operated.
One way consumers might soon get the best prices for power, while at the same time reducing the strain on the grid, is by strategically timing their electricity purchases. Some large companies have been doing this in a rather crude way for years. A utility might offer them cheaper rates if they agree to have their power turned off during times of peak demand. A company that can shut down operations for a few hours a few times a year can get a big break on its electric bill and can help utilities avoid overloading the grid. Similarly, a number of utilities have voluntary programs that give residential consumers a price break in exchange for installing energy-saving controls on their air-conditioning systems and water heaters. At peak hours, the utility sends a signal, wirelessly or via the power lines, that changes the thermostat setting on air conditioners or shuts down water heaters for a while.
Both the commercial and the residential programs are effective in their limited goals: helping utilities pull back from the brink when they near capacity. But neither explicitly involves the consumer in the decision to cut back on demand, says Steve Hauser, an energy systems expert at Pacific Northwest National Laboratory in Richland, WA. And if electricity is ever to become a true commodity whose price is set by the interplay of supply and demand, that demand will need to be determined by millions of individual decisions made around the grid.
To that end, Hauser’s team is developing dishwashers, refrigerators, air conditioners and other appliances that turn themselves off for brief periods of time when they sense it will help the grid. The “grid-friendly” appliances contain a device that a user can program so that shutoffs can be overridden at times when an appliance’s performance is critical. When the user’s needs are more flexible, however, the device can monitor the quality of the electricity flowing from the socket, detecting changes that indicate the local region of the grid is in danger of a blackout. The devices work because disturbances to the grid, like spikes in demand that outpace supply, can make the voltage level and the frequency of the current’s alternation deviate from their normal values. “If the frequency or the voltage started to shift, the appliance would drop off line for some period of time to allow the grid to stabilize,” says Hauser. Customers, he suggests, could get a rebate or a credit each time an appliance tripped off line. Of course, the grid won’t be saved by one or two dishwashers shutting down in a crisis, but Hauser envisions a time, perhaps as soon as three to five years from now, when a significant percentage of appliances sold are grid friendly.
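The shutoff logic Hauser describes can be sketched in a few lines: watch the frequency and voltage at the outlet, and drop off line when either drifts too far from nominal, unless the user has flagged the appliance as critical. The class name, thresholds and readings below are invented for illustration, not Pacific Northwest’s actual design.

```python
# A minimal sketch of "grid-friendly" appliance logic. Thresholds and
# names are illustrative assumptions, not an actual product design.

NOMINAL_HZ = 60.0     # US grid frequency
NOMINAL_VOLTS = 120.0  # nominal household voltage

class GridFriendlyAppliance:
    def __init__(self, freq_tolerance_hz=0.05, volt_tolerance=6.0):
        self.freq_tolerance_hz = freq_tolerance_hz
        self.volt_tolerance = volt_tolerance
        self.override = False  # user marks performance as critical
        self.running = True

    def check_grid(self, frequency_hz, volts):
        """Drop off line if the grid looks stressed and no override is set."""
        if self.override:
            return self.running
        stressed = (abs(frequency_hz - NOMINAL_HZ) > self.freq_tolerance_hz
                    or abs(volts - NOMINAL_VOLTS) > self.volt_tolerance)
        if stressed:
            self.running = False  # shut down to let the grid stabilize
        return self.running

dishwasher = GridFriendlyAppliance()
print(dishwasher.check_grid(59.98, 119.0))  # normal readings -> True
print(dishwasher.check_grid(59.80, 119.0))  # sagging frequency -> False
```

The key design point Hauser makes is visible here: the appliance needs no communication channel at all, because the stress signal is carried by the electricity itself.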
One advantage of Hauser’s technique is that it does not depend on providing special information to the appliances: all they need to know can be found in the current from the outlet. But ultimately, Hauser says, “we would like price signals to be sent down the line and have appliances respond to the price.” There is widespread agreement among power industry experts that utilities will eventually need to move to “real-time pricing”: charging more for electricity at peak times and less during periods of low demand, to encourage customers to shift some of their use from peak to off-peak hours. Indeed, Washington State’s Puget Sound Energy is already charging homeowners different rates at different times: most on weekday mornings and evenings, less during the day, and least at night and on weekends. The utility hopes its customers will respond by, say, waiting until after 9 p.m. to run the dishwasher or washing machine.
But ideally, says Karl Stahlkopf, vice president of power delivery at the Palo Alto, CA-based Electric Power Research Institute, utilities would be able to vary pricing hourly or even minute by minute, sending real-time prices to smart appliances that can modify their activities automatically. Air-conditioning systems, for instance, might crank their thermostats up a couple of degrees when energy prices peak. Eventually, appliances might even send information back to the grid about how much electricity they expect to need in coming hours and how much they would be willing to pay for it.
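The price-responsive air conditioner Stahlkopf envisions might look something like the sketch below: as the real-time price climbs, the thermostat setpoint eases upward a degree or two. The price bands and setpoints are hypothetical, chosen only to show the shape of the idea.

```python
# A hedged sketch of a price-responsive thermostat: raise the cooling
# setpoint as the real-time electricity price climbs. Price bands and
# setpoints are invented for illustration.

def setpoint_for_price(base_setpoint_c, price_per_kwh):
    """Nudge the cooling setpoint up a couple of degrees at peak prices."""
    if price_per_kwh < 0.10:    # off-peak: cool normally
        return base_setpoint_c
    elif price_per_kwh < 0.25:  # moderate prices: ease off one degree
        return base_setpoint_c + 1.0
    else:                       # peak prices: ease off two degrees
        return base_setpoint_c + 2.0

print(setpoint_for_price(22.0, 0.08))  # off-peak -> 22.0
print(setpoint_for_price(22.0, 0.30))  # peak -> 24.0
```

In the fuller vision, the function’s input would arrive minute by minute over whatever channel eventually wins out, and the appliance could also report its expected demand back upstream.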
Just how appliances would communicate with the grid is anybody’s guess, says Hauser: “It’s like trying to predict in 1985 where the Internet was going.” Researchers have suggested using phone lines, the Internet and satellite communications, among others. Some are even looking into transmitting the information along with the electricity, so an appliance would not need a separate hookup to keep abreast of fluctuating prices. But regardless of how the technical problem is solved, it’s clear that real-time pricing and communicative appliances could go a long way to solving both consumers’ and utilities’ woes.
Turning electricity into a commodity, however, requires changes not only to the way power is consumed, but also to the way it’s produced and distributed. With deregulation, a growing number of small power producers have begun sending electricity into the grid, and according to predictions from the U.S. Department of Energy, that number will skyrocket in coming years (see “Changing Capacity”). Thus an increasing percentage of the total supply of electricity is expected to come from small, independent plants, many of them using solar cells, windmills or other unconventional means to generate power. Similarly, an increasing number of large power users (factories, office buildings and others) and even some homeowners are expected to operate their own generators and sell their excess power to their local utilities. Eventually, instead of a few large plants feeding electricity into transmission systems at a few points, power producers will be scattered around the grid, most of them small and many of them contributing power only when the price of electricity rises above a certain level or when they have excess capacity.
“Distributed generation,” as this scenario is often called, could help keep electricity cheap and plentiful, and could encourage alternative power generation technologies. But for now, at least, it’s causing some technological headaches. For one thing, each new generating unit must be tied into the grid in such a way that the 60-hertz oscillation of its electrical output is synchronized with the oscillation of the entire network, says Jeff Dagle, an electrical engineer at Pacific Northwest. “A generator in Florida is in lockstep with a generator in Illinois,” he says. But there is no standardized way to establish such connections, making it difficult and expensive for nontraditional energy suppliers to hook up to the grid, says T. J. Glauthier, a former deputy secretary at the DOE who heads the Palo Alto, CA-based Electricity Innovation Institute. This in turn discourages new generators and drives up the cost of the electricity they supply, Glauthier says.
A number of startup companies are now working on affordable, standardized devices to keep small power generators in sync with the grid, and to allow them to communicate with consumers about price, availability and demand. Once such devices become available, costs should drop, and factories with backup generators or homeowners with solar cells on the roof may be able to compete with the utilities, at least on peak-load pricing.
Unfortunately, making it easy for more power suppliers to hook up to the grid could wind up threatening its stability. Today, the network’s proper functioning is the responsibility of a number of systems operators, each in charge of a large, contiguous section. The systems operator monitors the grid and issues directions for the management of the generating plants and transmission equipment. The operator’s most important function is to match electricity consumption with production, bringing new generating capacity on line as demand increases and taking it off line as demand falls. But as more power suppliers set up shop around the grid, the system becomes far more complex, says Steve Gehl, a director of strategic technology at the Electric Power Research Institute. This makes it much harder for central operators to know how the system is behaving and to direct it effectively.
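The operator’s core balancing task, bringing generation on line as demand rises and taking it off as demand falls, is often described as merit-order dispatch: fill demand from the cheapest available generators first. The sketch below is a toy version of that idea; the generator names, capacities and costs are hypothetical.

```python
# A toy sketch of the systems operator's balancing job: dispatch
# generators in merit (cheapest-first) order until output matches
# demand. All names, capacities and costs are hypothetical.

def dispatch(demand_mw, generators):
    """Return how much each generator should produce, cheapest first."""
    schedule = {}
    remaining = demand_mw
    for gen in sorted(generators, key=lambda g: g["cost"]):
        output = min(gen["capacity_mw"], remaining)
        schedule[gen["name"]] = output
        remaining -= output
    return schedule

gens = [
    {"name": "coal", "capacity_mw": 500, "cost": 20},
    {"name": "gas_peaker", "capacity_mw": 200, "cost": 80},
    {"name": "hydro", "capacity_mw": 300, "cost": 5},
]
print(dispatch(650, gens))
# -> {'hydro': 300, 'coal': 350, 'gas_peaker': 0}
```

With a handful of large plants this bookkeeping is tractable; Gehl’s point is that with thousands of small, intermittent suppliers scattered around the grid, the operator can no longer see or schedule the system this simply.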
The best solution, researchers at the Electric Power Research Institute suggest, may be the creation of a “self-healing grid”: a system that constantly monitors itself to spot potential problems and correct them before they lead to power outages or other disruptions. Here’s how it would work: an array of sensors would detect everything from the voltage and current at junctions and substations, to the temperature of the air and the transmission lines, to the wind speed (a major factor in how efficiently the air cools the lines). Satellites would collect the data and forward them to a central location, where they would feed into a computer model that simulates the grid’s behavior over the coming minutes. With high-speed computers, it should be possible to see problems arise in the simulations before they happen in reality, and to prevent them.
Take the case of a hot summer day, when consumers switch on their air conditioners and send demand skyrocketing. The sensors would show whether the transmission lines were beginning to heat up in certain spots. Before the wires could overheat enough to sag into trees, the central computers would use large electronic switches (giant transistors, in essence) to automatically reroute power as necessary, perhaps even isolating a section of the grid to prevent it from taking the rest of the network down with it (see “A Smarter Power Grid,” TR July/August 2001).
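The predict-then-act loop described above can be sketched in miniature: read each line’s sensors, project its conductor temperature a few minutes ahead (heating rises with loading and falls with wind cooling), and flag any line that will cross its limit so power can be rerouted first. The thermal model, thresholds and line data below are crude illustrative assumptions, not a real grid simulator.

```python
# A simplified sketch of the self-healing loop: read sensor data,
# project line temperatures a few minutes ahead, and flag lines to
# reroute around before they sag. Model and numbers are illustrative.

MAX_SAFE_TEMP_C = 90.0  # hypothetical conductor limit

def project_temp(line, minutes_ahead=5):
    """Crude linear projection of conductor temperature.

    Heating grows with loading and shrinks with wind cooling.
    """
    heating_rate = 0.5 * line["load_fraction"] - 0.1 * line["wind_mps"]
    return line["temp_c"] + heating_rate * minutes_ahead

def self_healing_pass(lines):
    """Return the lines that should shed load before they overheat."""
    return [line["name"] for line in lines
            if project_temp(line) > MAX_SAFE_TEMP_C]

grid_lines = [
    {"name": "A-B", "temp_c": 88.0, "load_fraction": 0.95, "wind_mps": 0.5},
    {"name": "B-C", "temp_c": 70.0, "load_fraction": 0.60, "wind_mps": 3.0},
]
print(self_healing_pass(grid_lines))  # -> ['A-B']
```

The point of the exercise is the ordering: the heavily loaded, poorly cooled line is flagged while it is still below its limit, giving the switches time to act before anything sags.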
The challenges to making this scenario a reality are less technical than economic and political. Indeed, Gehl says, many of the technologies needed for such a self-healing grid already exist. But, he adds, there has not been enough attention to how the technologies would be hooked together in a system. Perhaps more important, nobody has been willing to make the investments that advanced, automated control of the electricity infrastructure will demand. The problem, Gehl says, is that “it’s just not clear that people who invest in this development would be rewarded.”
As the grid is now set up, explains Marija Ilic, a power systems engineer in MIT’s Department of Electrical Engineering and Computer Science, there are few economic incentives to invest in improving it. In some parts of the country, the local electric utilities own sections of the grid, while in other regions parcels of the grid are owned by private transmission companies. But in either case, Ilic says, tight regulation stifles innovation. The owners of the grid are guaranteed a certain rate of return on their investment, and decisions about building new transmission lines are typically made by systems operators (federal or state governmental entities, depending on the region), based on studies about which parts of the grid are most vulnerable to overload. The operation of the grid is, in other words, the regulatory system at its worst. There is no way to reward entrepreneurs who take chances and provide new or better services.