By Matthew L. Wald
Everybody seems to love renewables. They have a role, but being beloved doesn’t ensure that they will succeed, especially not on their own.
Remember cellulosic ethanol? That was ethanol made from the non-food parts of the plant, like corn cobs and stalks, wheat straw, and other materials that are today mostly waste. Those materials have all the chemical ingredients that are in corn kernels, but in a different form. Surely there was a way to process them into something useful?
Apparently not. A lot of smart engineers and chemists, backed by a lot of money, have tried for a long time, but haven’t made it work, at least not on a commercial basis.
How about nanotechnology? Structures measured in billionths of a meter that would help cure diseases, repair structures in the body, and perform other miracles? In 2001, nanotech looked so promising that Cornell decided that its nuclear research reactor was unlikely to see an increase in use and tore it down to make space for a nanotech lab. But today, nanotech ranks high only on the list of breakthroughs with great expectations that haven’t been met.
Or fuel cells? Devices made practical in the Apollo moonshot era that would sit under the hood, converting hydrogen to electricity with no byproducts besides pure water and a little heat, weaning us away from gasoline without bulky, fragile batteries? Cars and trucks zipping along on electricity made by hydrogen-fed fuel cells, and models small enough to fit in a briefcase or backpack to recharge cell phone batteries and laptops. What happened to those?
Popularity and political appeal mean little for the long-term success of a technology, and the technologies that turn out to have big impacts are often not the most celebrated ones. There will always be technologies that attract outsized support, fanfare, and hype, but pundits, politicians, and tech-watchers must remain level-headed when it comes to finding solutions. The worst case is that the popular technology sucks the money and air out of the conversation about the technology that will actually work. The resources available to decarbonize are limited; we should spend them wisely.
Technological Miscues
At the height of the fuel cell hype, I was invited by a venture capital firm to meet a scientist who thought he had licked a central problem of fuel cells. He had made a membrane that takes the hydrogen atom, nature’s simplest, with one electron and one proton, and separates those components into two flows. The flow of those charged particles is electric current.
The inventor was based at a government nuclear laboratory and had a new way to make the membrane: bombard it with neutrons from a nuclear reactor. This created small spaces, he said, that let the membrane do its filtration work. He had a sample of the membrane, which looked to the naked eye like kitchen plastic wrap, but a little thicker.
In a plush conference room, paid for with the profits from the VC company’s prior early-stage investments in a consumer electronics technology, he set up a test apparatus: a voltmeter that looked like it came from Radio Shack, and a fuel cell sandwiched between slices of Lucite. At the top was a funnel into which he poured a bit of methanol, which can deliver hydrogen. This made the voltmeter needle jump.
It was a watershed moment, although not as intended. “See? See? It’s working,” said the venture capitalist, animatedly.
“Yes,” said the scientist. “Now all we have to do is raise the density by two orders of magnitude.”
The VC paused. “What does ‘order of magnitude’ mean?”
It meant that a commercial version would have to produce 100 times more power from the same area, but it also meant that the VC didn’t understand the details of what he had invested in. This isn’t necessarily bad. The business/technology ecosystem has space for people with big bank accounts created by previous lucky guesses. (When a venture capitalist does tech, there is always a game-show quality to the transaction: I’ll take what’s behind door number 3, for an initial funding round of $500k.) But the fact that, in this case, the investor had little technical understanding of what he had funded helps explain some of technology’s wrong turns.
Among other wrong turns, or at least turns that didn’t match the hype, was the Segway, the two-wheeled, computer-balanced personal transportation device that was hyped almost as if it were a magic carpet. Today it is a niche tool for mall cops and urban sightseeing tours.
Or there is cold fusion, a flash in the pan because it seemed too good to be true. (It was.) For a while, it was all the rage, the energy version of the Macarena.
Some silver-bullet technologies are long delayed but might yet deliver: self-driving cars, for example, or fusion. And there are other technologies whose failures were complete but not notorious, only quiet, like magnetohydrodynamics. (It’s obscure unless you were paying attention during the opening scene of “The Hunt for Red October.”)
Solar isn’t the only technology where hype and hope exceed the results. But it has done spectacularly well in capturing the public imagination. Who could dislike turning sunlight into electric current?
A Shining Example
Misleading marketing helped. Rooftop solar is advertised everywhere as providing energy independence. An ingredient of its success is giving people with modest influence in the world the feeling that they are participating. Never mind that the Eastern Interconnection, the grid that stretches from New Orleans to Halifax, is the biggest machine in the world, with nearly 4,000 generators, and that the Western Interconnection has thousands more; put a few kilowatts of solar panels on your roof and you are a player! Enthusiasts talk about declaring their independence, but their dependence has just gotten deeper; they are no longer merely customers, they are captive producers. There is no place else to sell the panels’ output.
The math doesn’t work for independence. For example, one blog advocating rooftop solar says that a household that uses 2,000 kilowatt-hours a month will need a 14.34-kilowatt system. The system could, in fact, produce 2,000 kilowatt-hours a month, but it would produce all of it within a roughly ten-hour window every day (the sun is up an average of 12 hours, but too close to the horizon to generate substantial electricity for the first and last hours), and the production is out of sync with the household’s demand. With net metering, the household could reduce its bill to zero, but what it really needs is to be connected to a diversified grid, most likely with natural gas burners that run a bit less during daylight and more at night or when it’s cloudy. Batteries could help with this problem, but if the household needs 2,000 kilowatt-hours a month, that’s about 64 kilowatt-hours a day. To cope with three days without sun (a near-certainty in most of the United States) would require roughly 200 kilowatt-hours of storage, at many hundreds of dollars per kilowatt-hour. That means the combination of solar panels and battery storage could approach the cost of the house.
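For anyone who wants to check that arithmetic, here is a minimal back-of-envelope sketch in Python. The monthly consumption and the three-day outage come from the example above; the per-kilowatt-hour battery price is an illustrative assumption, not a quote from any vendor.

# Back-of-envelope check of the solar-plus-storage arithmetic above.
# Consumption and outage figures come from the text; the battery price
# is an assumed, illustrative installed cost.
MONTHLY_USE_KWH = 2_000        # household consumption from the blog example
DAYS_PER_MONTH = 31
SUNLESS_DAYS = 3               # cloudy stretch the battery must ride through
BATTERY_COST_PER_KWH = 700     # assumed installed cost, dollars per kWh

daily_use_kwh = MONTHLY_USE_KWH / DAYS_PER_MONTH     # about 64 kWh per day
storage_kwh = daily_use_kwh * SUNLESS_DAYS           # roughly 200 kWh, as in the text
battery_cost = storage_kwh * BATTERY_COST_PER_KWH    # well over $100,000

print(f"Daily use:      {daily_use_kwh:.1f} kWh")
print(f"Storage needed: {storage_kwh:.0f} kWh")
print(f"Battery cost:   ${battery_cost:,.0f}")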
A homeowner could back up the system with a fossil-powered generator instead, but that is expensive and dirty.
The illusion of independence is a harmless deception compared to the macro effects on the power system. As solar penetration of a grid increases, the panels drag behind them an enormous chain of support equipment: expensive batteries that are environmentally harmful to produce, fossil plants that are maintained at high cost to operate for a few hundred crucial hours a year, and improvements to the distribution system to handle two-way flow. In California’s case, solar has required gigawatts of batteries whose purpose is not always clearly understood. They do store some energy, but they are essential for a different reason: they keep the lights on when the sun goes down and solar output falls away faster than the fossil plants can start up to carry the load, on the steep upside of the infamous Duck Curve.
When solar production is strongest, prices can dip below zero, meaning that at the sunniest times the output is worthless, or worse than worthless. The California grid operator sets prices for electricity every five minutes, and it reported that in April of this year, prices were below zero in 20.8 percent of those intervals. More than 800,000 megawatt-hours of electricity were “curtailed,” meaning unplugged. That’s enough electricity to run about 1 million households for a month. (Or, to be precise, it would have been enough to run those households, except that they would have had to use almost all of that energy at midday.)
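The household-equivalent figure is easy to reproduce. The short sketch below assumes an average household consumption of roughly 800 kilowatt-hours a month; that per-household figure is my illustrative assumption, not a number from the grid operator.

# Rough conversion of curtailed energy into household-equivalents.
# The 800,000 MWh figure comes from the text; the per-household monthly
# consumption is an assumed typical value, not grid-operator data.
CURTAILED_MWH = 800_000
HOUSEHOLD_KWH_PER_MONTH = 800      # assumed average monthly use

curtailed_kwh = CURTAILED_MWH * 1_000
household_months = curtailed_kwh / HOUSEHOLD_KWH_PER_MONTH

print(f"Households supplied for a month: {household_months:,.0f}")   # about 1,000,000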
To be fair, curtailments are heavier than average in April, because the sun is strong and air-conditioning demand is low. But curtailments will get worse in the future, as renewable installations grow.
California and some other jurisdictions have belatedly un-sweetened the mix of incentives they offer for rooftop solar, generally by moving away from “net metering,” the pricing system that dictates that if a customer uses 900 kilowatt-hours a month and produces 800, then the customer is billed for just 100. It is true that the customer who generates 800 kWh is saving the utility the cost of generating that much energy (or buying it from a generator) but generation is only part of the utility’s cost; there is also the cost of distribution and transmission and various other services. Those costs will then fall to other customers, the ones who don’t have solar panels, perhaps because they do not own a roof to put them on.
New York and other places have instituted some fixed charges. The solar industry has howled. But engineering reality eventually determines the value of a technology, no matter how popular it is, and a consensus emerges that the degree of deployment must rest on the technology’s commercial utility. When solar reaches the point that its production exceeds what can be used, it’s time to re-think. Solar isn’t bad, in appropriate doses. It’s just over-hyped and mis-used.
Flavor of the Month
Other technologies advance without public clamor, simply because they work, often by leveraging other technologies. Fracking for natural gas in shale, for example, which created an American energy revolution, relies on a technique used in oil fields for years to fracture the rock bearing the resource, combined with new developments in directional drilling and 3-D analyses of geology done by supercomputers. Fracking emerged over years, with little public notice, but it revolutionized the economics of the power grid and took a big chunk out of emissions of soot, smog precursors, mercury, and carbon dioxide. Golden rice and other genetic modifications of food crops are probably in a similar category.
But failure IS an option, and not necessarily a bad one. Tech entrepreneurs should swing for the fences and expect most of their efforts to fail. The problem is the technology flavor-of-the-month syndrome, where we over-focus on a new, shiny object and raise unreasonable expectations. Artificial intelligence may be in this category. It was plodding along quietly and sometimes productively until November 2022, when ChatGPT was released and became the greatest thing since sliced bread. (Not that sliced bread changed the world so much, either.) Today AI is prominent as a way for students in high school and college to cut corners on term papers.
Its deeper uses are less obvious; for example, it is being used to improve fuel utilization in boiling water reactors. Engineers at the University of Tennessee and Oak Ridge National Laboratory are working to combine artificial intelligence with 3-D printing. They have come up with a design for a reactor core with components fabricated in shapes that older manufacturing techniques could never achieve; cooling channels of varying width, for example, can keep temperature and power production more even across the core.
Other technology ideas get prominence not because they are ripe for exploitation, but simply because their success would solve some of the world’s very difficult problems. But the fact that they are needed doesn’t make them easier to deploy. Cellulosic ethanol is an example.
Some tech turkeys get currency because they would benefit interests with political clout—corn farmers, for example. Corn ethanol was initially promoted to reduce dependence on oil imports. It persists even though another invention, fracking, has turned the United States into a net oil exporter.
Some move forward on sheer momentum, often because of a military background. Nuclear energy is an example. Nuclear technologies are assembled from the same basic ingredients but reach very different endpoints; they can differ from one another as much as spaghetti sauce differs from gazpacho. We are witnessing that now in nuclear energy, where there are half a dozen ways to split atoms: the atoms vary, and the neutrons used to split them vary in energy. How did we end up where we are now, and how might we break away from the current paradigm of low-enriched uranium and neutrons whose energy is moderated by water? Again, the answer is a guide to what may succeed in the future.
Patience is a Virtue
The great secret of the current nuclear fleet is that its water-and-uranium mixture wasn’t chosen because it was best; it was chosen because it had momentum. Specifically, it had the imprimatur of the United States Navy, because Admiral Hyman Rickover had chosen it for submarines. But even Rickover wasn’t certain it was best. He chose it because it was one of two types for which prototypes were built, and the other, a “fast” reactor cooled by liquid metal, developed leaks. The fast reactor, in which the neutrons that sustain the chain reaction are not slowed down and thus split the next atom with great force, would have had better fuel utilization, but in the Cold War competition with the Soviet Union, Rickover was eager to choose whatever could be deployed fastest.
The Atomic Energy Commission and the commercial industry followed, scaling up the submarine reactor and putting it on land to make electricity. The first reactor to generate electricity, EBR-1, was cooled by a sodium-potassium alloy, but that was a 1950s government effort; the private sector went with the better-established water technology. Now we are trying to go back to a similar technology.
There was other nuclear innovation. Public Service of Colorado built Fort St. Vrain, a gas-cooled, graphite-moderated reactor that used fuel far more efficiently. But it was retired early because it was buggy, and the commercial pressure was to return to better-tested technologies. That was in the 1970s and ’80s; now we are trying to revive graphite-moderated, gas-cooled reactors. The lesson of those two nuclear technologies, which have seemed like dead ends for decades and are now in line to be tried again, is that successful innovation may also require patience and fortitude, to a degree that is rare in the commercial sector.
Gas-graphite reactors and fast reactors are two great hopes for advanced reactors, echoes of technologies pioneered decades ago and later bypassed.
The underlying point is that, in the end, technological progress depends on public popularity only to the extent that public opinion steers some government R&D money. Popularity does not change physics or chemistry. And determining in advance which technologies will have a big impact is almost impossible. The best we can do is to have patience with what looks promising, and to beware of irrational exuberance, a problem not limited to the stock market.