
5.24.2017

Free energy sources in the very long run

Judson - 2017 - The energy expansions of evolution
The history of the life–Earth system can be divided into five ‘energetic’ epochs, each featuring the evolution of life forms that can exploit a new source of energy. These sources are: geochemical energy, sunlight, oxygen, flesh and fire. The first two were present at the start, but oxygen, flesh and fire are all consequences of evolutionary events. Since no category of energy source has disappeared, this has, over time, resulted in an expanding realm of the sources of energy available to living organisms and a concomitant increase in the diversity and complexity of ecosystems. These energy expansions have also mediated the transformation of key aspects of the planetary environment, which have in turn mediated the future course of evolutionary change. Using energy as a lens thus illuminates patterns in the entwined histories of life and Earth, and may also provide a framework for considering the potential trajectories of life–planet systems elsewhere. 
Free energy is a universal requirement for life. It drives mechanical motion and chemical reactions—which in biology can change a cell or an organism. Over the course of Earth history, the harnessing of free energy by organisms has had a dramatic impact on the planetary environment. Yet the variety of free-energy sources available to living organisms has expanded over time. These expansions are consequences of events in the evolution of life, and they have mediated the transformation of the planet from an anoxic world that could support only microbial life, to one that boasts the rich geology and diversity of life present today. Here, I review these energy expansions, discuss how they map onto the biological and geological development of Earth, and consider what this could mean for the trajectories of life–planet systems elsewhere.
Worth reading in its entirety for the log-timescale perspective on energy budgets alone, but also as a fantastic piece of science writing and communication. "Of all the planets and moons in the Solar System, Earth is the only one to have fire..."

2.17.2014

Ambient carbon dioxide affects human decision-making

Amir Jina and I recently visited William Fisk at LBL who pointed us to his fascinating study:


Is CO2 an Indoor Pollutant? Direct Effects of Low-to-Moderate CO2 Concentrations on Human Decision-Making Performance
Usha Satish, Mark J. Mendell, Krishnamurthy Shekhar, Toshifumi Hotchi, Douglas Sullivan, Siegfried Streufert, and William J. Fisk

Abstract:
Background: Associations of higher indoor carbon dioxide (CO2) concentrations with impaired work performance, increased health symptoms, and poorer perceived air quality have been attributed to correlation of indoor CO2 with concentrations of other indoor air pollutants that are also influenced by rates of outdoor-air ventilation. 
Objectives: We assessed direct effects of increased CO2, within the range of indoor concentrations, on decision making. 
Methods: Twenty-two participants were exposed to CO2 at 600, 1,000, and 2,500 ppm in an office-like chamber, in six groups. Each group was exposed to these conditions in three 2.5-hr sessions, all on 1 day, with exposure order balanced across groups. At 600 ppm, CO2 came from outdoor air and participants’ respiration. Higher concentrations were achieved by injecting ultrapure CO2. Ventilation rate and temperature were constant. Under each condition, participants completed a computer-based test of decision-making performance as well as questionnaires on health symptoms and perceived air quality. Participants and the person administering the decision-making test were blinded to CO2 level. Data were analyzed with analysis of variance models. 
Results: Relative to 600 ppm, at 1,000 ppm CO2, moderate and statistically significant decrements occurred in six of nine scales of decision-making performance. At 2,500 ppm, large and statistically significant reductions occurred in seven scales of decision-making performance (raw score ratios, 0.06–0.56), but performance on the focused activity scale increased. 
Conclusions: Direct adverse effects of CO2 on human performance may be economically important and may limit energy-saving reductions in outdoor air ventilation per person in buildings. Confirmation of these findings is needed.

From the LBL press page
On nine scales of decision-making performance, test subjects showed significant reductions on six of the scales at CO2 levels of 1,000 parts per million (ppm) and large reductions on seven of the scales at 2,500 ppm. The most dramatic declines in performance, in which subjects were rated as “dysfunctional,” were for taking initiative and thinking strategically. “Previous studies have looked at 10,000 ppm, 20,000 ppm; that’s the level at which scientists thought effects started,” said Berkeley Lab scientist Mark Mendell, also a co-author of the study. “That’s why these findings are so startling.” 
The primary source of indoor CO2 is humans. While typical outdoor concentrations are around 380 ppm, indoor concentrations can go up to several thousand ppm. Higher indoor CO2 concentrations relative to outdoors are due to low rates of ventilation, which are often driven by the need to reduce energy consumption. In the real world, CO2 concentrations in office buildings normally don’t exceed 1,000 ppm, except in meeting rooms, when groups of people gather for extended periods of time. 
In classrooms, concentrations frequently exceed 1,000 ppm and occasionally exceed 3,000 ppm. CO2 at these levels has been assumed to indicate poor ventilation, with increased exposure to other indoor pollutants of potential concern, but the CO2 itself at these levels has not been a source of concern. Federal guidelines set a maximum occupational exposure limit at 5,000 ppm as a time-weighted average for an eight-hour workday.

9.24.2012

"The world's lightest electric vehicle"


From the project's Kickstarter:
"We have backgrounds in mechanical, electrical, and aerospace engineering from Stanford. We also love longboarding, snowboarding, kiteboarding, and wakeboarding. So it's no surprise that we joined our revolutionary prototype drivetrain with our favorite longboard components. But the bigger picture is even more exciting: changing the world of transportation and shifting the perception of what a vehicle can be. Here's an example. Charge the board every day (6 miles of use). Total electricity cost? Less than $5 per year."
For reference, Wikipedia's description of the last mile problem is short but does the trick. I'm shocked at the amount of power they say the motor generates (2000 watts, or about as powerful as a moped engine).
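The "$5 per year" claim is easy to sanity-check. The battery draw per charge and the electricity price below are my assumptions, not figures from the Kickstarter page:

```python
# Back-of-envelope check of the "less than $5 per year" claim.
# Assumed (not from the Kickstarter): one full charge per day drawing
# roughly 100 Wh from the wall, at a retail price of $0.12 per kWh.
charge_wh = 100          # energy per daily charge, Wh (assumption)
price_per_kwh = 0.12     # $/kWh (assumption)

annual_cost = charge_wh / 1000 * price_per_kwh * 365
print(f"${annual_cost:.2f} per year")  # prints "$4.38 per year"
```

With those inputs the claim checks out; even doubling the assumed charge energy keeps the annual cost under $10.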

6.06.2012

Butting Heads: A Year’s Review of Research in the Context of the Global Energy Paradox

[This is the third in a four-part series of guest posts by first year students in Columbia's Sustainable Development program]

Butting Heads: A Year’s Review of Research in the Context of the Global Energy Paradox
By: Aaron Baum, Pablo Egaña and Erin McNally.

Today, the energy sector accounts for around two-thirds of global greenhouse-gas emissions; yet roughly one-quarter of the world's population has basic energy needs that are not being met, even as demand rises.  There is a paradox in the pursuit of the dual goal of curbing emissions while meeting demand through technology, one discerned by William Stanley Jevons more than a hundred years ago: technological progress that achieves higher efficiency and reduced environmental harm tends to lower the price of the resource, with a consequent increase in consumption.  Jevons's paradox obliges us to scrutinize the view that the dual problem of rising emissions and rising demand can simply be resolved through the efficiency gains that technological progress yields.

In what direction is the research into viability – in terms of costs and feasibility – of carbon capture as well as generation and storage of sustainable energy headed?  In the following, we briefly explore answers to this question from selected investigations in leading scientific journals (Nature, Science and PNAS), focusing on the role that technology can play in sustainable development and the role of natural disasters in investment.

2.03.2012

Why don't we build more nuclear power plants?

Prospects for Nuclear Power
Lucas W. Davis

Abstract: The prospects for a revival of nuclear power were dim even before the partial reactor meltdowns at the Fukushima nuclear plant. Nuclear power has long been controversial because of concerns about nuclear accidents, proliferation risk, and the storage of spent fuel. These concerns are real and important. In addition, however, a key challenge for nuclear power has been the high cost of construction for nuclear plants. Construction costs are high enough that it becomes difficult to make an economic argument for nuclear, even before incorporating these external costs. This is particularly true in countries like the United States where recent technological advances have dramatically increased the availability of natural gas.


12.12.2011

Fukushima's long-term implications

Two articles came out in PNAS Environmental Sciences this week estimating the fallout from the Fukushima nuclear disaster (Yasunari et al. and Kinoshita et al.). Of particular concern is the following from Yasunari et al.:
As a general characteristic, most of the eastern parts of Japan were affected by a total 137Cs deposition of more than 1,000 MBq km⁻². Our estimates show that the area around the NPP in Fukushima, secondarily affected areas (Miyagi and Ibaraki prefectures), and other affected areas (Iwate, Yamagata, Tochigi, and Chiba prefectures) had 137Cs depositions of more than 100,000, 25,000, and 10,000 MBq km⁻², respectively. Airborne and ground-based survey measurements jointly carried out by MEXT and the US Department of Energy (DOE) (21) show high 137Cs deposition amounts were observed northwestward and up to a distance of 80 km from Fukushima NPP. It was estimated from the first measurement that by April 29, more than 600,000 MBq km⁻² had been deposited in the area, which is greater than our estimate of less than 500,000 MBq km⁻² (Fig. 2A), yet well within the range of uncertainty of our method (Fig. S4).
1,000 MBq (megabecquerels) per square kilometer is 1 kilobecquerel per square meter, so the three broad exposure estimates correspond to 100, 25, and 10 kBq m⁻², with maximum depositions around 500-600 kBq m⁻². To gauge how worried we should be about this, we turn to Almond, Edlund, and Palme, 2009:
We use prenatal exposure to Chernobyl fallout in Sweden as a natural experiment inducing variation in cognitive ability. Students born in regions of Sweden with higher fallout performed worse in secondary school, in mathematics in particular. Damage is accentuated within families (i.e., siblings comparison) and among children born to parents with low education. In contrast, we detect no corresponding damage to health outcomes. To the extent that parents responded to the cognitive endowment, we infer that parental investments reinforced the initial Chernobyl damage. From a public health perspective, our findings suggest that cognitive ability is compromised at radiation doses currently considered harmless.
The heaviest fallout in Sweden (also due to cesium-137 contamination) was around 65 kBq m⁻² (see Figure 2 of the paper). Moreover, Japan's population density is roughly an order of magnitude larger than Sweden's. Given this, it looks like the long-term human costs of this disaster may be absolutely staggering.
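The unit conversion behind these comparisons is worth making explicit. A minimal sketch, using the deposition tiers from the Yasunari et al. quote above and the Swedish figure from Almond et al.:

```python
# Convert deposition from MBq per km^2 to kBq per m^2.
# 1 km^2 = 1e6 m^2 and 1 MBq = 1e3 kBq, so the factor is 1/1000.
def mbq_per_km2_to_kbq_per_m2(x):
    return x * 1e3 / 1e6

# The three broad deposition tiers reported by Yasunari et al., MBq/km^2:
for deposition in (100_000, 25_000, 10_000):
    kbq = mbq_per_km2_to_kbq_per_m2(deposition)
    print(f"{deposition:>7} MBq/km^2 = {kbq:>5.0f} kBq/m^2")
# prints 100, 25, and 10 kBq/m^2 respectively

# Sweden's heaviest Chernobyl fallout was about 65 kBq/m^2 -- below even
# the 100 kBq/m^2 tier estimated for the area around the Fukushima NPP.
```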

10.05.2011

Energy and temperature are substitutes in the production of health


This week in AEJ Applied:

Climate Change, Mortality, and Adaptation: Evidence from Annual Fluctuations in Weather in the US 
Olivier Deschênes and Michael Greenstone

Abstract: Using random year-to-year variation in temperature, we document the relationship between daily temperatures and annual mortality rates and daily temperatures and annual residential energy consumption. Both relationships exhibit nonlinearities, with significant increases at the extremes of the temperature distribution. The application of these results to "business as usual" climate predictions indicates that by the end of the century climate change will lead to increases of 3 percent in the age-adjusted mortality rate and 11 percent in annual residential energy consumption. These estimates likely overstate the long-run costs, because climate change will unfold gradually allowing individuals to engage in a wider set of adaptations.





More on temperature's substitutes/complements here and here.

(h/t Michael O)

7.17.2011

Superman: A transitional energy source


Happy Summer Sunday afternoon from Fight Entropy, thanks to our friends at Saturday Morning Breakfast Cereal. Click through for the whole cartoon.

3.14.2011

Does Daylight Saving Time Save Energy?

While reflecting on my loss of sleep this past weekend, I found this working paper from a few years ago:

Does Daylight Saving Time Save Energy? Evidence from a Natural Experiment in Indiana

Matthew J. Kotchen, Laura E. Grant
NBER Working Paper No. 14429

Abstract: The history of Daylight Saving Time (DST) has been long and controversial. Throughout its implementation during World Wars I and II, the oil embargo of the 1970s, consistent practice today, and recent extensions, the primary rationale for DST has always been to promote energy conservation. Nevertheless, there is surprisingly little evidence that DST actually saves energy. This paper takes advantage of a natural experiment in the state of Indiana to provide the first empirical estimates of DST effects on electricity consumption in the United States since the mid-1970s. Focusing on residential electricity demand, we conduct the first-ever study that uses micro-data on households to estimate an overall DST effect. The dataset consists of more than 7 million observations on monthly billing data for the vast majority of households in southern Indiana for three years. Our main finding is that -- contrary to the policy's intent -- DST increases residential electricity demand. Estimates of the overall increase are approximately 1 percent, but we find that the effect is not constant throughout the DST period. DST causes the greatest increase in electricity consumption in the fall, when estimates range between 2 and 4 percent. These findings are consistent with simulation results that point to a tradeoff between reducing demand for lighting and increasing demand for heating and cooling. We estimate a cost of increased electricity bills to Indiana households of $9 million per year. We also estimate social costs of increased pollution emissions that range from $1.7 to $5.5 million per year. Finally, we argue that the effect is likely to be even stronger in other regions of the United States.

8.28.2010

A new mechanism to consider when measuring climate impacts on economies

[A shorter (and more heavily copy-edited) version of this post was published in EARTH Magazine, read it here.]

My paper Temperatures and tropical cyclones strongly associated with economic production in the Caribbean and Central America was recently published in the Proceedings of the National Academy of Sciences. Because the paper is a little technical, here is a presentation of the results that everyone should be able to understand.

[Figure caption: following countries over time, years with higher-than-normal temperatures during the hottest season (Sep-Oct-Nov) exhibit large reductions in output across several non-agricultural industries.]

Central finding:
Economic output across a range of industries previously thought of as "not vulnerable to climate change" responds strongly to changes in temperature.  The data suggest that the response is driven by the direct human response to high temperatures: people generally are less productive and tire faster when it's hot.  This impact, which appears to be quite large, has not been factored into any previous estimate of the global cost of climate change.

Background
Governments and organizations around the world are trying to figure out how much money we should spend to avoid climate change.  The answer isn't obvious.  On the one hand, climate change seems ominous and we'd like to spend lots of money to avoid it. But on the other hand, if we spend money on avoiding climate change, we can't spend it on other important things. For example, imagine that the United Nations has a million dollars it can spend. Should it spend it on building solar panels or building schools?  Both are clearly important. But if we want to get the most "bang for our buck," we need to figure out what the benefits of these two types of investments are.

A whole research industry has sprung up around the cost-benefit analysis of preventing climate change.  How much money should be spent to prevent climate change by investing in more expensive low-carbon technologies? Who should pay for it and when should they pay for it?  A tremendous amount of intellectual machinery has been applied to this problem by many extremely smart people.  The basic approach is to build models of the world economy-climate system and try to see what happens to the climate and the economy under different global policies.  These models are used by governments around the world to determine what they think the best climate policies are and how much they should spend on the problem.

However, there is something of a dark secret to this approach: we don't really know what will happen to us if the climate changes.  We have a fairly good grasp of how much it might cost to implement different energy policies. And we've learned a lot about how different energy policies will translate into global climate changes.  But when it comes to figuring out how those climate changes translate into costs to society (both financial and non-monetary), we end up having to do a lot of guesswork.

It's unfair to say we know nothing about the costs of climate change, but what we understand well is limited to certain types of impacts.  For example, we have been doing extensive research on the possible agricultural impacts for years. We've also done studies for a lot of the health impacts.  But most research stops there.  We are only beginning to learn about the effect of climate on people's recreation and perceived happiness.  We're also only beginning to learn about the effect of climate on violence and crime.  We know a lot (but not nearly everything) about the effect of climate on ecosystems, but we don't really understand how ecosystems affect us, so we still can't estimate this impact on society. The list goes on.

We know a lot about climate impacts on health and agriculture because people have studied those impacts a lot.  Why did we study those kinds of impacts so much? I'm not sure. Maybe because the importance of climate for health and agriculture is obvious (e.g., my plants on the windowsill died after just two days of this summer's heat wave).

The fact that we only really understand the agricultural and health impacts of climate change matters greatly for the cost-benefit analyses I mentioned earlier.  When governments are trying to figure out the best policies, they add up the known costs of preventing climate change and the known benefits of preventing it.   If the costs outweigh the benefits, that suggests we shouldn't spend much money to stop climate change.  But there is a natural asymmetry in this comparison: we know all (or most of) the costs, but only the health and agricultural benefits.  So when we add up the costs of energy policies, the numbers tend to look very big.  When we add up the known benefits of those policies, we add up the health benefits and the agricultural benefits, but we have to stop there, because we don't know what else will be affected by climate change.  Maybe it shouldn't be surprising that many cost-benefit analyses find that climate change is not worth spending a lot of money on.

But what we know about climate impacts in non-health and non-agricultural sectors is slowly improving.  In a 2009 working paper, Dell, Jones and Olken did something very simple and got very surprising results.  They compared the economic output of countries over time with year-to-year changes in the weather of those countries.  They found that in poor countries, small increases in a country's annual average temperature lead to large drops in its economic output.  The approach sounds simple, right? It is.  But the results are startling because they found such a large effect of temperature. They estimate that a 1C increase in average temperatures decreases a poor country's gross domestic product (GDP) by 1.1% in the same year. To get a sense of how big this effect is, recall that the economy of the United States shrank by 2.4% in 2009 and people are upset about the state of the economy.
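To get a feel for the magnitude of the Dell, Jones and Olken estimate, here is a back-of-envelope extrapolation. The linear scaling across temperature anomalies is my simplification for illustration, not the paper's specification:

```python
# Dell, Jones and Olken's point estimate: a year that is 1 C warmer than
# normal costs a poor country about 1.1% of GDP in that year.
effect_per_deg_c = 1.1   # percent of GDP lost per +1 C (their estimate)

for anomaly in (0.5, 1.0, 2.0):
    loss = effect_per_deg_c * anomaly   # linear extrapolation (my assumption)
    print(f"+{anomaly} C year -> about {loss:.2f}% of GDP lost")

# For comparison, US GDP shrank about 2.4% in 2009: under this linear
# reading, a +2 C year in a poor country is a recession of similar size.
```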

Because the effect found by Dell et al. is so large, many people have been skeptical that it represents something real (note from my own unpublished work: I can corroborate their results using different data sets from the ones they used).  To check these results further, in 2010 Jones and Olken looked for a similar effect in the exports of these countries and found that exports also respond strongly to temperature changes.  Do people believe the general result yet? I'm not sure.  But part of the skepticism seems to persist because it's hard to know why poor countries should be so strongly affected by temperature.  One reason for this is that it's very hard to know what mechanisms are at work when one is only looking at macroeconomic data.  Further, thinking of ways in which temperature could affect economies this strongly and systematically across countries seems to hit the limits of many people's imaginations. This is where my study comes in.

6.10.2010

How wise is a cellulosic ethanol mandate?

The Tech Review describes recent "slow progress" on the development of commercial cellulosic ethanol.  Apparently, some companies are moving ahead with strategies to scale up:
ZeaChem, based in Lakewood, CO, has begun construction of a 250,000-gallon-per-year demonstration plant in Boardman, OR, that will produce chemicals from sugar and eventually ethanol from wood and other cellulosic materials...
The company's strategy for making the business a financial success and attracting investment for commercial scale plants is to start by producing ethyl acetate, which "takes about half the equipment and sells for twice the price of ethanol, so it's an ideal starter product," he says. Other biofuels companies are taking a similar approach--looking for high value products to offset high costs, at least initially. ZeaChem plans to incorporate the technology into an existing corn ethanol plant for commercial production of ethyl acetate. "If all goes well, that plant could be in operation by the end of next year," he says. A stand-alone commercial cellulosic ethanol plant would follow. It could switch between selling acetic acid, ethyl acetate, or ethanol, depending on the market.
If you're at all confused about why this is happening when we don't yet know how to make cellulosic ethanol without a net energy expenditure, the answer is government support:

A renewable fuel standard signed into law in late 2007 requires the use of 100 million gallons of cellulosic ethanol in the United States this year and will ramp up to 16 billion gallons by 2022. But so far no commercial plants are operating, according to the Biotechnology Industry Organization (BIO), a leading trade group representing biofuel companies. The U.S. Environmental Protection Agency announced in February that it was scaling back the mandates to just 6.5 million gallons, which could be supplied by existing small-scale demonstration plants and new plants expected to open this year. That's up from approximately 3.5 million gallons produced in 2009.
Is it wise for our government to be driving this kind of investment?   One concern is that biofuels, in general, lead to the production of crops for energy which necessarily will increase the prices of food crops that are displaced.  In a recent working paper, Wolfram Schlenker and Michael Roberts try to identify (using weather shocks) the effect of the biofuel mandate on world food prices.  They predict that the biofuel mandate, as it stood at the time of writing, would lead to an increase of world food prices by 20-30%.  Further, they argue that since agricultural production will expand to meet this demand, and expansion of cultivated land releases CO2 in net, the policy may not even reduce GHG emissions.

This second point reminds me of a blog post I wrote two years ago, where I argued that innovations in the technology for the conversion of cellulose into fuel may have dramatic externalities.  If biomass that is usually considered "useless" suddenly has a shadow price, the strategic incentives to harvest entire ecosystems may be dangerously strong.  

The government should almost certainly not be subsidizing the development of this technology; one can even argue (depending on how risk-averse you are) that it should be taxing it for the risk we all bear should it succeed.

6.06.2010

1980

This is surreal to me. This is an excerpt from an article in the New York Times on April 12, 1980.

I repeat. 1980. Thirty years ago. The internet was still science fiction then.

It is about the oil spill caused by the Ixtoc I oil well in the Gulf of Mexico, which spewed 140 million gallons over nearly ten months in 1979-1980:
History's largest oil spill has been a fiasco from beginning to end. Human errors and ineffective safety equipment caused the blowout, and none of the "advanced" techniques for plugging the well or recapturing the oil worked satisfactorily thereafter. The gusher ran wild for nearly ten months....
The enduring question is whether a devastating blowout could occur in our own offshore waters....
A second question: Could a blowout in American waters be quickly capped and cleaned up? Ixtoc I shows that control technology is still quite primitive.  Attempts were made to jam the pipe back into the hole; a large cone was lowered over the well to capture oil and gas, and steel and lead balls were dropped down the well to plug it. Nothing worked. Relief wells to pump in mud failed for months to reach their target... The mop-up techniques did not function effectively either....
Most Americans would accept risking such blowouts to find oilfields as rich as Mexico's.  But the lessons of Ixtoc I can help reduce the risks.
If you don't know why this is darkly funny, look over the list of strategies BP has employed to stop the current spill.  There is obviously something wrong with the incentives to innovate emergency/cleanup technology.  In 1980, the techniques we are still using today were being joked about as "advanced."

The sheer volume of innovation that has occurred in the last 30 years across an uncountable number of research fields is astonishing and a tremendous feat of human ingenuity.  The fact that effectively zero innovation has occurred in oil-drilling-catastrophe-management suggests that nobody believed there was a sufficient payout to warrant such investments.  Since these catastrophes are massive public-bads, and the cost of the externality is almost certainly not internalized by the oil-companies, then standard economic theory would suggest the government needs to create incentives to invest in these technologies.  The problem is doubly difficult because we often think that research in profitable industries is under-supplied, since researchers cannot capture the full value of their work.  So policies to reduce our risk must counteract both a public-bad problem and an innovation problem.

The obvious tendency is to heap blame on BP.  But the current situation is a result of 30 years (at least) of improper policy.  Whether the US government had sufficient information in 1980 to realize that its policies motivating innovation [in these technologies] were too weak, I cannot say.  But this article seems to suggest that perhaps it did.