6.13.2010

Free online JPAL lectures about conducting randomized field trials

Real experiments have set a new standard for establishing causality in development economics.  In an evaluation process that looks very similar to clinical trials in medicine, outcomes for individuals in a randomly assigned "treatment group" receiving a social policy are compared against outcomes for individuals in a "control group." The argument for this kind of work is that (1) the effect of social policies must be measured carefully if we want to understand whether they are worth implementing at a national level and (2) historical analysis of policies is often insufficient because there is no "control group" against which outcomes can be compared.  For a good example of this kind of research, read this Tech Review article on Ben Olken's work on democracy in Indonesia.

Not all modern work in development is of this type, since not all interventions or questions are suitable for the method (e.g. my own work studying the impact of hurricanes on development cannot be), but there is a trend toward relying on it more and more.


The Abdul Latif Jameel Poverty Action Lab (JPAL) {co-founded by Esther Duflo, the subject of an earlier post by Jesse} is a group of researchers who conduct randomized field trials of various policy interventions for poor communities. While many people do this kind of work, the folks at JPAL have honed the method and mastered the logistics (which are tremendously complex).  A course taught by several people at JPAL, aimed at executives and covering how (and why) to conduct a randomized trial, has been recorded and is free online.  To access it:
  1. open up iTunes (which you can download for free here)
  2. go to the iTunes Store (there is a tab on the left for it)
  3. search "poverty action lab executive training"
All the lectures should show up and are free to download.

6.10.2010

How wise is a cellulosic ethanol mandate?

The Tech Review describes recent "slow progress" on the development of commercial cellulosic ethanol.  Apparently, some companies are moving ahead with strategies to scale up:
ZeaChem, based in Lakewood, CO, has begun construction of a 250,000-gallon-per-year demonstration plant in Boardman, OR, that will produce chemicals from sugar and eventually ethanol from wood and other cellulosic materials...
The company's strategy for making the business a financial success and attracting investment for commercial scale plants is to start by producing ethyl acetate, which "takes about half the equipment and sells for twice the price of ethanol, so it's an ideal starter product," he says. Other biofuels companies are taking a similar approach--looking for high value products to offset high costs, at least initially. ZeaChem plans to incorporate the technology into an existing corn ethanol plant for commercial production of ethyl acetate. "If all goes well, that plant could be in operation by the end of next year," he says. A stand-alone commercial cellulosic ethanol plant would follow. It could switch between selling acetic acid, ethyl acetate, or ethanol, depending on the market.
If you're at all confused about why this is happening, given that we don't know how to make cellulosic ethanol without a net energy expenditure, it's driven by government support:

A renewable fuel standard signed into law in late 2007 requires the use of 100 million gallons of cellulosic ethanol in the United States this year and will ramp up to 16 billion gallons by 2022. But so far no commercial plants are operating, according to the Biotechnology Industry Organization (BIO), a leading trade group representing biofuel companies. The U.S. Environmental Protection Agency announced in February that it was scaling back the mandates to just 6.5 million gallons, which could be supplied by existing small-scale demonstration plants and new plants expected to open this year. That's up from approximately 3.5 million gallons produced in 2009.
Is it wise for our government to be driving this kind of investment?  One concern is that biofuels in general divert cropland to energy production, which necessarily increases the prices of the food crops that are displaced.  In a recent working paper, Wolfram Schlenker and Michael Roberts try to identify (using weather shocks) the effect of the biofuel mandate on world food prices.  They predict that the biofuel mandate, as it stood at the time of writing, would raise world food prices by 20-30%.  Further, they argue that since agricultural production will expand to meet this demand, and expansion of cultivated land releases CO2 in net, the policy may not even reduce GHG emissions.

This second point reminds me of a blog post I wrote two years ago, where I argued that innovations in the technology for the conversion of cellulose into fuel may have dramatic externalities.  If biomass that is usually considered "useless" suddenly has a shadow price, the strategic incentives to harvest entire ecosystems may be dangerously strong.  

The government should almost certainly not be subsidizing the development of this technology; one can even argue (depending on how risk-averse you are) that it should be taxing it for the risk we all bear should it succeed.

6.06.2010

1980

This is surreal to me. This is an excerpt from an article in the New York Times on April 12, 1980.

I repeat. 1980. Thirty years ago. The internet was still science fiction then.

It is about the oil spill caused by the Ixtoc I oil well in the Gulf of Mexico, which spewed 140 million gallons over the course of nearly ten months in 1979-80:
History's largest oil spill has been a fiasco from beginning to end. Human errors and ineffective safety equipment caused the blowout, and none of the "advanced" techniques for plugging the well or recapturing the oil worked satisfactorily thereafter. The gusher ran wild for nearly ten months....
The enduring question is whether a devastating blowout could occur in our own offshore waters....
A second question: Could a blowout in American waters be quickly capped and cleaned up? Ixtoc I shows that control technology is still quite primitive.  Attempts were made to jam the pipe back into the hole; a large cone was lowered over the well to capture oil and gas, and steel and lead balls were dropped down the well to plug it. Nothing worked. Relief wells to pump in mud failed for months to reach their target... The mop-up techniques did not function effectively either....
Most Americans would accept risking such blowouts to find oilfields as rich as Mexico's.  But the lessons of Ixtoc I can help reduce the risks.
If you don't know why this is darkly funny, look over the list of strategies BP has employed to stop the current spill.  There is obviously something wrong with the incentives to innovate emergency/cleanup technology.  In 1980, the techniques we are still using today were being joked about as "advanced."

The sheer volume of innovation that has occurred in the last 30 years across an uncountable number of research fields is astonishing and a tremendous feat of human ingenuity.  The fact that effectively zero innovation has occurred in oil-drilling-catastrophe-management suggests that nobody believed there was a sufficient payout to warrant such investments.  Since these catastrophes are massive public-bads, and the cost of the externality is almost certainly not internalized by the oil-companies, then standard economic theory would suggest the government needs to create incentives to invest in these technologies.  The problem is doubly difficult because we often think that research in profitable industries is under-supplied, since researchers cannot capture the full value of their work.  So policies to reduce our risk must counteract both a public-bad problem and an innovation problem.

The obvious tendency is to heap blame on BP.  But the current situation is a result of 30 years (at least) of improper policy.  Whether the US government had sufficient information in 1980 to realize its policies motivating innovation [in these technologies] were too weak, I cannot say.  But this article seems to suggest that perhaps they did.

6.05.2010

(Oil spill & windmill) x (technology & management)

The BP spill is tragic, but it's gotten me learning about oil drilling technology and history. Some nice sites by the NYTimes, which I think are generally much more interesting than their coverage of the political drama:

A graphical explanation of how the well was supposed to be plugged.

A reference describing the different attempts at stopping the leak.

A timeline of historical oil spills, with actual newspaper clippings from the events and descriptions of the legislation that followed each.

Also quite depressing is this innovative way of communicating to people exactly how large of an area is affected by the slick.

After all of this, I think (or at least hope) that we're all more educated about the technology underlying our massive energy infrastructure.

Trying to be a pragmatist, I keep looking at these diagrams (especially the ones that show how far underwater, and underground, the leaking well is) and thinking to myself, "is this really easier than windmills, seriously?" It's understandable (at least from a micro-economist's perspective) that power companies with massive amounts of sunk capital would misrepresent the costs of a transition to cleaner technology, even if it would have large positive externalities. Interestingly, on this point, a recent NREL study suggests that the costs of stabilizing energy supply using wind and solar power have been exaggerated.

A major argument against more rapid investment in renewables provided by many energy experts has been that it is difficult to ensure a steady supply of energy when weather and daylight fluctuate. To me, this always seemed dubious, since the US is enormous and there are large anti-correlations in wind and cloud-cover across the country. It seemed that if panels and farms were distributed intelligently across space, we could take advantage of these known structures to provide a smoother power-supply to the country. All it would take is some planning, knowledge of meteorology and some cooperation among utilities. This is exactly the finding of the NREL study. They say that 35% of energy could be supplied by renewables without installing substantial backup capacity. All it would take is a little coordination.
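The smoothing argument is just portfolio-variance arithmetic, and a toy simulation makes it concrete. This is my own illustration, not the NREL model; the two "sites," their output levels, and the correlation structure are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: two wind sites whose output fluctuations are anti-correlated,
# e.g. because weather systems alternate between regions of a large country.
n = 10_000
weather = rng.normal(0, 1, n)                     # shared weather driver
site_a = 5 + 2 * weather + rng.normal(0, 0.5, n)  # output at site A
site_b = 5 - 2 * weather + rng.normal(0, 0.5, n)  # anti-correlated site B

pooled = 0.5 * (site_a + site_b)  # a grid drawing equally on both sites

print(f"std of site A alone:  {site_a.std():.2f}")
print(f"std of pooled supply: {pooled.std():.2f}")
```

If the sites were merely independent, pooling would cut the standard deviation by a factor of sqrt(2); anti-correlation does far better, which is why the spatial structure of wind and cloud cover matters so much.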

5.27.2010

Study Hacks on Esther Duflo

Cal Newport, the guy behind my favorite grad student philosophy / self-help / how-to-be-awesome website, Study Hacks, just posted a pretty interesting article on Esther Duflo, MIT econ professor, vanguard of the randomization movement in development economics, recent Clark medalist, undergrad professor of Sol's, and all-around awesome professor. Specifically, he uses her as an example of the dos and don'ts of finding one's "Life's Mission":


This is what complicates the mission to find a mission. On the one hand, to discover them (and recognize them), you need a non-conformist's confidence and a dedication to exploration. Duflo, for example, was a notorious searcher. Among other acts of defiance, she took time off in the middle of her studies to go work on practical economic problems in Moscow (where she met Jeffrey Sachs). When she took Banerjee's class she was actively seeking an outlet for her intellectual energies.
On the other hand, this sense of exploration has to be backed with competence in the relevant field. And developing this competence has a decidedly unexciting, conformist feel to it — a process replete with hard focus and resistance to distraction.
What I find interesting about this (aside from the various compelling tidbits about Duflo's life) is how much I think it has in common with doing good interdisciplinary work. People love to think about doing work that spans fields, and at the same time a lot of the interdisciplinary work that's done is rightfully lambasted as being unrigorous. I think that's because to do it right one has to go through the drudgery of learning the ins and outs of not one but two fields. There's the flight of inspiration that comes from thinking up great cross-field paper topics ("Can we use monsoon strength as an instrument for trade costs?") but there's also the long stretches of building confidence in relevant areas ("Hm. Guess I better learn about the global climate circulation. And trade.")

Viewed in this light, one can think of Duflo's work as in a lot of ways combining the statistical techniques of epidemiology (or maybe even more rightly, pharmacology) with the concerns of development economics. Economists of my generation, I think, largely take for granted the progression towards randomized trials and the emphasis on identification (though even that's changing fast; see, for example, the Spring 2010 JEP), but the progress to incorporate those techniques involved taking the field and broadening its boundaries by mixing it with the techniques and concerns of another. It's nice to hear the back story about how one of the major players decided to head down that route and appreciate just how much skill, drive, and chutzpah it took to do so.

And yes, I just noticed that Study Hacks' byline is "Demystifying Sustainable Success." How apt.

5.24.2010

Migrating birds on the front page of the NY Times

I'm very impressed and surprised by the coverage. And it's a great story about innovations in scientific research. I didn't even realize, when I was a kid watching wildlife videos and learning about the Arctic tern, that we couldn't actually observe them migrate. We could only see where they started and ended. But now with smarter tracking technology, we can observe their entire trajectory.

The whole story reminds me of other innovations in observational technology that slingshotted an entire field. For example, the invention of GFP, which can make portions of tissue glow, led to enormous advances in biology and related fields. (Martin Chalfie, one of the inventors who won the Nobel for it, is here at Columbia. I know because I saw him explain the idea to a gymnasium full of kids here with a [humorously] malfunctioning flashlight.) I think that a lot of times, when we learn science in [grad] school, there is so much focus on theory, mechanisms and methods that we sometimes forget that the starting point of all science is observation.

PS. If you're like me and did a double take at the article's nonchalant statement about the groundbreaking technology:
Geolocators ... just record changing light levels. If scientists can recapture birds carrying geolocators, they can retrieve the data from the devices and use sophisticated computer programs to figure out the location of the birds based on the rising and setting of the sun.
You should check out the website of the geolocator manufacturer Lotek, where they post scientific papers on the method. Here's an abstract from "An advance in geolocation by light" by P. A. Ekstrom (2004):
A new analysis of twilight predicts that for observations made in narrow-band blue light, the shape of the light curve (irradiance vs. sun elevation angle) between +3° and −5° (87° to 95° zenith angle) has a particular rigid shape not significantly affected by cloudiness, horizon details, atmospheric refraction or atmospheric dust loading. This shape is distinctive, can be located reliably in measured data, and provides a firm theoretical basis for animal geolocation by template-fitting to irradiance data. The resulting approach matches a theoretical model of the irradiance vs. time-of-day to the relevant portion of a given day's data, adjusting parameters for latitude, longitude, and cloudiness. In favorable cases, there is only one parameter choice that will fit well, and that choice becomes the position estimate. The entire process can proceed automatically in a tag. Theoretical estimates predict good accuracy over most of the year and most of the earth, with difficulties just on the winter side of equinox and near the equator. Polar regions are favorable whenever the sun crosses −5° to +3° elevation, and the method can yield useful results whenever the sun makes a significant excursion into that elevation range. Early results based on data taken on land at 48°N latitude confirm the predictions vs. season, and show promising performance when compared with earlier threshold-based methods.
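For intuition about what these tags recover, here is a toy version of the older "threshold" approach that the abstract benchmarks against. This is my own sketch, not Ekstrom's template-fitting method, and the sunrise/sunset times and declination in the example are invented: longitude comes from the timing of solar noon, latitude from day length via the sunrise equation.

```python
import math

def geolocate(sunrise_utc, sunset_utc, declination_deg):
    """Toy threshold geolocation: position from sunrise/sunset times (UTC hours)."""
    noon = 0.5 * (sunrise_utc + sunset_utc)             # local solar noon in UTC
    lon = (12.0 - noon) * 15.0                          # Earth turns 15 deg per hour
    half_day = (sunset_utc - sunrise_utc) / 2.0 * 15.0  # half day length as an hour angle
    # sunrise equation: cos(H) = -tan(lat) * tan(declination)
    lat = math.degrees(math.atan(
        -math.cos(math.radians(half_day)) / math.tan(math.radians(declination_deg))))
    return lat, lon

# A 16-hour day centered on 12:00 UTC, near the June solstice (declination ~ +20 deg)
lat, lon = geolocate(4.0, 20.0, 20.0)
print(f"lat ~ {lat:.1f} deg, lon ~ {lon:.1f} deg")
```

Note the failure mode the abstract mentions: near equinox the declination goes to zero, the division blows up, and day length carries almost no latitude information.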

5.22.2010

The achievements and under-achievements of our species

A friend sent me this spectacular time lapse video of the space shuttle preparation in an email. If you haven't seen it, you can watch it right here.

5.19.2010

NCDC: Warmest April Global Temperature on Record


A while ago, GISS was already saying that 2010 would likely be the warmest year on record, due partly to the current El Niño (see, for example, section 6 in the Current GISS Global Surface Temperature Analysis from January), and observation data are now bearing this out. NOAA's NCDC just released their April State of the Climate report: not only was April 2010 the warmest April on record, but 2010's Jan-Apr four-month span was the warmest on record as well. A gridded temperature anomaly map is at right.

There are a couple of things about this that are interesting. The first, which relates to some of my own work looking at how people internalize new information about climate change, is that record events seem to be one of the most straightforward ways of conveying to people that the climate is changing. "Climate" is, after all, the general distribution of various properties of the ocean and atmosphere over time, not any specific value at any specific time. Difficulty with the notion that climate change is a shift in this distribution is what often seems to underlie opposition to the theory of anthropogenic climate change; this is what gives rise to, say, your uncle's comment that a cold winter day proves that Al Gore is a liar.

Records, however, seem to be one of the ways of talking about distributions that people intuitively understand. Everyone understands that Usain Bolt is by objective measures a very fast runner because he holds the world records for the 100m and 200m dashes. Similarly, people intuitively understand that if this is the "hottest year on record" then the climate is in a hot state. Moreover, if we get several hottest years on record in a row, then the climate is probably getting hotter.
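The statistics behind this intuition are simple: in a stationary climate the chance that year n sets a new record is 1/n, so records dry up over time, while a warming trend keeps producing them. A quick simulation (my own illustration, with an invented trend size) shows the contrast:

```python
import numpy as np

rng = np.random.default_rng(1)

def count_records(series):
    """Count how many observations set a new running maximum."""
    best, records = -np.inf, 0
    for x in series:
        if x > best:
            best, records = x, records + 1
    return records

years = 130
# 2000 simulated 130-year temperature histories, in units of standard deviations
stationary = rng.normal(0, 1, (2000, years))
warming = stationary + 0.02 * np.arange(years)  # add a +0.02 sd/year trend

rec_stat = np.mean([count_records(s) for s in stationary])
rec_warm = np.mean([count_records(s) for s in warming])
print(f"avg records, stationary climate: {rec_stat:.1f}")
print(f"avg records, warming climate:    {rec_warm:.1f}")
```

Under stationarity the expected record count after n years is the harmonic number 1 + 1/2 + ... + 1/n (about 5.4 for n = 130), so a steady drumbeat of new records is itself evidence of a shifting distribution.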

So what? Well, given the above it's probably not as much of a surprise that studies like Yale's F&ES Project on Climate Change have been finding a downward trend in beliefs about climate change. After a slew of record hot years, the planet has had a host of second- and third-hottest years since 2005, not exactly the sort of news that makes headlines, even if by historical standards the globe is still hot. Should 2010 shape up, as Hansen has repeatedly predicted, to be the hottest year on record, we'll probably see just as many news stories talking about how "global warming is back" as we have recent stories about the "cooling trend" of the past few years.

This, in turn, makes for an interesting counterfactual political economy question: what if Copenhagen had been scheduled for this December rather than last? Not to put too high a hope on the established mechanisms, but maybe our outcome would have been a lot more positive if delegates had arrived knowing that they had just lived through the hottest calendar year on record...

5.18.2010

Time Lapse of Eyjafjallajokull Erupting

Want to see what hundreds of tons of aerosols being ejected into the atmosphere looks like in time lapse video? Of course you do. First commenter to suggest a way to use this as an instrument for something worthwhile gets all the fermented shark they can eat.

Hi, I'm Jesse, and I'm now blogging with Sol.



Iceland, Eyjafjallajökull - May 1st and 2nd, 2010 from Sean Stiegemeier on Vimeo.

Data transfer from Matlab to Stata and reverse

Zip file with Matlab code to export or import datasets to Stata, for use in conjunction with Stata's "insheet/outsheet" commands. Importing data from Stata is restricted to numeric variables (exporting to Stata is not).

3.22.2010

Sustainable Development at Columbia University

We've launched our new website for the Sustainable Development Doctoral Society for our PhD program at Columbia University:

www.columbia.edu/cu/sdds

We're aiming to maintain it well with active research updates.

3.12.2010

Blindness interventions

This is a pretty inspiring story of field work for children with treatable blindness in India, and neurological research that goes alongside treatments:

http://www.ted.com/talks/pawan_sinha_on_how_brains_learn_to_see.html

and another great story of work in the field on blindness (my classmate Mark Orrs is working with him).

http://adventure.nationalgeographic.com/2009/12/best-of-adventure/geoff-tabin/1

2.07.2010

Nomads in India, redistribution and Nash equilibria

This month's NGM has a nice article on nomadic groups in India. The basic story is that there are castes of nomads (an estimated 80M people total) that have traditional nomadic lifestyles, and they are apparently "trapped": they seem unwilling to settle, the gov't has difficulty providing them with services unless they settle, and settled populations are unwilling to let them settle. There are the usual undertones of discrimination and bureaucracy, but otherwise this seems like an unfortunate example of a classical Nash problem interfering with progressive redistribution.

I'm not entirely convinced by the gov't's argument that they must settle in order to receive political rights or some forms of gov't benefits, such as basic healthcare, but I've also never tried administering an Indian municipality.

2.06.2010

Climate change and extremes

A lot of folks ask me what's going to happen with tropical cyclones or heat waves, etc with climate change. If you're not someone with any atmospheric science background, but want a single, easy few-page article to read, I'd recommend this one.

One of my advisors, Adam Sobel, gave me a copy of "Climate Change: Picturing the Science", a coffee table book on climate change that he wrote a chapter for. His chapter on what we expect for extremes is very nice. It's honest about what we know or don't, and why, but it doesn't expect you to have thought much about the subject before. You can read it on Google Books here.

1.18.2010

Haiti

One of my supervisors, John Mutter, is a seismologist and has a recent post on the CNN blog in response to the earthquake in Haiti.

"Earthquakes don’t kill people; buildings do. And the poorest constructed buildings are inevitably home to the very poorest people. Homes and other structures built way out of safe building code – if codes even exist or are known about, or minimally enforced after the building inspector is bribed for a permit – are built by people who lack the resources to build minimally safe structures if they could."

I think that both he and I find nothing surprising about what has happened in Haiti. This doesn't make it any less tragic, but rather more tragic. The fact that this was at all foreseeable suggests that we can all be guilty of not having helped mitigate risk in advance. The Heifer Project is an organization that promotes development by letting wealthy Americans buy a cow or goat for a family in a poor country. Maybe an organization can enable an American to donate money to pay for rebar in a Haitian home during reconstruction.

1.16.2010

Aerial Photography and human development

This guy is a really cool aerial photographer who focuses on issues at the human-environment interface. His photos are a super interesting and powerful way to raise issues about development patterns in the US. I saw his new book "Over" in B&N recently and thought that I'd like to have his photos in every presentation I give from now on.

http://www.alexmaclean.com/

1.13.2010

Open Energy Info

Haven't gotten the chance to look through all of this, but it seems like a good project to aggregate decentralized information about energy projects, practices and potential:

http://en.openei.org/wiki/Main_Page

(even after 4 years at MIT, I'm still sometimes surprised when predictions about the "power of the www" actually come true).

1.10.2010

survey data

This website has a huge directory of survey data sets (mostly household-level, I think) that should be useful:

www.surveynetwork.org

They don't host the actual data, but it seems like a great place to see what's available.

I'll try to add it to my [short] list of places to look for data.


12.11.2009

Matlab toolbox updated

I've updated my Matlab toolboxes here.

Some useful additions include scripts to flexibly coarsen spatial data, area or population weighted averages of global spatial data and a new (small) suite of network functions.

12.01.2009

Sound pollution and sea mammals

This is a really interesting (and easy to read) article in Physics Today (I don't know why it's there) on the impact of sound pollution on sea mammals. It discusses, for example, the correlation between mass beachings and naval sonar exercises.

Also, one sentence the author tosses out that caught my eye was:

"Biologists have used estimates of the population size and metabolic rate of sperm whales to calculate that those whales alone probably take about as much biomass out of the ocean as do all human fisheries."
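That claim is easy to sanity-check with a back-of-envelope calculation. All of the numbers below are my own rough assumptions for illustration (they are not from the article), but they show how the two quantities could plausibly land in the same ballpark:

```python
# Back-of-envelope check of the sperm whale claim.
# Every number here is an assumed, order-of-magnitude input.
population = 360_000           # assumed global sperm whale count
body_mass_kg = 35_000          # assumed average body mass
intake_frac_per_day = 0.03     # assumed daily food intake, ~3% of body mass

annual_intake_tonnes = population * body_mass_kg * intake_frac_per_day * 365 / 1000
human_fisheries_tonnes = 90e6  # assumed annual global catch, order of magnitude

print(f"sperm whales: ~{annual_intake_tonnes / 1e6:.0f} million tonnes/yr")
print(f"fisheries:    ~{human_fisheries_tonnes / 1e6:.0f} million tonnes/yr")
```

Under these assumptions the whales' intake comes out within a factor of two of the global catch, i.e. the same order of magnitude, which is all a calculation like this can claim.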