8.25.2011

An association between the global climate and civil conflict

Yesterday, a paper that I wrote with Kyle Meng and Mark Cane was published in the journal Nature:

Civil conflicts are associated with the global climate 

Credit: Nature/S. Hsiang
Abstract: It has been proposed that changes in global climate have been responsible for episodes of widespread violence and even the collapse of civilizations. Yet previous studies have not shown that violence can be attributed to the global climate, only that random weather events might be correlated with conflict in some cases. Here we directly associate planetary-scale climate changes with global patterns of civil conflict by examining the dominant interannual mode of the modern climate, the El Niño/Southern Oscillation (ENSO). Historians have argued that ENSO may have driven global patterns of civil conflict in the distant past, a hypothesis that we extend to the modern era and test quantitatively. Using data from 1950 to 2004, we show that the probability of new civil conflicts arising throughout the tropics doubles during El Niño years relative to La Niña years. This result, which indicates that ENSO may have had a role in 21% of all civil conflicts since 1950, is the first demonstration that the stability of modern societies relates strongly to the global climate.

Updates

  • The original article is un-pay-walled here.  
  • A short summary that I wrote for Earth Magazine is here.
  • I present the results in a 30-minute non-technical talk to policy-makers at the Woodrow Wilson Center here (starting at 43:00).


Additional Nature materials: Andrew Solow's News & Views piece, the Nature Podcast, Nature News, and an Editorial

8.23.2011

More on slack labor markets affecting conflict

Kyle Meng points us to this recent NBER working paper. See this related post and this one too.

Building Peace: The Impact of Aid on the Labor Market for Insurgents
by Radha Iyengar, Jonathan Monten, Matthew Hanson

Abstract: Employment growth could reduce violence during civil conflicts. To determine if increased employment affects violence, we analyzed varying employment in development programs run by different US military divisions in Iraqi districts. Employment levels vary with funding periods and the military division in charge. Controlling for variability between districts, we find that a 10% increase in labor-related spending generates a 15-20% decline in labor-intensive insurgent violence. Overall, the 10% spending increase is associated with a nearly 10% violence reduction, due to a reduction in attacks which kill civilians, but increased attacks against the military. These findings indicate that labor-intensive development programs can reduce violence during insurgencies.

8.22.2011

The Lorax: did Dr. Seuss get sustainable development right?

Nature recently published a book review of Dr. Seuss's classic "The Lorax," so I figured that sharing my [coincidentally] recent thoughts on the book was fair game.

I recently bought the picture book for my 3-year-old cousin and read it to her when she visited New York. The next day, while washing dishes, I started wondering whether Seuss actually captured the challenge of sustainable development correctly. I decided that he didn't.

8.21.2011

Weekend links @ the Adventure Journal

Some links from the Adventure Journal that I found and liked during my continuing ascent (descent?) into the blogosphere:

- The climate on other planets can put ours in perspective.

- Optimal policies for adaptation to environmental risk do not involve reducing our risks to zero. Instead, we should aim to accept calculated risks when the [marginal] cost of mitigating them becomes high.

- In rural Rwanda, bikes are low-tech but high-value, endearing them to their owners and D. Turrene, who put together this 1 minute tribute:


Une Minute de Vélos - Rwanda from Darcy Turenne on Vimeo.

8.18.2011

The best research quote ever

 Theories have four stages of acceptance:
1. This is worthless nonsense;
2. This is interesting, but perverse;
3. This is true, but quite unimportant;
4. I always said so. 
-J.B.S. Haldane
Credit to this book, which I'll review when I finish it. More research quotes here.

8.16.2011

Paul Meier passes away

The New York Times and boingboing point out that Paul Meier, one of the first and loudest proponents of randomized trials in medicine, passed away last week. From his obituary:
As early as the mid-1950s, Dr. Meier was one of the first and most vocal proponents of what is called “randomization.”

Under the protocol, researchers randomly assign one group of patients to receive an experimental treatment and another to receive the standard treatment. In that way, the researchers try to avoid unintentionally skewing the results by choosing, for example, the healthier or younger patients to receive the new treatment.

If the number of subjects is large enough, the two groups will be the same in every respect except the treatment they receive. Such randomized controlled trials are considered the most rigorous way to conduct a study and the best way to gather convincing evidence of a treatment’s effects.
Before randomization, the science of clinical trials was imprecise. Researchers, for example, would give a new treatment to patients who they thought might benefit and compare the outcomes to those of previous patients who were not treated, a method that could introduce serious bias.
The article rightly focuses on Meier's influence in medicine, but the influence of the randomized trial on modern economics (and the applied social sciences in general) cannot be overstated. The observation that randomization lets one link correlation to causality, as opposed to simple association, is the basis of much of modern econometrics, and the framing and intellectual architecture built upon that insight has produced a host of very important results. Everything from Freakonomics' popularization of applied microeconomics to the ongoing row in the development community between the highly successful "randomistas" and their critics can be viewed as stemming from the attempts of Meier and his allies to make medical research robust. Randomization clearly isn't everything, and even a well-identified research strategy must be subject to a host of caveats. Still, if you're following this blog, doing applied work yourself, or just generally care about empirically determined (as opposed to theoretically justified) policy, you might want to raise a glass tonight to Meier and his legacy.

8.14.2011

Weather, stock market returns, and subtlety in causal inference

While hanging out with a few academic friends on Friday I began discussing a recent research paper with someone I didn't know particularly well. It turned out that this guy was the odd man out of the group: instead of being a professor / post doc / grad student, he worked in finance, and was not terribly supportive of a lot of empirical work. Trotting out the classic "correlation doesn't imply causation" critique, he then said something along the lines of "you could show that rain makes the stock market go up and down and it wouldn't mean anything." This of course reminded me of one of my favorite counterintuitive-but-compelling research literatures: the effects of weather on stock market returns.

Now, first off, it has to be said that one of the nice things about working with climate data and effects is that causality is, in fact, generally pretty easy to establish. While humans appear to be quite good at affecting climate at decadal time scales, we generally are unable to affect day-to-day or even month-to-month weather patterns, and have great difficulty predicting timing and spatial patterns of highly-relevant weather behavior such as heat waves and storms even over a time span of hours or days. While this is bad from a welfare point of view (e.g., we'd love to be able to predict where a hurricane will make landfall a month ahead of time) it means that statistical analyses of the impact of weather itself on a given phenomenon, provided you're careful about your research design, are generally pretty causally attributable. (see important caveat below)*

Given that, it turns out that there's some pretty strong evidence that weather affects stock market returns. There are multiple papers showing that stock market returns are affected by local weather (the latter of those containing this depressing gem of wisdom: "behavioral finance shows that lower temperature can lead to aggression, while higher temperature can lead to both apathy and aggression"). My favorite and, as far as I can tell from this literature, the definitive word on the subject so far, is this paper by Hirshleifer and Shumway showing that: yes, stock market returns are affected by the weather; the effect is driven by sunlight or the lack thereof and not precipitation per se; but the effects are so small that the only way to arbitrage them is to have absurdly low transaction costs (echoing one of my favorite applied finance papers of all time, Shleifer and Vishny's The Limits of Arbitrage).

If the sunlight result makes you think of SAD, or seasonal affective disorder, you're onto something interesting: Kamstra, Kramer and Levi find strong evidence (getting some nice identification off of solar insolation across hemispheres) that stock markets experience something like it, too. A follow up paper argues that one could capture the same result based just on hemisphere-appropriate seasonality and that an explicitly psychological 'SAD' effect is probably not supportable at present, though that finding was in turn disputed by Kamstra et al. Regardless, I'd argue that (a) seasonality driving markets is a fairly interesting idea, as is any result that links natural processes (which, after all, the seasons fundamentally are) and human behavior and (b) this only further impresses the necessity of the important caveat below.

All of which is to say that sometimes what seems at first glance to be a semi-ludicrous postulate can turn out to be quite true. Evidence has been found that stock markets are affected by everything from sports results to lunar phases, and in many cases these relationships seem both intuitive and robust. The question of what those results mean, however, can sometimes be difficult to tease out (have I mentioned the important caveat*?), so a policy prescription or a deeper insight into human nature might not actually be forthcoming. Put in other words, perhaps my finance friend was right: you can show that stock markets are affected by sunny days, but really, what does that mean?

* Important caveat: The fact that weather is exogenous doesn't mean that saying something about mechanisms / pathways / etc. is easy. Weather affects everything from crop production to labor supply to ecology and phenology to the stock market behaviors seen above, so if you're going to make a claim about weather affecting something *through* some pathway, or even more dangerously plan on using it as an instrument, you should be very, very careful. Economists call your justification for claiming causality in such cases your "exclusion restriction," and if there's one concept I'd like to see enter into the general population memosphere, it's that.
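To make the exclusion restriction concrete, here is a minimal simulation sketch. All of the variable names and coefficients are invented for illustration: rainfall instruments for income in a toy conflict regression, and the IV estimate recovers the true effect only because, in the simulated data, rainfall affects conflict solely through income.

```python
import random

random.seed(0)
n = 10_000

# Simulated data: rainfall is exogenous; an unobserved confounder u drives
# both income (the endogenous regressor) and conflict (the outcome).
rain = [random.gauss(0, 1) for _ in range(n)]
u = [random.gauss(0, 1) for _ in range(n)]
income = [2.0 * r + 1.0 * c + random.gauss(0, 1) for r, c in zip(rain, u)]
# True causal effect of income on conflict is -0.5 by construction.
conflict = [-0.5 * i + 2.0 * c + random.gauss(0, 1) for i, c in zip(income, u)]

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

# OLS is biased because u moves income and conflict together.
beta_ols = cov(income, conflict) / cov(income, income)

# The IV (Wald) estimate is consistent only under the exclusion
# restriction: rain influences conflict through income and nothing else.
beta_iv = cov(rain, conflict) / cov(rain, income)

print(beta_ols, beta_iv)
```

If you add a direct `rain` term to the `conflict` equation, the exclusion restriction fails and `beta_iv` no longer recovers -0.5, which is exactly the danger flagged in the caveat above.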

8.13.2011

Five years later, five myths about Africa

Scott Baldauf, Africa bureau chief of The Christian Science Monitor, has an interesting and anecdote-rich piece from last week in which he lays out five common myths about Africa that keep coming up in his work. If you're feeling impatient, the five points are:
  1. Africa is 'poor'
  2. Africa is 'violent'
  3. Africa needs our help
  4. Africa is 'backward'
  5. Africa is a country
That he considers none of these to be true isn't necessarily surprising (the idea of Africa being a country is so commonly joked about that there's a major Africa blog titled "Africa is a country"), but it is a nice little summary of the difficulty of trying to "understand" an entire continent, and one that I largely agree with. Though let's not forget that even that is not an uncontroversial idea.

8.12.2011

Why you should care about aerobiology

Last week's Eos had an interesting article on aerobiology (pdf here).  The authors summarize some of the science behind the dispersal of single-celled organisms via the air, then talk about the social implications of air dispersal (e.g., the spread of infectious diseases over large distances) and current advances in monitoring and modeling the process (this last bit is left out below, see the article if you're interested).

[Other recent Eos articles are here and here]
The High Life: Transport of Microbes in the Atmosphere
By David J. Smith, Dale W. Griffin & Daniel A. Jaffe
Microbes (bacteria, fungi, algae, and viruses) are the most successful types of life on Earth because of their ability to adapt to new environments, reproduce quickly, and disperse globally. Dispersal occurs through a number of vectors, such as migrating animals or the hydrological cycle, but transport by wind may be the most common way microbes spread. 
General awareness of airborne microbes predates the science of microbiology. People took advantage of wild airborne yeasts to cultivate lighter, more desirable bread as far back as ancient Egypt by simply leaving a mixture of grain and liquids near an open window. In 1862, Louis Pasteur’s quest to disprove spontaneous generation resulted in the discovery that microbes were actually single-celled, living creatures, prevalent in the environment and easily killed with heat (pasteurization). His rudimentary experiments determined that any nutrient medium left open to the air would eventually teem with microbial life because of free-floating, colonizing cells. The same can happen in a kitchen: Opportunistic fungal and bacterial cells cause food items exposed to the air to eventually spoil. 
Unknowingly, Pasteur founded the field today referred to as aerobiology, the science that studies the diversity, influence, and survival of airborne microorganisms. Scientists now have the ability to monitor the movement of atmospheric microorganisms on a global scale. But long-term molecular-based measurements of microbe concentrations are still missing—such information is needed to improve understanding of microbial ecology, the spread of disease, weather patterns, and atmospheric circulation models.

8.11.2011

Networks of the economic elite

In graph theory, the set of companies and their board members is a classic example of a bipartite graph: individual board members sit on the boards of different companies, "linking" them in an abstract sense.  Similarly, different board members are "linked" to one another by sitting on the same board of a specific company.
I recently ran across this very nice visualization of board members and companies for the United States. The visualization project is aptly titled "They Rule" and was purportedly built to improve political-economic transparency:
Overview
They Rule aims to provide a glimpse of some of the relationships of the US ruling class. It takes as its focus the boards of some of the most powerful U.S. companies, which share many of the same directors. Some individuals sit on 5, 6 or 7 of the top 1000 companies. It allows users to browse through these interlocking directories and run searches on the boards and companies. A user can save a map of connections complete with their annotations and email links to these maps to others. They Rule is a starting point for research about these powerful individuals and corporations.
Context 
A few companies control much of the economy and oligopolies exert control in nearly every sector of the economy. The people who head up these companies swap on and off the boards from one company to another, and in and out of government committees and positions. These people run the most powerful institutions on the planet, and we have almost no say in who they are. This is not a conspiracy, they are proud to rule, yet these connections of power are not always visible to the public eye.
The visualization uses the API of the data collection group littlesis.org, which is itself also worth checking out.  It seems like a data set ripe for network-based analysis of our country's political economic structure.
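As a toy illustration of the bipartite structure described above (the director and company names here are entirely invented), this sketch builds both one-mode projections: companies linked by sharing a director, and directors linked by sitting on a common board.

```python
from itertools import combinations
from collections import defaultdict

# Hypothetical board memberships: director -> set of company boards.
# This dict *is* the bipartite graph, stored as an adjacency list.
boards = {
    "Alice": {"AcmeCorp", "BetaBank"},
    "Bob":   {"BetaBank", "GammaOil"},
    "Carol": {"AcmeCorp", "GammaOil", "DeltaAir"},
}

def project_companies(boards):
    """One-mode projection: companies linked by a shared director."""
    links = defaultdict(set)
    for director, companies in boards.items():
        for a, b in combinations(sorted(companies), 2):
            links[a].add(b)
            links[b].add(a)
    return dict(links)

def project_directors(boards):
    """One-mode projection: directors linked by sitting on a common board."""
    links = defaultdict(set)
    for d1, d2 in combinations(sorted(boards), 2):
        if boards[d1] & boards[d2]:  # any board in common
            links[d1].add(d2)
            links[d2].add(d1)
    return dict(links)

company_net = project_companies(boards)
director_net = project_directors(boards)
```

The same two projections, run over real board data (e.g., from the littlesis.org API), are the usual starting point for the kind of network analysis mentioned above.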

8.10.2011

What's causing the English riots?

Rioting in London has now spread to several other cities in England. After starting in Tottenham, North London (where 26 years ago the Broadwater Farm riots started) the riots are now in their 3rd or 4th day and have already resulted in multiple deaths and a huge amount of destruction. Notable losses thus far include a warehouse housing much of the English music and film industry's product stock, which many fear may bankrupt uninsured firms throughout the industry. So, what's causing the riots?

As always, it depends on what you mean by "cause." The proximate reason was the shooting death of unarmed father of four and / or alleged drug dealer Mark Duggan. The small but long-running empirical literature on riots reveals what I imagine most of us already suspect: these sorts of events are common triggers but seem to serve a role much more akin to catalyst than true cause. Much has been said in the meantime about the role of social media and easy cellular communication in advancing the riots, but much as Jared Cohen of Google Ideas said about the Arab revolts of this past year, it's hard to argue that either of those factors is really making a difference in whether the riots occur so much as simply helping rioters coordinate (for a particularly good write-up of the role of social media in the riots, and the source of that Cohen Op-Ed, see this post at Gigaom).

So what's causing the riots and what can we learn from them? It obviously depends on the type of riot, but a running theme is a simultaneous interaction between poverty, economic disparity, and political disenfranchisement. That's clearly not always the case, but this echoes a lot of what we know from the empirical literature on conflict, especially internal conflict. You're not likely to have a civil war if you have homogeneous social groups or an established political system respected by the public, and you're not likely to have riots if you have the same.

All of which is to say that riots generally seem to be very human expressions of discontent and powerlessness that are difficult to ascribe to purely thuggish motivations like a desire to loot. As more stories pour in ("I saw 3 or 4 young women looting Tesco Express for nappies and milk tonight"; "In Enfield most of those who gathered in the town centre were white. The youngest looked about 10-years-old"; "No kids don't want to go to college no more coz they don't get paid") it's becoming clear that the English rioting seems to be driven by that same combustible mixture of poverty, inequality, and lack of options. Given that those factors seem only to be worsening around the world, it seems wise to view the English riots not as a freak event but rather as part of a larger global trend.

8.09.2011

Follow the Hurricane Season

Artemis has a nice site for following the Atlantic's current storm season here.  The site even tabulates predictions from the various forecasting groups and has links to real-time satellite imagery.

The site features a viewer by stormpulse.com, which is worth checking out on its native site for a variety of interactive features (they also map Pacific storms and other severe weather in the United States).  This data-visualization group may even be helping our government improve short-term policy: they've made it onto an LCD in the White House situation room for three years running.

8.06.2011

Temperature and worker output

Earlier this week I was at a small but excellent conference on adaptation to climate change, hosted at PERC in Bozeman (Michael Greenstone's keynote is covered here).  Lots of interesting results and ideas came up, but this was one of the most exciting outcomes for me.

Matt Neidell was presenting his working paper (joint with Josh Graff Zivin) on "Temperature and the Allocation of Time: Implications for Climate Change" when he put up this graph. It shows the number of minutes in a day that individuals (who work in outdoor or temperature-exposed sectors in the USA) spent working as a function of maximum temperature (in Fahrenheit) that day.  The interesting part of the graph is that on hot days, people work for less time.


Wolfram Schlenker then commented something like, "60 minutes out of an 8-hr work day? That's a huge effect!" And I thought "hmmm... 12.5% sounds familiar..." and then pulled up this graph from my own work on my computer.  The panel shows total output for similar sectors (in 28 countries, not including the USA) as a function of average daily temperature (in Celsius).  The graph shows that national output in several [non-agricultural] industries seemed to decline with temperature in a nonlinear way, declining more rapidly at very high daily temperatures.


Matt's graph uses micro-data from the American Time Use Survey combined with interpolated daily weather station data while mine uses total national production from UN national accounts combined with degree-day reconstructions from NCEP reanalysis, so they are completely different data sets utilizing completely different methods, but the results look extremely similar!  On hot days, output in non-agricultural sectors drops and workers work less.

Furthermore, not only do the shapes of the response functions look similar, but the magnitudes of the responses are similar (recall Wolfram's comment).  In Matt's graph, a day with maximum temperatures near 102.5 F (39.2 C) reduces time working by about 60 minutes (12.5% of an 8 hr day) relative to a day with maximum temperatures near 77.5 F (25.3 C).  In my graph, a day with average temperatures near 30.5 C reduces output to around 90% of what it would be relative to a day with average temperatures near 27 C.  Since daily average temperature is usually computed by averaging daily max and min temperatures, variations in average temperature should be approximately one half of variations in the max.  Using this rule to convert to common units, Matt and Josh found time worked fell by about 1.8% per 1 C in daily mean temperature [12.5%/((39.2 C-25.3 C)/2)] while I found that national output fell by about 2.9% per 1 C in daily mean temperature [10/(30.5-27)].  These numbers are not exactly identical, but they are certainly not statistically different given the uncertainty in both of our models.  In fact, I would say that they are extremely close given how different our techniques are.  To me, this feels like a research success.
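The unit conversion above is simple enough to write out explicitly; this little sketch just reproduces the back-of-the-envelope arithmetic from the two papers' numbers:

```python
# Convert both estimates to "% change per 1 C of daily MEAN temperature".

# Graff Zivin & Neidell: ~60 min less work (12.5% of an 8 hr day) moving
# from a 77.5 F (25.3 C) to a 102.5 F (39.2 C) daily MAXIMUM.  Since the
# daily mean is roughly the average of the max and min, a change in the
# max maps to about half that change in the mean, hence the "/ 2".
neidell_pct_per_C = 12.5 / ((39.2 - 25.3) / 2)

# My estimate: output falls ~10% moving from a 27 C to a 30.5 C daily mean.
hsiang_pct_per_C = 10 / (30.5 - 27)

print(round(neidell_pct_per_C, 1), round(hsiang_pct_per_C, 1))  # 1.8 2.9
```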

It's worth noting that reductions in worker output have never been included in economic models of future warming (see here and here), despite the fact that experiments fifty years ago showed that temperature has a strong impact on worker output (see here and here).  In my dissertation I did some back-of-the-envelope estimates using the above numbers and found that productivity impacts alone might reduce per capita output by ~9% in 2080-2099 (in the absence of strong adaptation).  This cost exceeds the combined cost of all other projected economic losses (e.g., see here and here).

8.05.2011

Climate CoLab

Hannah Lee sent me this interesting project put together by the MIT Center for Collective Intelligence: the Climate CoLab.

What should we do about climate change?
Somehow we have to answer this question. You can help.
The Climate CoLab seeks to harness the collective intelligence of contributors from all over the world to address global climate change.
The Climate CoLab is a forum where teams create proposals for what to do in a series of annual contests.

The current contest is to respond to the prompt: "How should the 21st century economy evolve, bearing in mind the risks of climate change?"  I'm excited to learn what our collective intelligence has to say about this.

In unrelated but hilarious news, see this comic about doing science courtesy of Reed Walker. (If you're not an active researcher then this probably isn't funny to you, sorry.)

8.03.2011

The causal effect of going to my high school

An old high school friend* (crony?) sent me a Gothamist article linking to a new NBER paper by Abdulkadiroglu, Angrist, and Pathak on the causal effect of going to a New York City or Boston specialized public high school:
[...] We estimate the causal effect of exam school attendance using a regression-discontinuity design, reporting both parametric and non-parametric estimates. We also develop a procedure that addresses the potential for confounding in regression-discontinuity designs with multiple, closely-spaced admissions cutoffs. The outcomes studied here include scores on state standardized achievement tests, PSAT and SAT participation and scores, and AP scores. Our estimates show little effect of exam school offers on most students' achievement in most grades. We use two-stage least squares to convert reduced form estimates of the effects of exam school offers into estimates of peer and tracking effects, arguing that these appear to be unimportant in this context. On the other hand, a Boston exam school education seems to have a modest effect on high school English scores for minority applicants. A small group of 9th grade applicants also appears to do better on SAT Reasoning. These localized gains notwithstanding, the intense competition for exam school seats does not appear to be justified by improved learning for a broad set of students.
For the non-economists on this blog, Josh Angrist is one of the top empirical economists in the world (as well as enormously fun to read, viz my beach reading from last spring break) so having him evaluate your high school's academic outcomes is sort of like having John Madden come in and critique your JV football team.

As the authors openly admit in the paper, the research design (regression discontinuity, which was begging to be used to evaluate NYC specialized high school outcomes) is inherently limited in what it can say about students who were not near the cutoff. By assumption one treats students who barely make it into the school as being more or less the same as students who barely fail to make it in, and thus "going to the specialized high school" can be considered quasi-randomly assigned. Given that (and all of the tests that are run to make sure this assumption is valid), it looks like the effect of going to a specialized high school when you're on the cusp is pretty much nil. This may be surprising (and of course gets summarized in the Gothamist as "Stuyvesant, Bronx Science, Top Public Schools Not Worth It") but I think becomes less so with a little unpacking...
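For readers unfamiliar with regression discontinuity, here is a minimal simulated sketch of the idea (the cutoff, bandwidth, and data-generating process are all invented, and the true "exam school" effect is set to zero to mirror the paper's headline finding): fit a line on each side of the admissions cutoff and compare the two predictions right at the cutoff.

```python
import random

random.seed(1)
cutoff = 500        # hypothetical admissions cutoff on an entrance exam
bandwidth = 10      # only compare applicants within +/- 10 points of it

# Simulated applicants: later achievement depends smoothly on the exam
# score (a proxy for ability), plus a true admission effect of zero.
applicants = []
for _ in range(20_000):
    score = random.gauss(500, 50)
    admitted = score >= cutoff
    outcome = 0.1 * score + 0.0 * admitted + random.gauss(0, 5)
    applicants.append((score, outcome))

def fit_at_cutoff(points):
    """OLS of outcome on (score - cutoff); the intercept is the
    predicted outcome exactly at the cutoff."""
    xs = [s - cutoff for s, y in points]
    ys = [y for s, y in points]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx

above = [(s, y) for s, y in applicants if cutoff <= s < cutoff + bandwidth]
below = [(s, y) for s, y in applicants if cutoff - bandwidth <= s < cutoff]

# The RD estimate: the jump in predicted outcomes at the cutoff.
rd_estimate = fit_at_cutoff(above) - fit_at_cutoff(below)
```

Because the smooth "ability" trend is fit out on each side, the estimated jump hovers near zero here, which is what "little effect for students on the cusp" looks like in this framework; note that the design says nothing about applicants far from the cutoff.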

First, it's important to note that there's a very big difference between being in the bottom 10% at one school versus the top 10% at another. Kids who barely make it into a specialized high school are competing and comparing themselves against the remaining 90% of students who had little difficulty getting in. Meanwhile, their comparables at other schools are at the high end of the distribution. Given the complex nature of peer effects, teacher attention, and everything else, I'd say that these populations end up having very different experiences.

Second, I'd argue that a major reason specialized schools exist is not to help marginal kids do better but to allow superstar kids to do extraordinarily well. Stuy is famously referred to as a "haven for nerds" and like many top schools succeeds by virtue of giving driven and talented kids the opportunity and resources to do what they want. I imagine it'd be difficult to tease out (perhaps something geographic? I know a lot of kids from my neighborhood in the Bronx who went to Bronx Science despite getting into Stuy because it was much closer...) but I strongly suspect that the causal effect of going to the school is hugely nonlinear in ability.

Lastly, I'd say that even if we were to grant that the local treatment effect identified in the RD design reasonably proxied for the school's impact, test scores might not be the best place to look at outcomes. The specialized high schools are often touted as a means of leveling the playing field between poor (often immigrant) public school kids and rich private school ones. I suspect that if the outcomes of interest were not test scores but rather admission to elite colleges or wages in one's mid-20s, the results would be rather different.

In sum, the paper is super tightly identified, but given the populations they can plausibly claim to compare and the outcomes evaluated I'm not hugely surprised that the authors find little effect. My friends on Facebook and G+ who've been forwarding this to me can calm down. Unless they scored within 10% or so of the cutoff, in which case: sorry, guys, it was all for nothing...

* Full disclosure: I went to one of these schools (Stuyvesant) and *barely* made the cutoff, an experience that scared me into overperforming on standardized tests for the rest of my life.

8.02.2011

Infectious cancers, genotyping, and indirect extinction pressure

The cover story of last week's PNAS covers attempts by scientists to analyze the Tasmanian devil genome. The hope is that by doing so they might better understand the devils' vulnerability to the infectious cancer that's currently wiping them out at an alarming pace:
The Tasmanian devil (Sarcophilus harrisii) is threatened with extinction because of a contagious cancer known as Devil Facial Tumor Disease. The inability to mount an immune response and to reject these tumors might be caused by a lack of genetic diversity within a dwindling population. Here we report a whole-genome analysis of two animals originating from extreme northwest and southeast Tasmania, the maximal geographic spread, together with the genome from a tumor taken from one of them. A 3.3-Gb de novo assembly of the sequence data from two complementary next-generation sequencing platforms was used to identify 1 million polymorphic genomic positions, roughly one-quarter of the number observed between two genetically distant human genomes. Analysis of 14 complete mitochondrial genomes from current and museum specimens, as well as mitochondrial and nuclear SNP markers in 175 animals, suggests that the observed low genetic diversity in today's population preceded the Devil Facial Tumor Disease outbreak by at least 100 y. Using a genetically characterized breeding stock based on the genome sequence will enable preservation of the extant genetic diversity in future Tasmanian devil populations.
Devil facial tumor disease manifests itself in a fashion as horrific as one might guess, and is considered a major threat to the species' existence. Infectious cancers are more common outside of humans, though DFTD is still particularly odd: it isn't viral but rather consists of parasitic cells which spread from host to host via blood-to-blood contact. It closely resembles canine cancers that behave the same way, and is believed to originate in a subtype of neural cell called Schwann cells (want to scare a friend? show them the title to the Schwann cell Science paper and underline "clonally transmissible cancer").

DFTD is interesting to think about above and beyond its clinical oddness for the fact that it seems to be largely our fault: humans likely introduced the cancer in the first place via dogs (an invasive species in Tasmania), and are doubly at fault for having reduced the devil population (the 100 year old dip in genetic diversity mentioned above) thereby reducing genetic variation and the potential for resistance. Short of a human intervention soon (researchers are apparently trying to figure out why a very small number of females are partly immune), the species will likely go extinct.