Weak labor markets make army-building cheaper

It's generally accepted that civil conflict is more likely when a country's economic growth slows down (see here).  One hypothesis to explain this phenomenon is that when economies perform poorly, soldiering becomes a relatively more appealing job and attracts individuals who might otherwise work elsewhere.  The argument goes that this makes it cheaper for opposition groups to build forces to fight an established government, thereby making conflict more likely.

In what seems like support for this idea (albeit in a different context), I heard this story reporting that these labor market dynamics are readily apparent in the United States and are accepted wisdom at the Pentagon:
"A recession really does make recruiting less challenging than it otherwise would be," says Dr. Curtis Gilroy, who oversees active duty military recruiting nationwide. "We're in our third year in which all active duty services have achieved their numerical recruiting goals and either met or exceeded their recruit quality benchmarks as well." 
So the military gains not just more recruits, but better ones. Test scores are up along with the number of recruits who graduated high school. Today the military is letting in fewer recruits with waivers for minor criminal histories or past drug use. 
For all services, the quality of recruits is the highest it's been in nearly two decades. This, even as the nation is at war — and the risks of military service are as clear as ever.
Fraction of recruits graduating from high school, a measure of "recruit quality" used by the military.


Forecasts of the Somali famine were ignored

I heard this interview by Neal Conan about the Somali famine today (transcript here).  The excerpt below caught my attention: apparently ENSO forecasts were used by people in the humanitarian agencies to predict declines in the food supply, but the early warning system was ignored by donors. [If anyone knows about this, I want to learn more about what happened.]
CONAN: The World Food Program aid is going to those [Kenyan refugee] camps. Is it getting into Somalia? 
QUIST-ARCTON: Oh, it arrived in Mogadishu today, the Somali capital, because apart from the Somalis who were fleeing the drought-hit areas into neighboring countries, as we said into Kenya and Ethiopia, many thousands are also heading towards Mogadishu, the capital, in search of food, shelter and medical care. And that is where the U.N. World Food Program has started sending food. 
It's also going to make similar deliveries, we're told, towards the Ethiopian border, hoping, of course, to get into the zones that have been declared famine areas. But many people are saying too little, too late. This drought was predicted up to two years ago. How come the global community has waited to the last minute, has waited, as usual, they say, until they see emaciated children in images flashed all over the world before they act? 
CONAN: The drought conditions are part of a cycle called La Nina. It's a reflection of the cycles of weather that come from the Pacific Ocean. And as you say, they were predicted two years ago. Do aid officials have any explanation when people say why didn't you have supplies here already? 
QUIST-ARCTON: Well, they say they have been talking about it. They sounded the alarm. There's an early warning system and that it's not just now that they have been talking about it. But they - they say they have a shortfall as we speak of one and a half billion dollars. 
Not only for the emergency. They say the problem is when it reaches this emergency stage, it costs a lot more than if people are fed where they are, in Somalia, before things reach crisis level. So the humanitarian agencies are saying, look, we did our part. We sounded the alarm. But you know, we need the response from the U.N. member states and others to help these people before things reach an emergency, which of course they have done.



People often ask me about using different climate data sources for social science research. Reanalyses.org is a new wiki site that will be a useful resource for people looking to use reanalysis data (see this RealClimate post for a short explanation of what reanalysis is and why this site is new/useful).

For social scientists looking at climate impacts, it's probably worth considering using reanalysis data.

First, if you work on problems in low income countries, there may be a relatively thin observational record in your region of interest (see here and here).  Using reanalysis helps interpolate between existing observations using a physical model, as opposed to using only statistical techniques. 

Second, the record of weather observations (again, particularly in low income countries) is influenced by social events (again, see here).  If you are a researcher concerned about endogeneity, this may be a serious issue.  For example, Vecchi and Knutson (2008) document the endogeneity of hurricane observations (and I can say with certainty that endogenous weather observations influence climate impact estimates because I have a paper showing it).  My guess is that reanalyses can help mitigate this issue somewhat, since human observations are balanced against model estimates.  But, as Gavin and others have pointed out, reanalyses are still strongly influenced by the observational record, so it is unlikely that using them can completely fix this issue.


Hyperinflation and the birth of the real

NPR has a fascinating piece on how the Brazilian real (plural reais) was created in the early nineties to end hyperinflation. In short: it was originally a non-circulating unit of account meant to stabilize price expectations:
"We called it a Unit of Real Value — URV," Bacha says. "It was virtual; it didn't exist in fact."

People would still have and use the existing currency, the cruzeiro. But everything would be listed in URVs, the fake currency. Their wages would be listed in URVs. Taxes were in URVs. All prices were listed in URVs. And URVs were kept stable — what changed was how many cruzeiros each URV was worth.

Say, for example, that milk costs 1 URV. On a given day, 1 URV might be worth 10 cruzeiros. A month later, milk would still cost 1 URV. But that 1 URV might be worth 20 cruzeiros.

The idea was that people would start thinking in URVs — and stop expecting prices to always go up.

"We didn't understand what it was," says Maria Leopoldina Bierrenbach, a housewife from Sao Paulo. "I used to say it was a fantasy, because it was not real."
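The indexing arithmetic described above is simple enough to sketch. This is just an illustration of the mechanism, with the price taken from the milk example in the quote and the monthly conversion rates made up for the sake of the example:

```python
# Sketch of the URV mechanism: prices are quoted in a stable unit of
# account (the URV), while the circulating currency (the cruzeiro)
# keeps inflating. The URV price stays fixed; only the cruzeiro/URV
# conversion rate moves.

milk_price_urv = 1.0  # stays fixed month to month

# Hypothetical cruzeiro-per-URV rates, rising with inflation
cruzeiros_per_urv = {"month 1": 10.0, "month 2": 20.0}

for month, rate in cruzeiros_per_urv.items():
    cruzeiro_price = milk_price_urv * rate
    print(f"{month}: milk = {milk_price_urv:.0f} URV "
          f"= {cruzeiro_price:.0f} cruzeiros")
```

The point of the design is visible in the output: the URV price never changes, so the number people anchor their expectations to is stable even while the cruzeiro price doubles.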
Yes, there are other ways of ending hyperinflation, but you have to say this one has its special charm.


Arctic sea ice on track for a new record minimum

One of the research areas in which I work is how humans deal with new information about risk, especially climate risk. Thus I was very interested (and, of course, dismayed) to hear that we're currently on track for a record low in Arctic sea ice coverage.

The red line is us right now (2011); note that it's currently tracking at or below the record minimum set in 2007. The decline in Arctic sea ice cover is widely considered to be one of the strongest pieces of evidence we have that we're dangerously altering the climate, though it also has its own more immediately concerning problems.

There's a discussion thread hosted by our friends at uber-climate blog Real Climate if you're so inclined, and a general-interest article about the decline and its relevance to non-climatologists on Andrew Freedman's Washington Post blog here.


Earth Institute online stuff

See the EI blog "Vanishing Tropical Glaciers"
I recently had the pleasure of working with Kevin Krajick and David Funkhouser at the Earth Institute (EI) and they pointed me to some useful EI resources on how to effectively discuss science with the media.

Another interesting find is the EI's impressive blog, State of the Planet.  It has a steady stream of solid coverage of global events, general issues and ongoing research at the EI (for example, check out this sequence of posts about the Arctic Switchyard Project).

As Jesse always says, check it out.


Some economics of digitized data

[this is a guest post by Kyle Meng]

We often pride ourselves for living in the age of information. The internet and its many services have made available to us a wealth of information and data with unprecedented ease of access. An important question, however, is whether we're gaining access to accurate information or the most useful information.

In a recent article published in Climatic Research (and reviewed HERE), researchers estimate that some 80% of all climate data collected has yet to be digitized. Much of this is data collected in the pre-industrial era, and some of it covers parts of the world we still don't know much about climatically (e.g., Africa). That's tremendously valuable information. When we think about anthropogenic climate change, one of the best ways to understand impacts in the future is to look to the past. The further back we can go, the better we can account for natural variation in the Earth's climatic system, and thus the better we can understand how societies respond to such changes. The social benefit is clearly there, and so too is the classic market failure.

So where is the private provision of this information? Take Google, for example. Of all companies, Google, which tasks itself with cataloging and making accessible the world's information, seems the most likely candidate for digitizing climate information. Yet if one looks at Google's efforts in digitizing information - from books to locations to artwork - it's pretty clear that there's a profit motive (or more specifically an advertising motive) behind most of it. What would be the private returns of digitizing climate data? Zilch, unless you count the handful of academics that would use the data - but that's just a handful of AdSense clicks from poor researchers.

It's a classic case for government funding. This is information that could change the way we think about anthropogenic climate change. It's out there, and yet there are few incentives or efforts to digitize it (with some notable exceptions). Of the deluge of information that's out there, shouldn't this rank higher than the latest celebrity tweet?

note: figure from Brohan P (2009) Why historical climate and weather observations matter? In: ACRE data and data visualisation meeting. Exeter Meeting, 15 September 2009. Available at: www.met-acre.org/meetings-and-workshops-1/acre-dataand-visualisation-meeting


If you're in the continental US this week you've no doubt experienced one of the worst heat waves in a while. It's hitting 100 (Fahrenheit, sorry) in New York today, and the same air mass broke records across the country as it slid east. With that in mind, here are a few explanatory science links to get you through your needlessly hot Friday afternoon:


The impact of piracy on general circulation models

Sometimes the interdisciplinary research grant proposals write themselves. From last week's EOS:
Pirate Attacks Affect Indian Ocean Climate Research
Pirate attacks in the Gulf of Aden and the Indian Ocean off the coast of Somalia nearly doubled from 111 in 2008 to 217 in 2009 [International Maritime Bureau, 2009, 2010]. Consequently, merchant vessel traffic in the area around Somalia significantly decreased. Many of these merchant vessels carry instruments that record wind and other weather conditions near the ocean surface, and alterations in ship tracks have resulted in a hole sized at about 2.5 million square kilometers in the marine weather–observing network off the coast of Somalia.
The data void exists in the formation region of the Somali low-level jet, a wind pattern that is one of the main drivers of the Indian summer monsoon. Further, a stable, multidecadal record has been interrupted, and consequently, long-term analyses of the jet derived from surface wind data are now showing artificial anomalies that will affect efforts by scientists to identify interannual to decadal variations in the climate of the northwestern Indian Ocean.
Link to abstract (full article is behind paywall). For more Fight Entropy posts on piracy click here. Note that this relationship has serious endogeneity issues.

List of OpenCourseWare

I've used MIT's OpenCourseWare multiple times in the past few years and liked it a lot, which is why I was particularly happy to stumble across this article on the life-hacking site Marc and Angel. It has a long list of university OpenCourseWare sites *as well* as a fairly exhaustive list of other online educational resources, from Duke's Law Center for the Public Domain to Google's Code University to Science.gov.

Check it out.


Make better graphs

I rarely actually finish reading books. But when I do it means they're useful and interesting, so I'll tell you when that happens.

I finally finished Tufte's famous The Visual Display of Quantitative Information (here on Amazon).  There are gazillions of reviews online, so I'll be brief.

If you are an applied researcher and you plan on making your living doing statistics or theory and writing papers, ask each of your colleagues for $1 so you can buy and read this book.  They will gladly pay up, since it will make deciphering your future papers much more pleasant.

The book is overpriced and Tufte's writing style is patronizing. But the book is still worth reading, and if I ever teach a stats class, it will be assigned.  If the book does nothing else, it brings together a wide array of statistical graphics from various fields, graphs I never would have found on my own, and critiques them.  The critiques sound like the kind of thing that photographers, artists and designers do for each other all the time.  But it's uncommon for us stats-types to ever really talk about the organization and aesthetics of our graphs. I was surprised how useful the exercise was for me.

Tufte is a designer, and he gives his rules of thumb for graphs names that only a designer would come up with.  But he's not dumb, and this is a book for applied statisticians.  He doesn't do any fancy math, but he assumes you know what variance, a marginal distribution and r-squared are on page 2.

If you pride yourself on being a pragmatist with no taste for unnecessary glitz, you may be surprised to find that Tufte-the-designer is on your side.  All of his tools and tips help you focus on the data and avoid unnecessary decorations in your graphs.  He even critiques several of the "recommended graph styles" used in top journals as being over-drawn.

Recommendation: buy the book and read it.  It's easy to read and will make all of your future work more valuable.  Constructing a single smoking-gun graph as the centerpiece of each paper/presentation is one of the most useful things you can do as a researcher.  And this book will help you get there.  I didn't like all of his advice, but I'm certain that he altered how I approach my own graphs (and I'm someone who already spent a lot of time thinking about their design).

Besides, even if you end up hating the book, at least you can appreciate its beauty: it's probably the only statistics book that I would consider leaving out on the coffee table when guests are over.


Important questions vs. interesting questions

In a recent conversation over drinks with a colleague, we got into the classic debate of whether researchers should select interesting questions or important questions. There are lots of "research advice" documents out there discussing this tradeoff (e.g., this one by Donald Davis, which DRDR recently pointed out), but I don't believe there really is a tradeoff per se.

Here's what I think my opinion is (for now, at least): there are many interesting but un-important questions but there are no important but un-interesting questions.

If a question really is important for society, then it is by definition interesting.  If an important question remains unanswered, it is probably because it is difficult to answer.  But if it really is important, then there will be many people interested in its answer (so long as you can come up with a good one).  Clearly, this means the question is interesting.

I think there are two reasons that important questions sometimes get mistakenly categorized as "uninteresting" by researchers. First, researchers sometimes have a strategic incentive to label a question they do not know how to answer as uninteresting, since it gives them justification for not investigating the problem.  Second, questions may be so difficult to answer that we assume any answer will be unsatisfying and thus uninteresting.  In both these cases, I think we confuse the difficulty of a problem with whether that problem is interesting. 

It's fine with me if some academics want to work on interesting but unimportant questions; I don't see any way to stop this from happening.  But I get angry when researchers thoughtlessly dismiss as uninteresting a question that they are willing to recognize is important.


Unmet need: daytime indoor lighting

One of the deeply humbling aspects of doing development-related work is the frequency with which one says "... I didn't even know that was a problem." Today's example: daytime indoor lighting solutions.
Isang Litrong Liwanag (A Liter of Light), is a sustainable lighting project which aims to bring the eco-friendly Solar Bottle Bulb to disprivileged communities nationwide. Designed and developed by students from the Massachusetts Institute of Technology (MIT), the Solar Bottle Bulb is based on the principles of Appropriate Technologies – a concept that provides simple and easily replicable technologies that address basic needs in developing communities.
First, a hole is cut in a corrugated iron sheet, and a one-liter plastic bottle that has been filled with water and about four teaspoons of bleach is inserted. A hole is cut in the house’s roof, the bottle is put in, and then the iron sheet is fixed to the roof with rivets and sealant.

“Unlike a hole in which the light will travel in a straight line, the water will refract it to go vertical, horizontal, 360 degrees of 55 watts to 60 watts of clear light, almost ten months of the year,” Diaz said.
Yes, it's a bottle of water-bleach solution that one sticks in a hole in one's ceiling. It solves a problem that's difficult to imagine until you've been in a dark shed during the middle of the day. It costs pennies. It's wildly popular. How can you not love it?

The Satellite Sentinel Project

While reading about the new nation of Southern Sudan, I noticed a citation to the Satellite Sentinel Project. I hadn't heard of this project, so I looked it up. It sounds like a great use of modern technology and I'll be interested to see if it's effective.  [Note to social scientists: evaluating its effectiveness seems like an excellent, albeit difficult, research topic.] From their site:

About the Satellite Sentinel Project 
George Clooney initiated the Satellite Sentinel Project (SSP) while on an October 2010 trip to Southern Sudan with Enough Project Co-founder John Prendergast. SSP combines satellite imagery analysis and field reports with Google's Map Maker technology to deter the resumption of war between North and South Sudan. The project provides an early warning system to deter full-scale civil war between Northern and Southern Sudan and to promote greater accountability for mass atrocities by focusing world attention and generating rapid responses on human rights and human security concerns. 
SSP was launched as a six-month pilot project on December 29, 2010, as the result of an unprecedented collaboration between Not On Our Watch, the Enough Project, Google, the United Nations UNITAR Operational Satellite Applications Programme (UNOSAT), DigitalGlobe, the Harvard Humanitarian Initiative, and Trellon, LLC. UNITAR/UNOSAT's role concluded when the pilot phase ended on June 30, 2011. 
The project works like this: Commercial satellites passing over the border of northern and southern Sudan are able to capture possible threats to civilians, observe the movement of displaced people, detect bombed and razed villages, or note other evidence of pending mass violence. 
Google and Trellon collaborated to design the web platform for the public to easily access the images and reports. Harvard Humanitarian Initiative provides system-wide research and leads the collection, human rights analysis, and corroboration of on-the-ground reports that contextualizes the satellite imagery. The Enough Project contributes field reports, provides policy analysis, and, together with Not On Our Watch, and our Sudan Now partners, puts pressure on policymakers by urging the public to act. DigitalGlobe provides satellite imagery and additional analysis. 
The Satellite Sentinel Project marks the first sustained, public effort to systematically monitor and report on potential hotspots and threats to security along a border, in near real-time (within 24-36 hours), with the aim of heading off humanitarian disaster and human rights crimes before they occur. 
Not On Our Watch -- co-founded by Don Cheadle, George Clooney, Matt Damon, Brad Pitt, David Pressman, and Jerry Weintraub -- has provided seed money to launch the project. 


Superman: A transitional energy source

Happy Summer Sunday afternoon from Fight Entropy, thanks to our friends at Saturday Morning Breakfast Cereal. Click through for the whole cartoon.

Weekend reading on the global environment

Trophic Downgrading of Planet Earth
Estes et al.
Science 2011

Until recently, large apex consumers were ubiquitous across the globe and had been for millions of years. The loss of these animals may be humankind’s most pervasive influence on nature. Although such losses are widely viewed as an ethical and aesthetic problem, recent research reveals extensive cascading effects of their disappearance in marine, terrestrial, and freshwater ecosystems worldwide. This empirical work supports long-standing theory about the role of top-down forcing in ecosystems but also highlights the unanticipated impacts of trophic cascades on processes as diverse as the dynamics of disease, wildfire, carbon sequestration, invasive species, and biogeochemical cycles. These findings emphasize the urgent need for interdisciplinary research to forecast the effects of trophic downgrading on process, function, and resilience in global ecosystems.

A Large and Persistent Carbon Sink in the World’s Forests
Pan et al.
Science 2011

The terrestrial carbon (C) sink has been large in recent decades, but its size and location remain uncertain. Using forest inventory data and long-term ecosystem C studies, we estimated a total forest sink of 2.4 ± 0.4 Pg C yr–1 globally for 1990–2007. We also estimated a source of 1.3 ± 0.7 Pg C yr–1 from tropical land-use change, consisting of a gross tropical deforestation emission of 2.9 ± 0.5 Pg C yr–1 partially compensated by a C sink in tropical forest regrowth of 1.6 ± 0.5 Pg C yr–1. Together, the fluxes comprise a net global forest sink of 1.1 ± 0.8 Pg C yr–1, with tropical estimates having the largest uncertainties. This forest sink is equivalent in magnitude to the terrestrial sink deduced from fossil fuel emissions and constraints of ocean and atmospheric sinks.
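The fluxes quoted in the Pan et al. abstract are internally consistent, which is worth checking when numbers with uncertainties get combined. The sketch below combines the stated uncertainties in quadrature, assuming independence; that independence is my assumption for illustration, not necessarily how the authors propagated their errors:

```python
from math import sqrt

# Forest carbon fluxes from the abstract above (Pg C per year).
total_sink, total_sink_err = 2.4, 0.4          # global forest sink
gross_deforestation, gross_err = 2.9, 0.5      # tropical deforestation emission
regrowth, regrowth_err = 1.6, 0.5              # tropical regrowth sink

# Net tropical land-use source: 2.9 - 1.6 = 1.3
tropical_source = gross_deforestation - regrowth
tropical_err = sqrt(gross_err**2 + regrowth_err**2)  # ~0.7

# Net global forest sink: 2.4 - 1.3 = 1.1
net_sink = total_sink - tropical_source
net_err = sqrt(total_sink_err**2 + tropical_err**2)  # ~0.8

print(f"tropical source: {tropical_source:.1f} +/- {tropical_err:.1f}")
print(f"net global sink: {net_sink:.1f} +/- {net_err:.1f}")
```

Both derived numbers match the abstract's 1.3 ± 0.7 and 1.1 ± 0.8, so the quoted figures hang together under this simple treatment of the errors.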


The Psychology of Climate Change Communication

This looks useful: The Psychology of Climate Change Communication: A Guide for Scientists, Journalists, Educators, Political Aides, and the Interested Public

Download a PDF of the book for free here.

The book is put together by a group here at Columbia: the Center for Research on Environmental Decisions (CRED).  I saw a flyer for it today while at a seminar they were hosting on rainfall index insurance. Here's what J. Sachs has to say about it:

“The ultimate solutions to climate change are workable, cost-effective technologies which permit society to improve living standards while limiting and adapting to changes in the climate. Yet scientific, engineering, and organizational solutions are not enough. Societies must be motivated and empowered to adopt the needed changes.  
For that, the public must be able to interpret and respond to often bewildering scientific, technological, and economic information. Social psychologists are aware, through their painstaking scientific research, of the difficulties that individuals and groups have in processing and responding effectively to the information surrounding long-term and complex societal challenges.  
This guide powerfully details many of the biases and barriers to scientific communication and information processing. It offers a tool—in combination with rigorous science, innovative engineering, and effective policy design—to help our societies take the pivotal actions needed to respond with urgency and accuracy to one of the greatest challenges ever faced by humanity: global-scale, human-induced environmental threats, of which the most complex and far reaching is climate change.”  
—Jeffrey Sachs, Director, The Earth Institute, Columbia University


Complementarity in economic development policies

[This is a guest post by Anna Tompsett.]
If you work in development, or think or read about it, you’ll be familiar with the idea of complementarity.  You may not have called it by that name, but you’re sure to be familiar with the idea: that a package of interventions can be much more effective than the interventions on their own.
For example, if there is no road access to a village, then people inside can’t travel to access healthcare, teachers can’t get to schools in order to teach and farmers can’t get their goods to market.  Improving the clinic in the nearby town, or paying teachers extra to show up on time, or creating a mobile phone price information system, has little impact, because of the constraints of the existing infrastructure.  On the other hand, building the road doesn’t, in itself, improve the clinic, change incentives for teachers, or resolve agricultural market failures.  

Some schoolgirls in a very isolated area of Nigeria (Dadiya Hills, Gombe State) looking out across the valley.  You can't see the road, because there's a two-hour walk and a thigh-deep river in between, but that's kind of the point.

It’s a critically important idea in policy; it has informed the Millennium Development Goals, and it’s a large component of the philosophy behind the Millennium Villages (love them or hate them).  Yet we have staggeringly little evidence for how important these complementarities really are.  If they are significant, they could cause us to systematically underestimate how effective development interventions could be in conjunction when we assess them in isolation.

However, there’s a very good reason why there is little robust evidence on their magnitudes; it’s hard enough to design an effective field test, or find a natural experiment, when you’re interested in a single intervention or policy.  And the size of your required sample increases with the square of the number of interventions whose interactions you would like to measure.  (Anecdotal evidence suggests that the amount of luck required to find natural experiments may even increase with the cube of that number.)
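One way to see the combinatorics behind that claim (my own back-of-the-envelope framing, not from any particular study): with k binary interventions, a full factorial design has 2^k treatment cells, and the number of pairwise interactions you might want to estimate grows as k(k-1)/2, i.e., roughly with the square of k. Holding the number of subjects per cell fixed at some hypothetical power requirement, the required sample explodes quickly:

```python
from math import comb

# Illustrative combinatorics of a full factorial experiment with k
# binary interventions: 2**k treatment cells, C(k, 2) pairwise
# interactions. The 100 subjects per cell is a made-up power target.
subjects_per_cell = 100

for k in range(1, 6):
    cells = 2 ** k                # every on/off combination gets a cell
    pairwise = comb(k, 2)         # two-way interactions to estimate
    print(f"{k} interventions: {cells:3d} cells, "
          f"{pairwise:2d} pairwise interactions, "
          f"n >= {cells * subjects_per_cell}")
```

With five interventions you already need 32 cells, which is why evidence on interaction effects between development interventions is so thin relative to evidence on single interventions.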

Every time I get disheartened by these odds, however, I read an article like this one, and I'm reminded of why I shouldn't give up looking for a context in which to study this issue with the rigour it deserves.


Quotes for the discouraged researcher

Research is hard.  And keeping oneself inspired to keep at it is even harder.  These are some favorites that keep me going.  (If these aren't enough, see Xavier Sala-i-Martin's hilarious collection of rejected ideas.)

"Somewhere, something incredible is waiting to be known."
- Carl Sagan

"The significant problems we face cannot be solved at the same level of thinking we were at when we created them."
- Albert Einstein

"I have not failed. I have just found 10,000 ways that won't work."
-Thomas Edison on attempting to make an incandescent light bulb

"Good and bad research take the same amount of time."
- Larry Summers

"If you want to make incremental contributions to the literature in your field — let us say, if you want to generalize an existence theorem by relaxing the condition of semi-strict quasi-concavity to one of mere hemi-demi-proper pseudoconcavity — then stick to the technical journals. If you want to change your field in more fundamental ways, then obtain your primary motivation from life, and use it to look for fundamental shortcomings of previous thinking in the field."
- Avinash Dixit describing Thomas Schelling after he won the Nobel Prize
(I owe this one to Matt Notowidigdo)

"He well knows what snares are spread about his path, from personal animosity... and possibly from popular delusion.  But he has put to hazard his ease, his security, his interests, his power, even his... popularity.... He is traduced and abused for his supposed motives.  He will remember that obloquy is a necessary ingredient in the composition of all true glory: he will remember...that calumny and abuse are essential parts of triumph.... He may live long, he may do much.  But here is the summit.  He never can exceed what he does this day."
-Edmund Burke's eulogy of Charles James Fox for his attack upon the tyranny of the East India Company, House of Commons, December 1, 1783
(I owe this one to JFK)

"Good sense is, of all things among men, the most equally distributed; for every one thinks himself so abundantly provided with it, that those even who are the most difficult to satisfy in everything else, do not usually desire a larger measure of this quality than they already possess. And in this it is not likely that all are mistaken: the conviction is rather to be held as testifying that the power of judging aright and of distinguishing truth from error, which is properly what is called good sense or reason, is by nature equal in all men; and that the diversity of our opinions, consequently, does not arise from some being endowed with a larger share of reason than others, but solely from this, that we conduct our thoughts along different ways, and do not fix our attention on the same objects. For to be possessed of a vigorous mind is not enough; the prime requisite is rightly to apply it. The greatest minds, as they are capable of the highest excellences, are open likewise to the greatest aberrations....
I will not hesitate, however, to avow my belief that it has been my singular good fortune to have very early in life fallen in with certain tracks which have conducted me to considerations and maxims, of which I have formed a method that gives me the means, as I think, of gradually augmenting my knowledge, and of raising it by little and little to the highest point which the mediocrity of my talents and the brief duration of my life will permit me to reach."
-Rene Descartes, Discourse on Method, 1637

Doctors Without Borders: lessons learned

This looks like a really interesting new publication by the Nobel Prize-winning organization MSF. Download the book in PDF form for free here.
"Medical Innovations in Humanitarian Situations explores how the particular style of humanitarian action practiced by Doctors Without Borders/Médecins Sans Frontières (MSF) has stayed in line with the standards in scientifically advanced countries while also leading to significant improvements in the medical care delivered to people in crisis. 
Through a series of case studies, the authors reflect on how medical aid workers dealt with the incongruity of practicing conventional evidence-based medicine in contexts that require unconventional approaches."


Rare earth metals in the deep ocean

Nature Geoscience just published a letter by several scientists from Tokyo University suggesting that rare earth metals are actually not that rare within deep sea sediment beds. Here's the abstract:
World demand for rare-earth elements and the metal yttrium—which are crucial for novel electronic equipment and green-energy technologies—is increasing rapidly. Several types of seafloor sediment harbour high concentrations of these elements. However, seafloor sediments have not been regarded as a rare-earth element and yttrium resource, because data on the spatial distribution of these deposits are insufficient. Here, we report measurements of the elemental composition of over 2,000 seafloor sediments, sampled at depth intervals of around one metre, at 78 sites that cover a large part of the Pacific Ocean. We show that deep-sea mud contains high concentrations of rare-earth elements and yttrium at numerous sites throughout the eastern South and central North Pacific. We estimate that an area of just one square kilometre, surrounding one of the sampling sites, could provide one-fifth of the current annual world consumption of these elements. Uptake of rare-earth elements and yttrium by mineral phases such as hydrothermal iron-oxyhydroxides and phillipsite seems to be responsible for their high concentration. We show that rare-earth elements and yttrium are readily recovered from the mud by simple acid leaching, and suggest that deep-sea mud constitutes a highly promising huge resource for these elements.
Given the increasing (and admittedly often rather hyperbolic) concern about China's control over the world's rare-earth metal supply (e.g., this Bloomberg article from last month), this is some pretty fascinating news.


Asymmetric citation behavior

Our colleague James Rising indirectly pointed me to a PNAS paper from 2008 by Rosvall and Bergstrom that provides a very interesting map of citation structure across the entirety of the sciences [click through for larger versions]:

As noted in the lower right, arrows convey intensity of citation from one field to the next. The structure is very similar to that revealed by eigenfactor.org's plots (previously blogged here; apparently at least one of the authors is involved in the eigenfactor project) albeit with some additional and rather useful dimensionality.

Of particular interest is the social sciences, which Rosvall and Bergstrom map out in detail:

Note how high the flow intensity within economics (circle darkness) is compared to the flow out of it (circle border), as well as the patterns relative to other fields, almost all of which cite economics more than economics cites them. The pattern is similar, though not quite as extreme, for psychology. One could imagine a variety of reasons for this (rigor of work, differences in strategic citation behavior, range of topics studied, etc.), but it's a pretty nice bit of academic social-network structure to ponder on its own.


"We choose to go to the moon."

[This is a guest post by Kyle Meng.]
I woke up this morning to a radio story on NASA's last shuttle launch. Today's launch from Cape Canaveral marks the final flight for Atlantis, and indeed for our much-storied space shuttle program. The story made me think of a figure I saw a few months back in a presentation by the NYT reporter Andy Revkin. The figure, which I believe is from the White House's Office of Science and Technology Policy, shows the breakdown of non-defense R&D spending since the 1950s. What struck me when I first saw it was just how much was spent on the space race in the 1960s. In 2010 dollars, we were spending some $20–$30 billion a year to get ourselves onto the moon. I'm not sure how that compares to defense spending during the same period, but it certainly dwarfs all other public non-defense R&D expenditures back then and even today (with the exception of healthcare).

I'm not sure how best to think about this. President Kennedy's "we choose to go to the moon" speech certainly prompted this massive allocation of public funds, but at what cost and toward what benefit? Many argue that it's hard to place a dollar value on putting a man on the moon, or on beating the Soviet Union during the hottest years of the Cold War. There's also the argument that the program's achievements inspired a generation of scientists and engineers. Yet, despite all this, which I sincerely believe has real value and import, I can't help but wonder, given our space program today, its retired fleet of shuttles, and the very real technological challenges we face, whether that bulge of R&D funding might have yielded better returns had it been invested elsewhere over these past 50 years.

Technologists often argue that what we really need to address climate change is another Apollo project. Certainly, the magnitude of investment needed probably rivals that of the Apollo project. But is the Apollo project really the best example? Looking back in hindsight, I'm not so sure. I watched Atlantis' launch, and, as always, found it amazing to behold. But now what?



Eurekometrics

Samuel Arbesman and Nicholas Christakis of Harvard's Institute for Quantitative Social Science have a short and interesting article in the current issue of PLoS Computational Biology on the science of scientific discoveries. It's more of a position piece, observing that quantitative analysis of scientific discoveries is an emerging field, but it does include one lovely set of figures:

(A) Mean diameter (kilometers) of minor planets discovered, 1802–2008. (B) Mean body mass (g) of mammalian species discovered, 1760–2003. (C) Mean inverse of atomic weight of chemical elements discovered, 1669–2006.

The authors are open about the obvious problem with "eurekometrics," i.e., that scientific discoveries aren't easily systematically defined. That said, that sort of difficulty is pretty common in bibliometrics and scientometrics research, and trying to come up with quantifiable proxies for relevant concepts is part of the fun (and sometimes leads to fruitful metrics, e.g., the H index). All in all, the piece is short and worthwhile and ties in with a bunch of other interesting recent bibliometrics papers (e.g., Wuchty et al.'s Science paper on teams) for those who are inclined. Check it out.

Insurgency: an example of learning by doing

This paper (published this week in Science) seems to be an important contribution to our empirical understanding of civil conflicts.  To date, most work in the field has focused on the onset and duration of conflicts.  In contrast, this paper examines the rates at which specific conflicts escalate.

The extraordinarily interdisciplinary team (including but not limited to a political scientist, a physicist, a biologist and an engineer) makes the point that successful insurgent activity at a specific location has a non-stationary inter-arrival time that follows the approximate power law:
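The power-law relation itself appears to have been an image that did not survive here. Reconstructed from the published paper, it is approximately

τ_n = τ_1 · n^(−b),

where τ_n is the time interval between the (n−1)th and nth fatal attack in a given location, τ_1 is the interval between the first two attacks, and b is a location-specific "escalation rate."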
Basically, insurgents learn to fight more effectively over time. This may not be a surprising point, but the regularity of the statistical structure is remarkable.  Furthermore, insurgents in different locations exhibit different apparent rates of "learning."  The authors hypothesize that some of this heterogeneity can be explained by how the opposition to the insurgents (the occupying forces) learns to defend itself against insurgent attacks.

Pattern in Escalations in Insurgent and Terrorist Activity
Neil Johnson, Spencer Carran, Joel Botner, Kyle Fontaine, Nathan Laxague, Philip Nuetzel, Jessica Turnley, Brian Tivnan

Abstract: In military planning, it is important to be able to estimate not only the number of fatalities but how often attacks that result in fatalities will take place. We uncovered a simple dynamical pattern that may be used to estimate the escalation rate and timing of fatal attacks. The time difference between fatal attacks by insurgent groups within individual provinces in both Afghanistan and Iraq, and by terrorist groups operating worldwide, gives a potent indicator of the later pace of lethal activity.


Some shapefile utilities for Matlab (and data transfer to Stata)

Here are some extremely simple but very useful functions for working with shapefiles in Matlab.

Functions to drop or keep polygons in your workspace based on their attributes (drop_by_attribute.m, keep_by_attribute.m).

A simple command to plot all your polygons (plot_districts.m).

A Matlab function to merge attributes (combine_attributes.m) and an accompanying Stata function to "un-merge" those attributes (break_var.ado).  This is particularly useful if you want a single attribute that uniquely identifies a polygon (e.g., combining State and County into a single attribute State_County), which you use when you do some calculations in Matlab (e.g., compute average temperature by State_County) and export to Stata (using my scripts here), where you then want to recover the original distinct attributes (e.g., State and County) for statistical analysis.
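The Matlab and Stata scripts themselves aren't reproduced here, but the combine/un-merge pattern is simple enough to sketch in a few lines of Python (function names and details are my own illustration, not the actual scripts):

```python
SEP = "_"  # separator; must not appear inside the attribute values themselves

def combine_attributes(record, fields, new_field):
    """Concatenate several attribute values into one unique identifier,
    e.g. {'State': 'CA', 'County': 'Yolo'} -> 'CA_Yolo' under 'State_County'."""
    record = dict(record)  # don't mutate the caller's record
    record[new_field] = SEP.join(str(record.pop(f)) for f in fields)
    return record

def break_var(record, combined_field, fields):
    """Invert combine_attributes: split 'CA_Yolo' back into
    the original State and County attributes."""
    record = dict(record)
    parts = record.pop(combined_field).split(SEP)
    record.update(zip(fields, parts))
    return record
```

The round trip recovers the original attributes exactly, which is the point of pairing combine_attributes.m with break_var.ado: the combined key survives the Matlab-to-Stata transfer, and the distinct fields are rebuilt on the other side.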


The Momentum Externality

I liked this new NBER working paper that was presented at the recent Stanford meeting.
Pounds that Kill: The External Costs of Vehicle Weight 
Michael Anderson, Maximilian Auffhammer 
NBER Working Paper No. 17170 
Heavier vehicles are safer for their own occupants but more hazardous for the occupants of other vehicles. In this paper we estimate the increased probability of fatalities from being hit by a heavier vehicle in a collision. We show that, controlling for own-vehicle weight, being hit by a vehicle that is 1,000 pounds heavier results in a 47% increase in the baseline fatality probability. Estimation results further suggest that the fatality risk is even higher if the striking vehicle is a light truck (SUV, pickup truck, or minivan). We calculate that the value of the external risk generated by the gain in fleet weight since 1989 is approximately 27 cents per gallon of gasoline. We further calculate that the total fatality externality is roughly equivalent to a gas tax of $1.08 per gallon. We consider two policy options for internalizing this external cost: a gas tax and an optimal weight-varying mileage tax. Comparing these options, we find that the cost is similar for most vehicles.
And the important number from the intro (it was left out of the abstract):
When we translate this higher probability of a fatality into external costs (relative to a small baseline vehicle), the total external costs of vehicle weight from fatalities alone are estimated at $93 billion per year
The focus of the paper's second half is on the design of policies that might correct this externality, which I really like because it makes the results instantly usable.

One thing that I think could enrich the paper further is if the authors discussed this result with some attention to the physics of inelastic collisions.  One reason this might be useful is that it immediately points to a second policy option that is not addressed in the paper: changing speed limits.  In a collision, it's the momentum of the colliding cars that matters if we're thinking about the amount of kinetic energy available to kill the people in the cars.  As the paper rightly points out, momentum (and thus available kinetic energy) increases with vehicle mass.  But momentum is the product of mass and velocity.  So we could, in theory, maintain fatality rates while increasing average vehicle weight so long as we decreased speed limits.  The effect of vehicle speed on accident fatalities (via its influence on momentum) was shown in Ashenfelter and Greenstone's JPE paper.  I'm not sure Americans would prefer to drive slower in bigger cars, but I think it's worth pointing out that there is a tradeoff between mass and velocity (when one is talking about fatality risks).  These two papers are looking at two sides of the same coin: momentum.
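To make the mass-velocity trade-off concrete, here's a quick back-of-the-envelope sketch (my own illustration; the numbers are invented and come from neither paper):

```python
MPS_PER_MPH = 0.44704  # meters per second in one mile per hour

def momentum(mass_kg, speed_ms):
    """Linear momentum p = m * v, in kg*m/s."""
    return mass_kg * speed_ms

def kinetic_energy(mass_kg, speed_ms):
    """Kinetic energy E = 0.5 * m * v^2, in joules."""
    return 0.5 * mass_kg * speed_ms ** 2

def equal_momentum_speed(mass1_kg, speed1_ms, mass2_kg):
    """Speed at which a vehicle of mass2 carries the same momentum
    as a vehicle of mass1 moving at speed1: v2 = m1 * v1 / m2."""
    return mass1_kg * speed1_ms / mass2_kg
```

For example, a roughly 1,360 kg (3,000 lb) car at 65 mph carries the same momentum as a roughly 1,815 kg (4,000 lb) car at about 49 mph. Incidentally, since kinetic energy at fixed momentum is p²/(2m), the heavier-but-slower vehicle actually arrives with somewhat less kinetic energy, which if anything strengthens the case that lower speed limits can offset heavier fleets.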