Hurricanes and the social safety net in US counties

The social safety net catches people after a hurricane, but this cost to society is generally not accounted for in standard estimates of a hurricane's economic impact.

The Role of Transfer Payments in Mitigating Shocks: Evidence from the Impact of Hurricanes
Tatyana Deryugina
Abstract: Little is known empirically about how aggregate economic shocks are mitigated by social safety nets. I examine the effect of hurricanes on US counties. While I find no significant changes in population, earnings, and the employment rate 0-10 years after landfall, there is a substantial increase in non-disaster government transfers. An affected county receives additional non-disaster government transfers totaling $654 per capita, which suggests that the lack of changes in basic economic indicators may be in part due to existing social safety nets. The fiscal costs of natural disasters are also much larger than the cost of disaster aid alone.


Deryugina writes:
The number of construction firm locations (establishments) declines by 1.6% each year with no change in the mean. Construction employment is on average 7.6% lower in the ten years following the hurricane, and declines by 2.0% per year. The overall decline in employment suggests a drop in construction demand. This is confirmed by estimates of per capita single family housing starts, which are 8% lower on average. Wages increase by an average of 6.8%, but then fall by 0.9% each year, suggesting there may be a temporary change in the composition of construction labor demand (e.g., more demand for specialized workers) or lower labor supply… 
One possible interpretation of the decline in the local construction sector is spatial: the construction industry may have simply moved to nearby counties without any net effect on the sector. The implications of spatial changes, while non-trivial for the local economy, are different than if there’s a widespread capital shock. However, the fall in per capita housing starts provides evidence of a significant decrease in construction demand. Thus, the downturn in the local construction sector is not solely driven by spatial shifts in construction activity.
There is no change in the employment rate or per capita net earnings. Using 95% confidence bounds, I can rule out a decrease in mean earnings greater than 1.8% and a decrease in the mean employment rate greater than 0.5%. The mean shift test for transfers indicates a 2.1% average increase in per capita government to individual transfers, equivalent to about $69 per person per year. Per capita business to individual transfers in the eleven years following the hurricane are estimated to be 4.8% higher than the pre-hurricane transfers, or about $3.9 per year. There are no significant changes in the trends of any of these variables. Assuming a 3% discount rate, the present discounted value (PDV) of all government transfers is about $654 per capita, and the PDV of transfers from businesses is $37 per capita. Thus, post-hurricane transfers from general social programs are larger than transfers from disaster-specific programs and much larger than insurance payments. Because the non-disaster transfers are still significantly larger 10 years after the hurricane, the estimate of $654 per capita should be viewed as a lower bound.

The subcomponents of total government transfers to individuals are: retirement and disability insurance benefits (which includes workers’ compensation), medical benefits (which includes Medicare and Medicaid), income maintenance (which includes Supplemental Security Income, family assistance, and food stamps), unemployment benefits, veterans’ benefits, and federal education assistance. A separate analysis of each of these components (following the same procedure as for total transfers) reveals that increases in medical and unemployment benefits explain the overwhelming majority of the net increase in total non-disaster transfers. Specifically, public medical benefits increase significantly by $435 per capita in PDV, of which $106 is Medicare spending. The estimated change in Medicare spending is not significant. Because there is no significant increase in Medicare spending, the increase in public medical spending is likely due to changes in the number of people eligible for public medical benefits rather than increased medical spending on existing recipients.
Unemployment benefits increase by about $280 per capita in PDV. There is no significant change in aggregate income maintenance (although some subcomponents, such as family assistance, do increase slightly) and no significant change in retirement and disability insurance benefits, per capita federal education assistance, or per capita veteran benefits. Thus, the majority of the increase in transfers is accounted for by unemployment insurance and public medical benefits.
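As a rough sanity check on the headline number: the $654 figure is consistent with discounting a flat stream of about $69 per capita per year over the eleven years following landfall at 3%. This is my own back-of-envelope sketch, which assumes a constant annual transfer rather than the paper's actual year-by-year estimates:

```python
# Back-of-envelope check of the quoted PDV figure (my sketch, not the
# author's calculation). Assumes a flat extra transfer of $69 per capita
# per year in years 0 through 10 after landfall, discounted at 3%.
annual_transfer = 69.0   # dollars per capita per year (from the quoted estimate)
discount_rate = 0.03
years = range(11)        # years 0 through 10 after landfall

pdv = sum(annual_transfer / (1 + discount_rate) ** t for t in years)
print(f"PDV of extra transfers: ${pdv:.0f} per capita")
```

This gives about $658; the small gap from the reported $654 is expected, since the actual year-by-year estimates are not a flat $69.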

Access to hardened infrastructure and hurricane mortality

I just happen to be working on a review of hurricanes' socio-economic impacts right now, and since there seems to be widespread interest in these things at the moment, I'm just going to post some of the more interesting/important papers as I go. The figure from this 1993 paper is compelling (although I think a brace would be clearer than the arrow).

Risk factors for mortality in the Bangladesh cyclone of 1991
C. Bern, J. Sniezek, G.M. Mathbor, M.S. Sidiqi, C. Ronsmans, A.M.R. Chowdhury, A.E. Choudhury, K. Islam, M. Bennish, E. Noji, & R.I. Glass
Abstract: Cyclones continue to pose a dangerous threat to the coastal populations of Bangladesh, despite improvements in disaster control procedures. After 138 000 persons died in the April 1991 cyclone, we carried out a rapid epidemiological assessment to determine factors associated with cyclone-related mortality and to identify prevention strategies. A nonrandom survey of 45 housing clusters comprising 1123 persons showed that mortality was greatest among under-10-year-olds (26%) and women older than 40 years (31%). Nearly 22% of persons who did not reach a concrete or brick structure died, whereas all persons who sought refuge in such structures survived. Future cyclone-associated mortality in Bangladesh could be prevented by more effective warnings leading to an earlier response, better access to designated cyclone shelters, and improved preparedness in high-risk communities. In particular, deaths among women and under-10-year-olds could be reduced by ensuring that they are given special attention by families, neighbours, local authorities, and especially those in charge of early warnings and emergency evacuation.

From the results:

Type of housing and shelter-seeking activity were directly related to the risk of dying in the cyclone (Fig. 2). No deaths occurred among the 27 individuals (2%) who lived in pukka houses or remained in pukka public buildings for the duration of the cyclone. However, 1094 individuals (98%) were not in a safe shelter prior to the cyclone warning. In response to the warning, which most respondents reported hearing 3-6 hours prior to the storm surge, only 40 individuals (4%) sought and reached safe shelter. When the flood waters first reached the area 10-60 minutes before the storm surge, 151 persons (13%) were in safe shelter. In all, 385 persons (33%) had reached safe shelter by the moment of impact of the storm surge; none of these persons died. In contrast, of 736 persons at risk, 162 (22%; P < 0.0001) drowned in the flood waters.
Of 736 persons at risk at the time of the cyclone impact, 285 were swept away in the storm surge; of these, 112 (39%) died. Another 179 persons were able to float on some object, generally the thatch roof of their house; of these, 27 (15%) died. Mortality was 22% among those who sought high ground to escape the storm surge, and 11% among those who took refuge in trees.


Will Hurricane Sandy affect the US presidential election?

The literature says: yes. Now let's see if Ohio gets a disaster declaration...

The Political Economy of FEMA Disaster Payments
(non-paywalled working paper)
ABSTRACT: We find that presidential and congressional influences affect the rate of disaster declaration and the allocation of FEMA disaster expenditures across states. States politically important to the president have a higher rate of disaster declaration by the president, and disaster expenditures are higher in states having congressional representation on FEMA oversight committees. Election year impacts are also found. Our models predict that nearly half of all disaster relief is motivated politically rather than by need. The findings reject a purely altruistic model of FEMA assistance and question the relative effectiveness of government versus private disaster relief.

Myopic Voters and Natural Disaster Policy

ABSTRACT: Do voters effectively hold elected officials accountable for policy decisions? Using data on natural disasters, government spending, and election returns, we show that voters reward the incumbent presidential party for delivering disaster relief spending, but not for investing in disaster preparedness spending. These inconsistencies distort the incentives of public officials, leading the government to underinvest in disaster preparedness, thereby causing substantial public welfare losses. We estimate that $1 spent on preparedness is worth about $15 in terms of the future damage it mitigates. By estimating both the determinants of policy decisions and the consequences of those policies, we provide more complete evidence about citizen competence and government accountability.

Blind Retrospection: Electoral Responses to Drought, Flu, and Shark Attacks
Christopher H. Achen and Larry M. Bartels
ABSTRACT: Students of democratic politics have long believed that voters punish incumbents for hard times. Governments bear the responsibility for the economy in the modern era, so that replacing incompetent managers with capable alternatives appears to be a well-informed, rational act. However, this vision of a sophisticated retrospective electorate does not bear close examination. We find that voters regularly punish governments for acts of God, including droughts, floods, and shark attacks. As long as responsibility for the event itself (or more commonly, for its amelioration) can somehow be attributed to the government in a story persuasive within the folk culture, the electorate will take out its frustrations on the incumbents and vote for out-parties. Thus, voters in pain are not necessarily irrational, but they are ignorant about both science and politics, and that makes them gullible when ambitious demagogues seek to profit from their misery. Neither conventional understandings of democratic responsiveness nor rational choice interpretations of retrospective voting survive under this interpretation of voting behavior.  

Probabilistic forecast of direct damage from Hurricane Sandy

Posted at G-FEED.


Strategic contamination of science by industry

I found this gem in the American Journal of Public Health. If you apply a mental "find-replace" to any of many industries, this should sound familiar.  Brandt is a science historian, for any readers who might otherwise be skeptical.

Inventing Conflicts of Interest: A History of Tobacco Industry Tactics
Allan M. Brandt
ABSTRACT: Confronted by compelling peer-reviewed scientific evidence of the harms of smoking, the tobacco industry, beginning in the 1950s, used sophisticated public relations approaches to undermine and distort the emerging science. 
The industry campaign worked to create a scientific controversy through a program that depended on the creation of industry–academic conflicts of interest. This strategy of producing scientific uncertainty undercut public health efforts and regulatory interventions designed to reduce the harms of smoking.
A number of industries have subsequently followed this approach to disrupting normative science. Claims of scientific uncertainty and lack of proof also lead to the assertion of individual responsibility for industrially produced health risks.
Brandt writes:
By late 1953, the tobacco industry faced a crisis of cataclysmic proportions. Smoking had been categorically linked to the dramatic rise of lung cancer. Although health concerns about smoking had been raised for decades, by the early 1950s there was a powerful expansion and consolidation of scientific methods and findings that demonstrated that smoking caused lung disease as well as other serious respiratory and cardiac diseases, leading to death. These findings appeared in major, peer-reviewed medical journals as well as throughout the general media.
As a result, the tobacco industry would launch a new strategy, largely unprecedented in the history of US industry and business: it would work to erode, confuse, and condemn the very science that now threatened to destroy its prized, highly popular, and exclusive product. But this would be no simple matter. After all, in the immediate postwar years–the dawn of the nuclear age–science was in high esteem. The industry could not denigrate the scientific enterprise and still maintain its public credibility, so crucial to its success…
What is so upsetting is the level of intentional manipulation and coordination [emphasis is mine]:
By the early 1950s, the emerging science on tobacco’s harms documented in the elite peer- reviewed literature, especially the causal linkage to lung cancer, threatened to undo more than a half century of unprecedented corporate success. With considerable anxiety and rancor within the tobacco industry, the industry’s highly competitive CEOs came together in December 1953 at the Plaza Hotel in New York City to map a strategy. They realized that the threat they now faced was unprecedented and would require new, collaborative approaches and expertise. Not surprisingly, given their history, they turned again to the field of public relations that had served them so well in the past. They called upon John W. Hill, the president of the nation’s leading public relations firm, Hill & Knowlton. 
The public confidence the industry required could not be achieved through advertising, which was self-interested by definition. It would be crucial for the industry to assert its authority over the scientific domain; science had the distinct advantage of its reputation for disinterestedness. Hill shared with his public relations predecessor Bernays a deep skepticism about the role of advertising in influencing public perceptions of tobacco. To those schooled in public relations, advertising ran the risk of exposing corporate self-interest. Good public relations relied on scrupulous behind-the-scenes management of media. As Bernays had demonstrated in the 1920s and 1930s, the best public relations work left no fingerprints. 
Hill offered the companies powerful advice and guidance as they faced their crisis. Hill understood that simply denying emerging scientific facts would be a losing game. This would not only smack of self-interest but also ally the companies with ignorance in an age of technological and scientific hegemony. So he proposed seizing and controlling science rather than avoiding it. If science posed the principal–even terminal–threat to the industry, Hill advised that the companies should now associate themselves as great supporters of science. The companies, in his view, should embrace a sophisticated scientific discourse; they should demand more science, not less. 
Of critical importance, Hill argued, they should declare the positive value of scientific skepticism of science itself. Knowledge, Hill understood, was hard won and uncertain, and there would always be skeptics. What better strategy than to identify, solicit, support, and amplify the views of skeptics of the causal relationship between smoking and disease? Moreover, the liberal disbursement of tobacco industry research funding to academic scientists could draw new skeptics into the fold. The goal, according to Hill, would be to build and broadcast a major scientific controversy. The public must get the message that the issue of the health effects of smoking remains an open question. Doubt, uncertainty, and the truism that there is more to know would become the industry’s collective new mantra…. 
The very nature of controlling and managing information in public relations stood in marked contrast to the scientific notion of unfettered new knowledge. Hill and his clients had no interest in answering a scientific question. Their goal was to maintain vigorous control over the research program, to use science in the service of public relations. Although the tobacco executives had proposed forming a cigarette information committee dedicated to defending smoking against the medical findings, Hill argued aggressively for adding research to the committee’s title and agenda. “It is believed,” he wrote, “that the word ‘Research’ is needed in the name to give weight and added credence to the Committee’s statements.” Hill understood that his clients should be viewed as embracing science rather than dismissing it….
Hill & Knowlton had successfully produced uncertainty in the face of a powerful scientific consensus. So long as this uncertainty could be maintained, so long as the industry could claim “not proven,” it would be positioned to fight any attempts to assert regulatory authority over the industry. Without their claims of no proof and doubt, the companies would be highly vulnerable in 2 crucial venues: regulatory politics and litigation….
In their work to control the science, the companies had also found that they had secured considerable advantages in the realms of media, law, and public opinion. All of this was dependent on maintaining the notions of controversy, uncertainty, and doubt. In 1961, Hill & Knowlton celebrated its successes on behalf of its tobacco client. The total number of cigarettes sold annually had risen from 369 billion in 1954, the company’s first full year of service to the industry, to 488 billion. Per capita consumption had risen from 3344 a year in 1954 to 4025 in 1961, the highest ever. “From a business standpoint,” Hill & Knowlton crowed, “the tobacco industry has weathered this latest spate of health attacks on its products.” In less than a decade, the industry had been stabilized and was thriving…
And finally, why we need to devise some kind of institution to prevent this:
The story of the tobacco “controversy” and the industry’s deliberative attempts to disrupt science is now, fortunately, fairly well known. In large measure, this story emerged only as a result of whistleblowers and litigation that led to the revelation of millions of pages of internal tobacco documents that both laid out this strategy and documented its implementation. But what has often gone overlooked in the assessment of the tobacco episode was the highly articulated, strategic character of seizing the scientific initiative, the engineering of science. This, however, was a factor well understood by John Hill and the public relations teams that advised the companies. They carefully documented what the scientific investment would buy and how best for the companies to protect and defend that investment.
A wide range of other industries have carefully studied the tobacco industry strategy. As a result, they have come to better understand the fundamentals of influence within the sciences and the value of uncertainty and skepticism in deflecting regulation, defending against litigation, and maintaining credibility despite the marketing of products that are known to be harmful to public health. Also, they have come to understand that the invention of scientific controversy undermines notions of the common good by emphasizing individual assessment, responsibility, and judgment.
There are so many research ideas in here that I'm not even going to bother trying to list them.  But the field is wide open. I don't think I know more than one or two people who work on the political economy of science.

Bad control

If you're an empiricist working on the environment's impact on human systems, read Marshall Burke's post at the G-FEED blog.


Wanted: smarter global agriculture

Closing yield gaps through nutrient and water management

Nathaniel D. Mueller, James S. Gerber, Matt Johnston, Deepak K. Ray, Navin Ramankutty & Jonathan A. Foley

Abstract: In the coming decades, a crucial challenge for humanity will be meeting future food demands without undermining further the integrity of the Earth’s environmental systems. Agricultural systems are already major forces of global environmental degradation, but population growth and increasing consumption of calorie- and meat-intensive diets are expected to roughly double human food demand by 2050 (ref. 3). Responding to these pressures, there is increasing focus on ‘sustainable intensification’ as a means to increase yields on underperforming landscapes while simultaneously decreasing the environmental impacts of agricultural systems. However, it is unclear what such efforts might entail for the future of global agricultural landscapes. Here we present a global-scale assessment of intensification prospects from closing ‘yield gaps’ (differences between observed yields and those attainable in a given region), the spatial patterns of agricultural management practices and yield limitation, and the management changes that may be necessary to achieve increased yields. We find that global yield variability is heavily controlled by fertilizer use, irrigation and climate. Large production increases (45% to 70% for most crops) are possible from closing yield gaps to 100% of attainable yields, and the changes to management practices that are needed to close yield gaps vary considerably by region and current intensity. Furthermore, we find that there are large opportunities to reduce the environmental impact of agriculture by eliminating nutrient overuse, while still allowing an approximately 30% increase in production of major cereals (maize, wheat and rice). Meeting the food security and sustainability challenges of the coming decades is possible, but will require considerable changes in nutrient and water management.
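The yield-gap arithmetic is simple to make concrete. A minimal sketch with made-up regional numbers (the areas and yields below are my own illustrative assumptions, not Mueller et al.'s data):

```python
# Illustrative yield-gap arithmetic (hypothetical numbers, not the paper's data).
# Yield gap = attainable yield - observed yield; "closing the gap to 100%"
# means raising each region's yield to its attainable level.
regions = {
    # region: (area_ha, observed_t_per_ha, attainable_t_per_ha)
    "A": (1_000_000, 4.0, 6.0),
    "B": (2_000_000, 2.0, 5.0),
    "C": (500_000, 7.0, 7.5),
}

current = sum(area * obs for area, obs, _ in regions.values())
potential = sum(area * att for area, _, att in regions.values())
increase_pct = 100 * (potential - current) / current
print(f"production increase from closing yield gaps: {increase_pct:.0f}%")
```

With these invented numbers the increase comes out around 70%; the paper's actual estimates (45% to 70% for most crops) depend on crop-specific, spatially resolved data.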


Various data visualizations for Stata by Nicholas Cox

I just ran into an amazing trove of plotting commands for Stata, all written by Nicholas J. Cox. The commands can all be downloaded with findit in the Stata command line and have integrated help files. See the list with examples here.


Potential catastrophe and climate negotiations

This is an important and elegant paper out last week in PNAS.

Climate negotiations under scientific uncertainty
Scott Barrett and Astrid Dannenberg

Abstract: How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk.

I like that the authors investigate what is going on in the game, rather than simply showing that their theory predicts the overall outcome:
Communication is the essence of negotiation, and it is striking how the players used their proposals and pledges differently depending on threshold uncertainty. When the threshold was known, players communicated so as to coordinate to the threshold. When the threshold was unknown, communication was more strategic. Mean proposals for the Certainty and Impact Uncertainty treatments are very close to 150 (Table 1), with 83% of subjects in Certainty and 94% in Impact Uncertainty proposing precisely this amount. Mean proposals in Threshold Uncertainty and Impact-and-Threshold Uncertainty were significantly larger (Mann–Whitney–Wilcoxon test, n = 20, P < 0.05 each), with 29% of subjects in Threshold Uncertainty and 35% in Impact-and-Threshold Uncertainty proposing 200. Why did not more participants propose the collectively optimal 200? Answers to questions in our follow-up questionnaire provide a strong clue. Participants perceived their proposals as serving to motivate other students to contribute; they thought that a proposal below 200 was more credible and so was more likely to stimulate contributions by others.
Fig. 3 shows the relationship between pledges and actual contributions. In Certainty and Impact Uncertainty, almost all players (98% in both treatments) contributed at least as much as they pledged. Two of 200 contributed substantially less than they pledged, causing the two breakdowns in collective action in the Certainty treatment. By contrast, in the Threshold Uncertainty and Impact-and-Threshold Uncertainty treatments, most (82 and 75%, respectively) contributed less than they pledged, indicating that pledges, like proposals, were used strategically. 
Our follow-up questionnaire revealed that the reason for these differences had to do with the context in which decisions were made. Fairness and trust were more important considerations for the coordination games than for the prisoners’ dilemmas. Players were more trusting in the Certainty and Impact Uncertainty treatments because each recognized that the others had a strong incentive to be trustworthy in these situations. 
A final observation concerns attitudes toward risk, which can play a crucial role in the analysis of collective best outcomes (29). Our theory assumes that people are risk-neutral. Our questionnaire reveals that a majority of subjects are risk averse, but statistical analysis shows that whether a person is risk averse has no discernable influence on behavior (SI Results). Once again, the context of these games seems to shape how people behave.



The Nobels and Repugnance

It's marriage week on FE, which is apt given Monday's economics Nobel* announcement. Both Lloyd Shapley and Alvin Roth contributed to market design for markets lacking price signals, e.g. in kidney transplants, medical residency matching, and marriage (see, for example, Gale and Shapley's 1962 paper "College Admissions and the Stability of Marriage").

Something common to these markets is a tendency for humans to find prices in these contexts particularly repugnant. Most of us flinch at the idea that one should be able to buy and sell organs, for example, a stance with which I largely agree given that it seems very hard to design mechanisms that would prevent abusive extraction (e.g., person A needs money and thus pressures their spouse into donating a kidney and claiming it's voluntary). But Shapley and Roth were exactly the kind of economists who took thorny mechanism design as a challenge. Which brings us to my favorite paper that's emerged from this Nobel corpus so far:
Repugnance as a Constraint on Markets 
Alvin Roth 
Why can’t you eat horse or dog meat in a restaurant in California, a state with a population that hails from all over the world, including some places where such meals are appreciated? The answer is that many Californians not only don’t wish to eat horses or dogs themselves, but find it repugnant that anyone else should do so, and they enacted this repugnance into California law by referendum in 1998. Section 598 of the California Penal Code states in part: “[H]orsemeat may not be offered for sale for human consumption. No restaurant, cafe, or other public eating place may offer horsemeat for human consumption.” The measure passed by a margin of 60 to 40 percent with over 4.6 million people voting for it (see http://vote98.ss.ca.gov/Returns/prop/00.htm).  
Notice that this law does not seek to protect the safety of consumers by governing the slaughter, sale, preparation, and labeling of animals used for food. It is different from laws prohibiting the inhumane treatment of animals, like rules on how farm animals can be raised or slaughtered, or laws prohibiting cockfights, or the recently established (and still contested) ban on selling foie gras in Chicago restaurants (Ruethling, 2006). It is not illegal in California to kill horses; the California law only outlaws such killing “if that person knows or should have known that any part of that horse will be used for human consumption.” The prohibited use is “human  consumption,” so it apparently remains legal in California to buy and sell pet food that contains horse meat (although the use of horse meat in pet food has declined in the face of the demand in Europe for U.S. horse meat for human consumption). 
The repugnance of eating horses is not limited to California. On September 7, 2006, the U.S. House of Representatives passed by a vote of 351–40, and sent to the Senate, H.R. 503: “To . . . prohibit the shipping, transporting, moving, delivering, receiving, possessing, purchasing, selling, or donation of horses and other equines to be slaughtered for human consumption.” (That bill seems unlikely to pass into law, however.) Apparently, some kinds of transactions are repugnant in some times and places and not in others. This essay examines repugnance and its consequences for what transactions and markets we see. When my colleagues and I have helped design markets and allocation procedures, we have often found that distaste for certain kinds of transactions can be a real constraint on markets and how they are designed, every bit as real as the constraints imposed by technology or by the requirements of incentives and efficiency. In this essay, I’ll first consider a wide range of examples, including slavery and indentured servitude, lending money for interest, price-gouging after disasters, selling pollution permits and life insurance, and dwarf tossing.
Hat tip Andres Marroquin.

Update: the NY Times has a great Nobel winners' "reading list" here.

* what, did you want a disclaimer?


Sibling externalities in the marriage market

I have weddings on the brain since my lovely fiance and I are planning ours this year, so I thought it would be nice to do a "marriage series" of posts on FE. Here's the first installment.

Lately, I've been thinking about whether we can empirically measure the long-run effect (on household capital) of spending lots of money on expensive weddings, which reminded me of this recent paper on the potentially long-run effects of sibling competition in the marriage market.

Marriage Institutions and Sibling Competition: Evidence from South Asia
Tom Vogl
Using data from South Asia, this paper examines how arranged marriage cultivates rivalry among sisters. During marriage search, parents with multiple daughters reduce the reservation quality for an older daughter's groom, rushing her marriage to allow sufficient time to marry off her younger sisters. Relative to younger brothers, younger sisters increase a girl’s marriage risk; relative to younger singleton sisters, younger twin sisters have the same effect. These effects intensify in marriage markets with lower sex ratios or greater parental involvement in marriage arrangements. In contrast, older sisters delay a girl’s marriage. Because girls leave school when they marry and face limited earnings opportunities when they reach adulthood, the number of sisters has well-being consequences over the lifecycle. Younger sisters cause earlier school-leaving, lower literacy, a match to a husband with less education and a less-skilled occupation, and (marginally) lower adult economic status. Data from a broader set of countries indicate that these cross-sister pressures on marriage age are common throughout the developing world, although the schooling costs vary by setting.


Global crop area data at subnational resolutions

Bob Kopp showed me a new data interface for global crop area harvested assembled by the FAO (here). Not many crops have anything close to complete coverage, but this is strong progress. The site says the data are downloadable, but I haven't tried it, so I don't know how user-friendly it is.


Attention is a scarce resource

I think this is important in many contexts (not just farming). For example, to harp on some of our recent themes, I don't think that most people are aware of how much their productivity falls when their temperature rises or how much quicker they are to anger when their environment is uncomfortable. My guess is that most people know that these kinds of things are issues, in some sort of qualitative story-like sense, but that they don't have any idea about the quantitative size of the effect -- and they certainly don't optimize their lifestyle by taking these things into account.

In an example I was just mentioning to a colleague: I was recently trying to sell two air-conditioning units to students from my PhD program (most of whom know about my work on temperature and productivity). At $50 a unit, buying one of these as an investment would have paid for itself in productivity after one or two hot days.  But I couldn't sell the ACs until I dropped the price and just gave them away!  Despite knowing that it would dramatically improve their productivity, even students trained to think about these problems make mistakes in optimizing their lifestyle. (Although, it's also possible that in this case the students were colluding to get me to lower the price...) Enough editorializing, here's the paper:

Learning Through Noticing: Theory and Evidence from a Field Experiment
Rema Hanna, Sendhil Mullainathan, Joshua Schwartzstein
ABSTRACT: Existing learning models attribute failures to learn to a lack of data. We model a different barrier. Given the large number of dimensions one could focus on when using a technology, people may fail to learn because they failed to notice important features of the data they possess. We conduct a field experiment with seaweed farmers to test a model of “learning through noticing”. We find evidence of a failure to notice: On some dimensions, farmers do not even know the value of their own input. Interestingly, trials show that these dimensions are the ones that farmers fail to optimize. Furthermore, consistent with the model, we find that simply having access to the experimental data does not induce learning. Instead, farmers change behavior only when presented with summaries that highlight the overlooked dimensions. We also draw out the implications of learning through noticing for technology adoption, agricultural extension, and the meaning of human capital.


Mashup: watercolor regression of reported rapes and daily temperature in US counties

I was recently working with Matthew Ranson's crime and temperature data for a review article when Andrew Gelman tossed in his two cents on replotting the main figures, so I figured I'd see how one of the plots looked as a watercolor regression, a type of visually-weighted regression and a recent innovation that arose from discussions on FE and Gelman's blog.

I found the rape-vs-temperature plot particularly striking/perplexing/upsetting/interesting (yes, county, month-by-county and year-by-county effects have been removed from the data), so I converted the number of rape cases reported each month into percentages of the mean monthly number of reported rape cases. Temperature is the monthly mean (across days) of daily maximum temperature. Dark coloration depicts the probability that the conditional mean is at a specified value, and the estimated mean is the thin white line.
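
For concreteness, that preprocessing can be sketched in a few lines of Python/pandas. This is a minimal sketch, not the actual analysis code: the column names (`county`, `year`, `month`, `rapes`) are illustrative, and sequential group demeaning is only a one-pass approximation of jointly removing all the fixed effects (an exact projection would iterate the sweeps to convergence).

```python
import pandas as pd

def percent_of_mean_residual(df):
    """Express each county's monthly count as a percent of that county's
    mean month, then sweep out county-by-calendar-month and county-by-year
    effects by group demeaning (adding back the grand mean each time)."""
    pct = 100 * df["rapes"] / df.groupby("county")["rapes"].transform("mean")
    for keys in (["county", "month"], ["county", "year"]):
        # Subtract each group's mean, restore the overall mean (~100)
        pct = pct - pct.groupby([df[k] for k in keys]).transform("mean") + pct.mean()
    return pct
```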

Click to enlarge

This is the largest sample I've run the watercolor regression code on (N > 1.4M), but it took less than ten minutes to plot with a few hundred resamples and 300 bins in the x-variable.  As a first attempt to use the code on real data, I'm pretty satisfied with how clear the depiction of uncertainty is, without distracting from the main message (an issue described here).
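
The Matlab and R implementations linked below handle the actual plotting, but the core computation can be sketched in plain Python. This is a simplified stand-in, not the real watercolor regression code: it uses binned bootstrap means rather than a full smoother, and the bin and resample counts are just example parameters. The resulting `density` matrix is what gets painted, with darker cells where the conditional mean is more likely, and the column of per-bin means traced as the thin white line.

```python
import numpy as np

def watercolor_density(x, y, n_bins=50, n_boot=200, n_levels=40, seed=0):
    """Bootstrap the conditional mean of y within x-bins and return
    (bin edges, y-levels, density), where density[i, j] is the bootstrap
    probability that the conditional mean in x-bin j falls in y-level i."""
    rng = np.random.default_rng(seed)
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    bin_idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)

    # Conditional mean of y in each x-bin, for each bootstrap resample
    n = len(x)
    means = np.full((n_boot, n_bins), np.nan)
    for b in range(n_boot):
        take = rng.integers(0, n, n)  # resample observations with replacement
        sums = np.bincount(bin_idx[take], weights=y[take], minlength=n_bins)
        counts = np.bincount(bin_idx[take], minlength=n_bins)
        with np.errstate(invalid="ignore"):  # empty bins -> NaN
            means[b] = sums / counts

    # Histogram the bootstrap means within each x-bin -> per-bin density
    levels = np.linspace(np.nanmin(means), np.nanmax(means), n_levels + 1)
    density = np.zeros((n_levels, n_bins))
    for j in range(n_bins):
        col = means[:, j][~np.isnan(means[:, j])]
        if len(col):
            h, _ = np.histogram(col, bins=levels)
            density[:, j] = h / h.sum()
    return edges, levels, density
```

Because each bootstrap pass is just two `bincount` calls, a few hundred resamples stay cheap even at millions of observations, which is consistent with the run time above.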

[I've posted a Matlab function to compute and plot watercolor regressions here and Felix Schönbrodt posted an implementation in R here.]