Some people have asked me if I was disappointed in the selection of the winners of the Nobel Prize for Economics, since I have been critical at times of some of the champions of randomized controlled trials (RCTs) as a way of measuring the impact of development programs. Not at all! I congratulate Esther Duflo, Abhijit Banerjee and Michael Kremer. David Roodman predicted to me long ago that Duflo would one day win the prize, and he was correct! This recognition is well-deserved.
My arguments with the recipients and others who collaborate with them, in particular with Professor Dean Karlan, are, however, quite real and part of the public record. (By the way, I applaud how gracious the prize recipients were in noting that they were part of a much larger community of researchers who deserved recognition.)
In short, I have had issues not so much with the techniques used by these researchers as with how they have at times over-emphasized what they characterized as the unexpectedly muted positive impact of microcredit (perhaps in order to catch the attention of the media and policy-makers), and with how they have, in my view (and also that of another Nobel laureate in economics mentioned below), over-hyped RCTs' advantages over other research methodologies.
In addition, I have taken issue with their implication that the microcredit programs they have studied are representative of microcredit programs generally – which would suggest that their findings could easily be generalized. Clearly, the discipline of random assignment that is so important to the RCT methodology itself has not been applied to the selection of which microcredit programs were studied using this technique – a fact that one might not gather from reading the press releases and statements by some researchers. Most important, to my knowledge no microcredit provider in Bangladesh has ever been studied using the RCT methodology, despite that country being the most vibrant microcredit market in the world.
The most extensive treatments of my quibbles and my more serious disagreements with the so-called randomistas can be found in chapter 7 of Tim Ogden’s impressive book Experimental Conversations: Perspectives on Randomized Trials in Development Economics and in my own book Changing the World Without Losing Your Mind (pp. 172-180).
I also addressed these issues in a review, published by the Center for Financial Inclusion, of Dean Karlan's book (with Jacob Appel) More Than Good Intentions, in which I argued that it covers microcredit in a highly distorted way. (I wrote a supplemental review of the book's treatment of other development interventions, which I found much more compelling.)
In addition, Grameen Foundation published a speech I gave at a World Bank conference where six RCT studies of microcredit programs were presented; in that speech, I criticized how those studies were being characterized.
I summarized a compelling critique of RCTs and their champions by a previous winner of the Nobel Prize in Economics, Professor Angus Deaton. (The full critique appears in the same book by Tim Ogden cited above.) Here is another thoughtful reflection on the limitations of RCTs published in the aftermath of the Nobel Prize announcement, and Vikram Akula weighs in powerfully here.
And at one point I published a response to Karlan’s attempt to needle me on social media.
Finally, I summarized an excellent paper by Tim Ogden about the accomplishments of microcredit and microfinance – accomplishments that have been partially obscured by how many people have digested and tried to make sense of the RCT studies – and the exciting roles that subsidy and grant-making can still play in building on them.
Indeed, my dissatisfaction with how many policy-makers and donors came to view the research on microfinance prompted me to commission (but in no way to control) three meta-studies of the research literature, the most recent one being this excellent paper by Professor Kathleen Odell of Dominican University.
One of my central problems with philanthropy in America is that it is much too fad-driven, especially compared to our peers in Europe. Fads come and go without much regard to the true strengths and limitations of the social innovations that appear on the philanthropic community’s radar screen. This goes for social science research too.
In my view, RCTs are one of many ways of gaining insights into the effectiveness of development programs and many other endeavors to improve the human condition. They are also imperfect. They can be done meticulously as well as carelessly; they typically cost a lot of money to do well; and thoughtfully interpreting their results is much more art than science – an important task at which researchers, policy-makers, and the media have sometimes failed miserably. I wonder if RCTs will become just another fad, over-hyped for a time and then largely forgotten for a generation.
Regardless, I congratulate Duflo, Banerjee and Kremer for their achievements and their hard work, despite the problems I have had at times with how some people have used their innovations and findings to shape policy, practice, and public perceptions of microfinance.