My last post (#49) presented compelling evidence that access to microfinance improves consumption smoothing and shock coping. Can we (development interventionists) increase that positive effect by providing financial education?
For a long time, there has been great enthusiasm and effort for building financial literacy through financial education, in both developed and developing countries, whether broadcast to whole populations or targeted to particular groups. A major effort was recently made to adapt the content and methods of financial education to the perspectives, needs and constraints of microfinance clients, current and potential. This was a three-way partnership of Microfinance Opportunities, Freedom from Hunger and the Citi Foundation called the Global Financial Education Program (GFEP). Freedom from Hunger has gone on to adapt the GFEP education modules so that a variety of financial and non-financial service organizations in a variety of poor countries can use them with pre-literate, very poor women.
Regarding the effectiveness of financial education, and of the GFEP modules in particular, my Freedom from Hunger colleague Bobbi Gray led the GFEP Financial Education Outcomes Assessment in collaboration with Jennefer Sebstad, Monique Cohen and Kathleen Stack. The paper that emerged from this study, Can Financial Education Change Behavior? Lessons from Bolivia and Sri Lanka, remains the best guide to the issues involved in delivering financial education that leads to financial capability impacts. I also recommend the more recent review, Bridging the Gap: The Business Case for Financial Capability, by Anamitra Deb and Mike Kubzansky of the Monitor Group (commissioned and funded by the Citi Foundation). See also my cautionary commentary on that paper, particularly regarding the difficult tension between achieving a commercially viable business model for large-scale “education” and building true financial capability of individuals, households and communities.
Both papers conclude that the evidence that financial education leads to financial capability is mixed at best. Outcomes are sensitive to the great variety of financial education objectives, content, delivery methods, delivery quality, delivery channels, audiences and contexts, not to mention the difficulties of measuring impacts when this variety is not clearly defined and controlled by researchers. In fact, researchers have often differed about which outcomes they are looking for (what financial capability looks like in real life) and about the length of time and the circumstances in which we should expect impacts on financial capability to become evident. Most evaluations have not been set up to observe how people put newly taught financial behaviors into action. The most common benefit of microfinance products seems to be that they support resilience by helping households smooth consumption and build assets to anticipate major expenses and deal with shocks. It follows that financial education for microfinance clients is most likely to be effective when it enhances the resilience-building effects of microfinance. Therefore, evaluation of financial education should focus on changes linked to consumption smoothing, asset building and coping with shocks. So far, we have not been doing this well, if at all.
In short, the body of evidence for financial education effectiveness is a mess. But it is not so messy that we cannot discern some themes.
Relevance of the education content to the lives of the intended learners is critical. Education to promote a positive attitude toward saving for the future may falsely assume that people do not want to save; the real constraint may be that they have no access to savings opportunities that offer both safety and liquidity. Relevance can be achieved only by diligent learning about the learners’ financial lives, such as through financial diaries. We know that even very poor people lead complicated financial lives: they already use a variety of informal financial management tools, such as borrowing from friends, family, moneylenders and shopkeepers; saving in various ways; even lending to others and guarding savings for them. How does our financial education add value to what these people already know and do? This is not to say they don’t have a lot to learn, but what they learn won’t be put into practice unless it improves upon or enhances what they already know and do. And that varies by age, experience, education and livelihood.
Education objectives strongly affect the probability of different outcomes, but often the objectives are not clearly manifest in the education design. The most important and common difference in education objectives is, on one hand, education for full financial capability (to know the choices and make decisions among the full range of product offerings), and on the other hand, promotion of the relatively narrow range of products provided by a particular financial service provider. The latter is not always aligned with the interests of clients seeking full financial capability. If the objective is to promote savings behavior, the outcome assessment should look for changes in the learner’s total savings by all methods, not just saving in an account provided by a particular microfinance provider. However, a microfinance provider may have no business case for financial education unless the education also promotes the provider’s products. Fortunately, product-linked education can serve the financial interests of both provider and recipient of the information. Moreover, it can be integrated with general financial education within the same financial education module, as Freedom from Hunger is doing with some of its partners. Learning about a real, locally available commitment savings account can be an effective way to promote more general savings habits. I suggest you read the more detailed commentary by Freedom from Hunger on this particular issue.
Simpler is better, but how simple does it have to be? One of the best-known recent randomized trials tested the effectiveness of simple versus complex accounting training for microenterprise operators (Greg Fischer, Alejandro Drexler and Antoinette Schoar on training for ADOPEM clients in the Dominican Republic). Note that this study examined a narrow topic of business training, not “financial literacy” as the paper’s title claims. Simplified, rule-of-thumb training (basically repeated exhortation to keep business money separate from household money, not much different from equally successful text-message reminders to promote saving behavior) produced significant impact on business outcomes and on saving, while the more elaborate training in accounting techniques did not. The authors concluded, rather obviously, that the simpler training was easier for the learners to understand and put into practice (simple always works better for me, too). While it is very good to confirm that such simple training can produce meaningful outcomes, this study could have been much more interesting and meaningful if it had compared the rule-of-thumb training with a more comprehensive financial education module designed for low-income learners (a module adhering to the relevance guidance above). How much more impact could have been achieved by a still-simple but somewhat more comprehensive module? We need to know how much education is enough and how much is too much.
Quality matters! Education of all types for all sorts of learners has a well-deserved reputation for not working well, or at all, because the way the education is designed and delivered so often dooms it to fail. This is particularly, painfully true of education for low-income, low-literacy adult learners. My Freedom from Hunger colleagues Edouine François and Matilde Olazabal have written poignantly of their personal struggles to become the expert adult educators they are now. In this light, it is unfortunate that behavioral economics researchers tell their readers little to nothing about the educational techniques they used as their main experimental interventions; clearly there is insufficient appreciation of the nuances of technique that often spell the difference between an educational intervention likely to succeed and one guaranteed to fail. Without clarity about the technique used, we find it difficult to interpret the results, especially negative results, and even more difficult to replicate the study design in other contexts. The need for clarity about content and delivery applies even to relatively simple informational interventions, such as Dupas and Robinson offering Kenyan ROSCA members different devices for health savings, or Flory alerting communities to the availability of savings accounts at nearby stops for mobile banking units. These authors do not report precisely what they told people, or how they told it, to get such remarkably powerful impacts on savings behavior and related outcomes. Quality of the intervention matters!
In summary, financial education clearly can work but often it does not—it just depends … This post is really a teaser for the Gray et al. paper. If you care about this topic, you should read that paper. Its conclusions are echoed by this post on “Takeaways from the 2012 Citi-FT Financial Education Summit.”