In my seminal paper, “Distance to health services influences insecticide-treated net possession and use among six to 59 month-old children in Malawi,” I indicated that Euclidean (straight-line) measures of distance were just as good as more complicated, network-based measures.
I didn’t include the graph showing how correlated the two were, but I wish I had, and I can’t find it here on my computer.
Every time I’ve presented research on the association between distance to various things and health outcomes, someone inevitably asks why I didn’t use a more complex measure of actual travel paths. The idea is that no one walks anywhere in a straight line, but rather follows a road network, or even uses a number of transportation options which might be lost in a simple measure.
I always respond that a straight-line distance is as good as any other when investigating relationships on a coarse scale. Audiences are never convinced.
A new paper came out today, “Methods to measure potential spatial access to delivery care in low- and middle-income countries: a case study in rural Ghana,” which compared the Euclidean measure with a number of more complex measurements.
The conclusion confirmed what I already knew, that the Euclidean measure is just as good in most cases, and the pain and cost of producing sexy and complicated ways of calculating distance just isn’t worth it.
It’s a pretty decent paper, but I wish they had put some graphs in to illustrate their points. It would be good to see exactly where the measures disagree.
Access to skilled attendance at childbirth is crucial to reduce maternal and newborn mortality. Several different measures of geographic access are used concurrently in public health research, with the assumption that sophisticated methods are generally better. Most of the evidence for this assumption comes from methodological comparisons in high-income countries. We compare different measures of travel impedance in a case study in Ghana’s Brong Ahafo region to determine if straight-line distance can be an adequate proxy for access to delivery care in certain low- and middle-income country (LMIC) settings.
We created a geospatial database, mapping population location in both compounds and village centroids, service locations for all health facilities offering delivery care, land-cover and a detailed road network. Six different measures were used to calculate travel impedance to health facilities (straight-line distance, network distance, network travel time and raster travel time, the latter two both mechanized and non-mechanized). The measures were compared using Spearman rank correlation coefficients, absolute differences, and the percentage of the same facilities identified as closest. We used logistic regression with robust standard errors to model the association of the different measures with health facility use for delivery in 9,306 births.
Non-mechanized measures were highly correlated with each other, and identified the same facilities as closest for approximately 80% of villages. Measures calculated from compounds identified the same closest facility as measures from village centroids for over 85% of births. For 90% of births, the aggregation error from using village centroids instead of compound locations was less than 35 minutes and less than 1.12 km. All non-mechanized measures showed an inverse association with facility use of similar magnitude, an approximately 67% reduction in odds of facility delivery per standard deviation increase in each measure (OR = 0.33).
Different data models and population locations produced comparable results in our case study, thus demonstrating that straight-line distance can be reasonably used as a proxy for potential spatial access in certain LMIC settings. The cost of obtaining individually geocoded population location and sophisticated measures of travel impedance should be weighed against the gain in accuracy.
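The core of the comparison is easy to sketch. Below is a minimal, hypothetical illustration (not the authors’ code or data): it simulates straight-line distances for 100 villages, inflates them by a random detour factor to stand in for network distances, and uses scipy’s Spearman rank correlation to quantify how closely the two measures agree.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical straight-line (Euclidean) distances, in km, from 100
# villages to their nearest delivery facility.
euclidean = rng.uniform(0.5, 20.0, size=100)

# Network distance is usually longer than the straight line; simulate it
# as the Euclidean distance inflated by a random detour factor.
network = euclidean * rng.uniform(1.1, 1.6, size=100)

# Spearman's rho compares the rank orderings of the two measures.
rho, p_value = spearmanr(euclidean, network)
print(f"Spearman rho = {rho:.2f}")
```

When the detour factor is roughly constant, as it tends to be at coarse scales, the rank ordering of villages barely changes, which is why the cheap measure works.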
Was reading Chris Blattman’s list of books that development people should read but don’t and found this in the Amazon description of “The Anti-Politics Machine: Development, Depoliticization, and Bureaucratic Power in Lesotho.”
Development, it is generally assumed, is good and necessary, and in its name the West has intervened, implementing all manner of projects in the impoverished regions of the world. When these projects fail, as they do with astonishing regularity, they nonetheless produce a host of regular and unacknowledged effects, including the expansion of bureaucratic state power and the translation of the political realities of poverty and powerlessness into “technical” problems awaiting solution by “development” agencies and experts.
Note that I do not harbor any ill will toward development or even, as a general rule, “technical solutions.” Having been involved with bed net distributions and having watched the outcomes of reproductive health interventions, for example, I can say that there are many positive outcomes of development projects. In my area, fewer kids are dying and women are becoming pregnant a whole lot less, decreasing the risk of maternal mortality.
Disclaimers aside, there is no doubt that development projects often fail for a number of reasons, the first of which is that leaders have no interest in seeing that they succeed. While leaders are indifferent to the outcomes, they happily take on the power that comes with them, embracing bureaucratic reforms, which are mostly just expansions of power at all levels of government.
This wouldn’t necessarily be a bad thing, except that African countries never embraced many of the protections of individual rights which restrict the powers of the state. Independence movements in much of Africa were predicated on an eventual return of power to the majority. Not many (none?) of these movements sought to protect the rights of the minority, much less the individual. Thus, there is little restriction on the types of rules which may be created, and since many of these development projects influence policy, they unwittingly feed into the autocracy machine.
In the past, surveys were done on paper, either through a designed questionnaire or by someone frantically writing down interview responses. When computers came around, people would be hired to type in responses for later analysis.
Nowadays, with the advent of cheap and portable computing, research projects are rapidly moving toward fully digital methods of data collection. Tablet computers are easy to operate, can be cheaply replaced, and now can access the internet for easy uploading of data from the field.
Surveyors like them because large teams can be spread out over a wide space, data can be completely standardized and the tedious process of data entry can be avoided.
Of interest to me, however, is whether the technology is influencing the nature of the responses given. That is, will someone provide that same set of responses in a survey using digital data collection methods as in a paper survey?
Recently, we tried a tablet-based software package for a small project on livestock possession and management at Mbita Point in Western Kenya. I intended it as a test to see if that particular package might be a good fit for another project I'm working on (the one that's paying the bills).
We had only limited success. The survey workers found the tablets clunky, and a number of problems with the Android operating system made them more trouble than the survey was actually worth. Of interest, though, was how the technology distracted the enumerators from their principal task, which was to collect data.
Enumerators would become so wrapped up in trying to navigate the various buttons and options of the software that they couldn't effectively concentrate on performing the survey. Often they appeared to skip questions out of frustration, or would frantically select one of the many options in the hope of moving on to the next.
In a survey of more than 100 questions, the process started taking far more time than households were willing to give. We eventually had to abandon the software and revert to a paper based method.
Surveys went from lasting more than one hour to taking under 30 minutes. Workers were more confident and had more time to interact with the respondents. Respondents had more of an opportunity to ask questions and consider the meaning of what they were being asked. They offered far more information than we expected and felt that they were participating in the survey as a partner and not just as a passive victim.
One of our enumerators noted that people react differently to a surveyor collecting data on the tablets than with paper. She described collecting data with technology as being “self absorbed” and alienating to the respondents. Collecting data on paper, however, was seen as a plus. “They can see me writing down what they say and feel like their words are important.”
I'm thinking that the nature of the responses themselves might be different as well. Particularly with complex questions of health and disease, surveyors often have to explain the question and give a respondent a chance to ask for further clarification. Technology appears to inhibit this process, perhaps compromising the chance for a truly reasoned response.
While I am absolutely not opposed to the use of technology in surveys, I think that the survey strategy has to be properly thought through and the challenges considered. At the same time, however, data collection is a team effort and requires a proper rapport between community members and surveyors who often know each other.
Is technology restricting our ability to gather good data? Could the use of technology even impact the nature of the responses by pushing them in ways which really only tell us what we want to believe rather than what actually exists?
I’m not exactly sure what we’re all supposed to be doing on World Malaria Day that we shouldn’t be doing every day, but at least we have a day! There’s no such thing as “World Helminth Day,” unfortunately.
What I think we should be doing on World Malaria Day:
1. Reduce ridiculous bureaucracy in developing countries which inflates the price of goods at the border.
2. Eliminate ridiculous protectionist policies in wealthy countries which selectively hobble imports from developing countries.
3. Encourage true democracy in African States (where it doesn’t already exist) and eliminate unproductive authoritarian dead weight.
4. Guarantee rights to representation, legal fairness, political expression and property.
5. Create a global tax on capital and reinvest monies fairly in locally developed infrastructure projects in developing countries.
6. Encourage deep state investments in health care and health delivery in malarious countries while creating conditions favorable for the private sector to meet health needs.
7. Invest in the development of new pharmaceutical tools to prepare for the day when ACTs are no longer effective.
Wait, only points 6 and 7 had anything to do with malaria, you say, but I say they all do. Malaria is a complex disease, the root cause of which is poverty, the root cause of which is politics and economics. We will never be able to eliminate malaria unless we take care of all of the other problems which create the context that allows it to exist.
According to the “pathogen stress theory of values,” the evolutionary case that Thornhill and his colleagues have put forward, our behavioral immune systems—our group responses to local disease threats—play a decisive role in shaping our various political systems, religions, and shared moral views.
If they are right, Thornhill and his colleagues may be on their way to unlocking some of the most stubborn mysteries of human behavior. Their theory may help explain why authoritarian governments tend to persist in certain latitudes while democracies rise in others; why some cultures are xenophobic and others are relatively open to strangers; why certain peoples value equality and individuality while others prize hierarchical structures and strict adherence to tradition. What’s more, their work may offer a clear insight into how societies change.
This is a reasonable view, and something I’ve long observed from working on infectious diseases in developing countries. The developmental trajectory of a country is influenced by the deliberate avoidance of illness. An example can be seen in the locations of African cities. Many African administrative capitals are located on isolated, cool hilltops, far away from rivers and lakes. Colonialists would intentionally set up shop in areas where they were unlikely to encounter malaria.
Developmentally, this has had major implications for trade within Africa. European cities are often placed along waterways amenable to domestic European trade. The lack of trade between African countries is one of the reasons the continent has developed so poorly. This is the direct result not only of colonial priorities of resource extraction to Europe, but also of the unfortunate placement of economic centers in response to malaria.
Certainly, the nature of cities themselves has much to do with the control of infectious diseases. Public works often involve the management of sewage waste and the delivery of clean water. Thornhill might suggest that the development of democracy, citizen involvement and taxation to pay for urban improvements are in direct response to enteric diseases.
However, while it is interesting to try to apply this view, it can be taken to the extreme:
Fincher (a graduate student of Thornhill) suspected that many behaviors in collectivist cultures might be masks for behavioral immune responses. To take one key example, collectivist cultures tend to be both more xenophobic and more ethnocentric than individualist cultures. Keeping strangers away might be a valuable defense against foreign pathogens, Fincher thought. And a strong preference for in-group mating might help maintain a community’s hereditary immunities to local disease strains. To test his hypothesis, Fincher set out to see whether places with heavier disease loads also tended toward these sorts of collectivist values.
I’m not sure it’s that easy to boil down political differences between Asia and Europe to a need to manage infectious disease. Certainly, Sweden is more collectivist than England, but I wouldn’t say that their infectious disease profiles are all that different.
Worse yet, if taken to the extreme, this “hunt for significance” will provide one with evidence to support any crazy theory at all. Pathogens exist wherever humans do. Moreover, we risk attributing the contribution of pathogens to human development based on current conditions, assuming that the present was deterministically preordained centuries ago. Until very recently, nearly the entire world was at risk for malaria, but despite this, various societies have embarked on different social and political trajectories.
The biggest problem I have with the theory is its basis in rational choice theory. It assumes that humans make rational choices based on pathogen threats, when we know, particularly those of us who work in the tropics, that humans often have poor conceptions of disease transmission and the causes of illness. At times, despite very obvious threats, humans will act in ways which exacerbate that threat. The history of enteric disease is filled with tales of ignorance and folly.
If we are going to subscribe to a rational model of political and social development which includes pathogens, then we have to also address, first, the ability of pathogens to hijack human behavior to create new opportunities for replication and survival, and second, the way social changes can exacerbate the worst effects of infection. For the first point, I would look to the development of international trade systems which allow pathogens such as influenza to move around the world quickly, increasing opportunities for mutation to avoid immune responses. For the second, I would point to polio, a disease which became a problem only after the introduction of water sanitation practices.
Thornhill’s ideas are interesting, and certainly provide good material for the popular press and BBQ conversation, but they require that the reader suspend too much consideration of the details of the complex history of human social and political development. Taken with restraint, as in the example of the locations of African cities, they can provide interesting insights into how current conditions are impacted by past pathogenic threats.
Every once in a while, you run across something that just gives you the chills.
“A report presented to the World Health Organization (WHO) in 1948 states: “It is not enough to quote that about 3,000,000 deaths are caused yearly by malaria in the world, or that every year about 300,000,000 cases of malaria occur …… that malaria is prevalent in tropical and subtropical areas where food production and agricultural resources are potentially very high, and that, by affecting the mass of rural workers, it decreases their vitality and reduces their working capacity and thus hampers the exploitation of the natural resources of the country. At a time when the world is poor, it seems that control of malaria should be the first aim to achieve in order to increase agricultural output” (WHO, 1948).
Snow RW, Amratia P, Kabaria CW, Noor AM, Marsh K: The changing limits and incidence of malaria in Africa: 1939-2009. Adv Parasitol 2012, 78:169-262.
Today, April 7th, is World Health Day, an annual event sponsored by the World Health Organization to help bring attention to pressing public health issues.
This year's event focuses on vector-borne diseases like dengue fever and Chagas disease, which are transmitted through a third-party host such as Aedes mosquitoes or triatomines (kissing bugs).
Both of these diseases are becoming increasingly relevant as the world urbanizes. Dengue and malaria form a complementary nexus of diseases: malaria is largely associated with rural areas and rarely found in cities, whereas dengue fever is almost exclusively found in urban areas. Generally speaking, dengue is a disease of development, whereas malaria is a disease of the lack of development.
While known to be distributed widely through Latin America and Southeast Asia, dengue has yet to make it onto Africa's radar, simply (in my opinion) because not enough people are looking hard enough. Africa, as the most rapidly urbanizing area of the world, will eventually face a double burden of dengue and malaria, and health facilities aren't yet prepared to deal with it.
It’s a reasonable question to which no one really has an answer. I work in a field site located on Lake Victoria, the office of which is based out of the International Centre for Insect Physiology and Ecology (ICIPE) station on Mbita Point.
We do malaria field surveys and have a large health and demographic surveillance system that has monitored births, deaths, migration and health events of nearly 50,000 people over the past six years.
The goals of the project are to monitor changes in demographics, outbreaks and changes in the dynamics of the transmission of infectious diseases and gauge the effectiveness of interventions.
While I view those as scientifically important, I don't think that people on the ground experience any immediate benefit from scientific research activities. In fact, I'm pretty sure that, unless they're getting a free bed net, it's mostly an annoyance. Of course, we appreciate their cooperation, and they are free to tell us to bugger off at any time.
We are seeing rapid declines in malaria incidence, infant mortality and fertility in the communities we study. This is, of course, cause for celebration. Fewer kids are dying and people are having fewer of them.
In fact, the shift in the age distribution was so dramatic from 2011 to 2012, that we thought it an aberration of the data: the mean age of 12,000 people rose nearly two years from the beginning of 2011 to the latter part of 2012. Old people died off, and fewer babies were there to replace them, resulting in an upward shift in the age distribution. Cause for celebration in an area where women normally have anywhere from 5 to 10 children, who often end up malnourished, poorly housed and uneducated.
But we have to ask ourselves, how much of this is representative of trends in communities similar to the ones we study and how much is directly influenced by the presence of the research station itself?
A recent article in Malaria Journal documents the positive impacts that a research facility had on the local community:
To make the community a real partner in the centre’s activities, a tacit agreement was made that priority would be given to local people, in a competitive manner, for all non-professional jobs (construction workers, drivers, cleaners, field workers, data clerks, and others). Of the 254 people employed at the CRUN, about one-third come from Nanoro. This has strengthened the sense of ownership of the centre’s activities by the community. Through the modest creation of new jobs, CRUN makes a substantial contribution to reducing poverty in the community. In addition, staff members residing in Nanoro contribute to the micro-economy there.
Another crucial benefit for Nanoro and CRUN stemming from their productive engagement was electrification for the area. This was made possible by the mayor of Nanoro leading the negotiations for extending the national electrical grid to the CRUN, and with it, to the village of Nanoro. Electrification spurred a lot of economic activity and social amenities that enhance the wellbeing of the community, such as: (1) improved water supply through the use of electricity instead of a generator; (2) the ability to use electrical devices, such as fans during the hot season (when temperatures can reach 45-47°C), lighting so students can study at night, refrigeration to safely store food, and the extension of business hours past sunset.
Health care services have been improved through CRUN’s new microbiology laboratory. Before this laboratory was established, local patients had to travel about 100 km to the capital city, Ouagadougou, for the service.
This agrees with my experience on Lake Victoria. The presence of the research facility (built originally in the 1960s) and the subsequent scale-up of research activities has been transformative for the area. As more and more people have moved to the area, a bridge to Rusinga Island has been built, two new ferry routes have been installed, the existing ferries have been upgraded, power has been extended to the area and finally, after years of waiting, a paved road has been built from Kisumu to Mbita Point.
…which brings me back to my initial question. It is clear that the building of research facilities can be a major spur for economic development and economic activity in a previously desolate and marginalized area. In the case of Mbita Point, it is possible that these gains can be sustained even after an eventual cessation of research activities and strangled funding. In this sense, field research projects are doing the world at least some good.
However, the gains which these communities are experiencing really have little to do with the research projects themselves and more to do with the influx of employment and infrastructure that comes with research stations and research projects. This is non-controversial, and I'm sure that the locals appreciate it.
But the quality and goals of research need to be assessed. Are the results we are seeing truly representative of communities which may be similar to the Mbita Point of the past? Are we unnecessarily influencing the outcomes of the research and then perhaps inappropriately generalizing them to contexts which bear little resemblance to our target communities? From a scientific perspective, this is troubling.
Of greater concern, however: are we claiming that gains against malaria are being made when, in fact, morbidity and mortality in communities we haven't looked at are increasing? This could result in a dangerous shift away from scaled-up ITN distributions or even a total reduction in international funding. If this happens, kids will die.
A new study which just appeared in Malaria Journal, however, calls this optimism into question.
This review presents two central arguments: (i) that empirical studies measuring change are biased towards low transmission settings and not necessarily representative of high-endemic Africa where declines will be hardest-won; and (ii) that current modelled estimates of broad scale intervention impact are inadequate and now need to be augmented by detailed measurements of change across the diversity of African transmission settings.
So, our ability to accurately determine whether transmission intensity has declined is hampered by the fact that most studies of the disease occur in areas of low transmission. This makes sense. It is much easier to evaluate the malaria situation in the Kenyan context than in the Democratic Republic of Congo, due to the availability of surveillance infrastructure, official mechanisms which allow research projects to move forward, and security issues.
The obvious problem with this is the relationship of governance, economy, and instability to malaria itself. People in the poorest countries are at the highest risk for malaria, and people in the poorest parts of the poorest countries are at the highest risk of all. The trouble is that, despite being the populations we are most concerned about, they are the hardest to reach and the hardest to help.
Worse yet, the estimates of malaria prevalence found in a number of studies were considerably lower than estimates for the entire African continent.
The combined study area represented by measurements of change was 3.6 million km2 (Figure 1), approximately 16% of the area of Africa at any risk of malaria. The level of endemicity within these studied areas (mean PfPR2-10 = 16%) was systematically lower than across the continent as a whole (mean PfPR2-10 = 31%) (Figure 2). While 40% of endemic Africa experienced ‘high-endemic’ transmission in 2010 (PfPR2-10 in excess of 40%), only 9% of the studied areas were from these high transmission settings.
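To see how the quoted figures fit together, here is a hypothetical two-stratum back-of-the-envelope calculation. The stratum prevalences (60% and 11.7%) are my own assumed values, chosen only so the weighted averages reproduce the quoted 31% and 16%; they are not taken from the paper.

```python
# Assumed mean PfPR in high- and low-transmission strata (illustrative only).
high_pfpr, low_pfpr = 0.60, 0.117

# Continent-wide: 40% of endemic area is high-transmission.
continental_mean = 0.40 * high_pfpr + 0.60 * low_pfpr

# Studied areas: only 9% drawn from high-transmission settings.
studied_mean = 0.09 * high_pfpr + 0.91 * low_pfpr

print(f"continent-wide mean PfPR: {continental_mean:.0%}")  # 31%
print(f"studied-area mean PfPR:   {studied_mean:.0%}")      # 16%
```

The point of the arithmetic is that nothing about the individual studies needs to be wrong: undersampling the high-transmission stratum alone is enough to pull the studied-area mean well below the continental one.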
This is a huge issue, and one that shouldn't be limited to malaria. While it is helpful to hear good news of malaria declines in formerly afflicted areas, we need to be careful about overstating the impact of interventions. Funding for malaria projects such as the distribution of insecticide-treated bed nets was incredibly high throughout the 00s, but it is unlikely that trend will continue. Offering a positive picture can show that our efforts are valuable, but might also lead policy makers and donors to suggest that money be put toward other goals. If Sri Lanka is any indication, where malaria was nearly eliminated at one time but experienced a rapid and devastating resurgence, even a brief relaxation of malaria control efforts could erase current gains completely.