Deaths in Darfur: Keeping Ourselves Honest
How many people have died in Darfur and what is the value of this information? The recent ruling by the British Advertising Standards Authority that Save Darfur was guilty of misrepresenting the figure of 400,000 deaths as "fact" rather than, in its view, as "opinion," has ignited a controversy that has long haunted advocacy around humanitarian disaster.
The offending advertisement by Save Darfur and Britain’s Aegis Trust read: "SLAUGHTER IS HAPPENING IN DARFUR. YOU CAN HELP END IT. In 2003, Sudanese President Omar al-Bashir moved to crush opposition by unleashing vicious armed militias to slaughter entire villages of his own citizens. After three years, 400,000 innocent men, women and children have been killed."
There are two parts to the case. The first is that the figure of 400,000 deaths during the crisis is an upper-limit estimate not supported by the best studies, and therefore cannot be regarded as "fact" but rather as a disputed interpretation. This is the major concern of this posting.
The second is the implication that the deaths are wholly "slaughter" by the Sudan government and its militias, rather than predominantly due to hunger and disease. While such famine deaths may have their ultimate cause in the war, and especially the government’s conduct of the war during the extreme phase of 2003-04, there is an important difference between violent killing and death by these other causes. As Sam Dealey noted in his August 12 opinion column in the New York Times, different policy prescriptions follow: stopping massacres demands a different response to stopping hunger and disease. There is no disagreement on this: nobody claims that all, or even a majority, of the dead are killed in violence.
The authoritative overview of mortality in Darfur up to late 2005 is the evaluation by the U.S. Government Accountability Office (GAO). This exercise asked a panel of experts to examine the different mortality surveys that had been done. They did this with remarkable thoroughness and professionalism, asking those who had undertaken the surveys for their data and then replicating the methods to test the findings. They applied a high standard of peer review: the studies had to be replicable. All the studies involved making assumptions, and the panel varied these assumptions to explore the range of outcomes each one produced, which in turn assessed the robustness of each study’s methods.
There is a great deal of science and statistics involved in the estimation of mortality. There is also good judgement based on experience, which is where selecting and examining assumptions becomes important.
I was asked to be a member of the evaluation panel, based on my history of conducting disaster demography, including in Darfur in the 1980s. I was not able to take up the invitation because I was tied down at the Abuja peace talks. But my assessment of the different studies does not differ from the consensus of the experts.
The most reliable study is the one conducted by the Centre for Research on the Epidemiology of Disasters. The least reliable estimates were those provided by John Hagan and Eric Reeves. (See the GAO experts’ verdicts in the tables posted below.) The CRED study is also towards the lower end of estimates. That is significant. On close reading of the CRED method—which is a survey of surveys rather than an original field study—it is clear that the authors were careful to make pessimistic assumptions at every stage. This is unusual—normally they would have made medium or marginally optimistic assumptions about mortality. The reason for the pessimism was that the figures for mortality in Darfur had become politicized, with many advocates speaking about extremely high death rates that were not, in fact, supported by systematic evidence. CRED’s researchers did not want to be accused of minimizing the crisis.
Personally, I would have preferred to see CRED stick to median assumptions, and therefore come up with a slightly lower figure for estimated excess deaths. The reason for this is my own personal history of assessing famine mortality, which goes back to my PhD research in the 1980s. The basic question I sought to address in my study, later published as Famine that Kills, was: why did outside experts predict such high death rates in Darfur in 1984-85, rates which then failed to materialize despite the lack of relief assistance? I compared the predictions for excess deaths in that famine, which ranged from 175,000 to 2 million (most erring on the upper side), with the best demographic and epidemiological estimate, which was 95,000, and sought to explain why the foreign experts had got it so badly wrong.
The basic reason: the people of Darfur were far hardier and more skilled at surviving food crisis than outsiders appreciated. The commonest fallacy of that time (often repeated now) was that people not reached by foreign aid were ipso facto worse off than people who were reached.
In fact there are four pretty reliable rules of thumb concerning mortality during humanitarian crises. Number one, if an informed source gives a range of estimates for actual or predicted deaths, say "50,000 to 200,000 dead in the next year," the press and the humanitarian advocates will seize on the high end figure and neglect the low end or the middle range. Even when post-crisis studies bring the numbers down, the headline figure somehow remains imprinted on the historical record. So the Ethiopian famine of 1984-85 is still widely said to have killed one million, because senior UN officials said so. But—allowing for the fact that accurate figures are impossible to come by—the best estimates by demographers and epidemiologists are about half as high. I could give many other examples.
Number two, the high end figures rarely if ever materialize and the low end ones turn out to be more accurate. Relief workers on the ground, toiling among the most hungry and sick, also tend to run surveys that lead to mortality estimates that are slightly higher than would be found by a survey of the general population. For this reason, demographers often advise that the study of famine mortality should be undertaken by an external, experienced specialist, rather than a field worker.
The glaring exception to this is the mass murder in Rwanda in 1994, when the figures for deaths surpassed any expectation, but the Rwandan genocide is an exception to just about every rule. The crises of hunger and disease that followed in the refugee camps in 1994 and 1996 did however follow the rule.
Number three, a crisis of mass displacement follows a regular and predictable mortality curve. There is a sharp peak, followed by a decline that slowly levels off, bringing death rates down to normal levels and often slightly below normal. (See the posting of June 20: “Are things getting worse in Darfur?”) The reason for the "slightly below" normal is a combination of (a) good health facilities provided by relief agencies and (b) lower fertility rates during the crisis, which mean fewer young children; since children under five represent about two thirds of the mortality in a Sudanese population, this in itself reduces the crude death rate. A relief intervention can blunt the peak of mortality, accelerate the decline, and prevent further bumps as the crisis stretches on, but it cannot change the basic pattern. Thus it is common to have a protracted humanitarian crisis in which mortality rates in displaced populations are lower than pre-crisis levels. In Darfur, the peak followed the big offensives of 2003-04, and the falling away occurred in 2004-05.
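The fertility point in rule three is a piece of arithmetic worth making explicit. The sketch below uses invented illustrative rates (not actual Darfur data) to show how a smaller population share of under-fives mechanically lowers the all-ages crude death rate, even when the age-specific death rates themselves do not change:

```python
# Illustrative sketch (hypothetical rates, not Darfur data) of why a
# crisis-driven fall in fertility can push the crude death rate (CDR)
# below its pre-crisis level: under-fives carry most of the mortality,
# so shrinking their population share lowers the all-ages rate.

def crude_death_rate(share_under5, dr_under5, dr_over5):
    """All-ages deaths per 1,000 population per year."""
    return share_under5 * dr_under5 + (1 - share_under5) * dr_over5

# Hypothetical age-specific death rates (per 1,000 per year), chosen so
# that under-fives account for roughly two thirds of all deaths when
# they make up about 18% of the population.
DR_UNDER5 = 60.0
DR_OVER5 = 6.0

pre_crisis = crude_death_rate(0.18, DR_UNDER5, DR_OVER5)   # under-fives at 18%
post_crisis = crude_death_rate(0.12, DR_UNDER5, DR_OVER5)  # fewer young children

print(f"pre-crisis CDR:  {pre_crisis:.1f} per 1,000 per year")
print(f"post-crisis CDR: {post_crisis:.1f} per 1,000 per year")
```

With these invented numbers the CDR falls from about 15.7 to about 12.5 per 1,000 per year, with no change in anyone's individual risk of dying.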
Number four (as mentioned) is the "these are the lucky ones" fallacy. How many times does a journalist cradle a stick-thin child in a feeding center and say that this child is luckier than those who haven’t made it to the relief camp? It’s rarely correct, for reasons that disaster epidemiologists came to appreciate in the 1980s, with the result that wherever possible relief programs try to keep people at home rather than attracting them to relief centers.
Thus, experience counsels caution: people (especially tough and resourceful people, as in Darfur) have survived better than we expect in just about every crisis. Had I been a member of the GAO panel, I would have added my assessment to the majority opinion: CRED is credible, and (dare I say it) possibly on the high side.
This discussion has focused so far on those who have died from hunger and disease. What of deaths from violence? On this question, well-established epidemiological patterns are of less value. Methods for estimating violent deaths in a general population are much less refined than those for estimating deaths from disease and hunger. Jennifer Leaning of Physicians for Human Rights and I conducted a study of deaths in the Mogadishu war of 1992, in which we tried to develop some survey techniques. I also tried to do this in the Nuba Mountains a few years later. This is remarkably difficult, as the range of estimates for deaths in the Iraqi civil war attests. So more caution is in order than for the famine-type mortality estimates. But, once again, I would concur with the GAO’s panel that the CRED estimate is the most reliable. An estimated 30% of the deaths during the period of intense war in 2003-04 are attributable to violence.
What of studies published after the GAO completed its assessment? John Hagan and Alberto Palloni published a paper in Science in September 2006. This is a broad review of the various published studies of mortality in Darfur which uses methods comparable to those of CRED to examine the data and extract estimates for mortality. Its final estimates are within the range 170,000-255,000. While the authors describe these estimates as “conservative”, it should be noted that this is a figure for total deaths, not excess deaths over what would normally be expected.
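The distinction between total and excess deaths is worth making concrete. The sketch below uses invented round numbers (not the actual Darfur or Science figures) to show how an excess-death estimate is derived by subtracting the deaths expected at a normal baseline rate from the total deaths of the crisis period:

```python
# Hypothetical illustration of "total" vs "excess" deaths.
# All figures are invented round numbers, not Darfur estimates.

population = 4_000_000   # affected population
baseline_cdr = 12.0      # normal deaths per 1,000 per year
years = 2.5              # length of the crisis period

# Deaths that would have occurred anyway at the normal death rate
expected_deaths = population / 1000 * baseline_cdr * years  # 120,000

total_deaths = 300_000   # hypothetical total deaths over the crisis period

# Excess deaths = total deaths minus the expected baseline
excess_deaths = total_deaths - expected_deaths

print(f"expected (baseline) deaths: {expected_deaths:,.0f}")
print(f"excess deaths:              {excess_deaths:,.0f}")
```

On these invented figures, a "total deaths" headline of 300,000 corresponds to 180,000 excess deaths, which is why the two measures must not be compared directly.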
There is also a study by the group Bloodhound which focuses on those killed by violence in attacks on villages between April 2003 and September 2005. This is a rich data source that confirms the basic patterns, namely that the majority of attacks were by government forces and the Janjaweed, and that most of the killing took place in late 2003 and early 2004. When the authors come to extrapolate from their dataset to a global estimate of violent deaths they are on less firm ground, because of the methodological problems with estimating violent deaths alluded to above. Their estimate is that 87,200 people (range 57,000-128,000) were killed in attacks on villages. If correct, this more than doubles the CRED estimate for violent deaths, and raises the total excess deaths to the region of 250,000. However, until this study has been subjected to the same rigorous peer review as the others, its findings must be treated with caution.
In conclusion, the CRED study remains the best to date, though we must be prepared to revise our figures, upwards or downwards, as better data become available.
Let me repeat: there is no certainty in these figures. The reality could be different. But the pattern is both clear and familiar, and the best guess is approximately 200,000 excess deaths, plus or minus.
What of mortality since the end of 2005, after the reference period of the studies under review? The data for the displaced populations indicate a pattern, familiar from protracted emergencies, of crude death rates at normal levels, albeit with occasional bumps. There are few data for the populations not reached by relief agencies, but established patterns, survey reports and rapid assessments indicate that relatively normal conditions exist in many areas. (We have to remember of course that for decades, most of Darfur’s population has lived without access to any modern health services or relief distributions.) There is certainly no evidence for a famine. We know what a killing famine looks like, and rural Darfur does not resemble that.
Since the end of the major offensives in 2004, reports of violent deaths are compiled by the UN on a regular basis, though not published. There are peaks and lulls but the reports—which cover all significant incidents—indicate between 6,000 and 7,000 fatalities over the last two and a half years. The last three major spikes in violent deaths have been intra-Arab fighting (Terjam versus Mahariya militia) and the defeat of Sudan government forces by rebel groups in north Darfur.
It’s possible to vary the assumptions or introduce new variables which lead to wildly different estimates of excess deaths. At first glance, the arithmetic can seem plausible. I urge those who like to indulge in desktop math to examine the GAO study carefully. Its main conclusion is absolutely valid: the data indicate the severity of the crisis, but the accuracy of the estimates could be improved.
The estimation of mortality ought to be a purely scientific affair, free from moral coloring. But it never is. Predictions of mortality are used by relief agencies to sound the alarm and accelerate a response, or by human rights advocates to raise the political stakes. In Darfur, the figures have become more politicized than any in recent history.
In my view, it is imprudent to use upper-end estimates. A long-term consideration is that inflating the estimates can cheapen the currency of suffering. After famines that are said to kill hundreds of thousands, a crisis with "only" 50,000 expected deaths might not make the grade for our response. A medium-term concern is: what happens if a scientific survey finds that (for example) "only" 100,000 people died, instead of the 250,000 that were claimed? Does that mean that our effort was wasted? Or (in the case of killing) that the mass murderers are suddenly innocent of a lot of crimes? A mass murderer who is convicted of ten murders and acquitted of ten is no less a mass murderer. We should be able to detach ethics from statistics.
I fear that the Save Darfur campaign fell into the trap of using inflated estimates, and was caught out by the European Sudanese Public Affairs Council (ESPAC), which takes a consistently pro-Khartoum line. The inflation was unwise because it was unnecessary. The news headline has been "Save Darfur exaggerates," which is deeply unfortunate. Noting that President Omar al Bashir has said that only 9,000 people died in Darfur, it would be equally valid to trumpet that "Sudan’s spokesmen concede that 200,000 died." And the death of an estimated 200,000 people in Darfur, from massacre and man-made humanitarian disaster, is a crime of the first order.
A few months ago, there was an attack by Sudanese military helicopters on the village of Amarai. The first reports, put out by local SLA commanders, mentioned 26 dead. The SLA’s humanitarian coordinator, from his detention in a hospital in Kadugli, chided the commanders and called journalists to give the correct figure for fatalities, three. "We do not need to persuade the world there are atrocities committed in Darfur," he said, "but we need to keep ourselves honest."
GAO Experts’ Evaluation of Darfur Mortality Estimates:
In response to a communication from John Hagan, who pointed out an important error in the characterization of the study that he and Alberto Palloni published in Science in 2006, the relevant paragraph has been corrected. My apologies to Profs Hagan and Palloni. All comments are gratefully received, and corrections made where they are due.
Truthfulness Claims About Genocide in the New York Times
John Hagan
Should you take at face value the opinions of Sam Dealey, the author of the New York Times August 12 op ed, “An Atrocity That Needs No Exaggeration”? His New York Times op ed falsely claims that Save Darfur was found to have “breached standards of truthfulness” – Clause 7.1 of the British Advertising Standards Association – in reporting the death toll in Darfur. The British Association considered invoking this clause in response to a complaint by a Sudanese business group. The Association decided not to do so. Instead, it cited clauses about “divisions” and “matters of opinion.” It offered an “action” recommendation that Save Darfur in the future state its sources of opinions and claims. This was not a finding of “untruthfulness.”
Sam Dealey further misled New York Times readers by suggesting that a Government Accounting Report highly evaluated an estimate of 131,000 “excess” deaths in Darfur. He neglected to indicate that the first sentence of this GAO report said, “The experts we consulted did not consistently rate any of the death estimates as having a high level of accuracy and noted that all of the studies had methodological strengths and shortcomings.” This included an earlier higher estimate of our own. We subsequently published a “floor estimate” in Science to address the kinds of concerns raised by the GAO, as described briefly below.
Dealey also neglected to report that the GAO seriously questioned an April 2005 State Department report which estimated that as few as 63,000 died in the Darfur conflict, saying “Further, many experts believed that the lower end of State’s estimate was too low and found that published documents describing State’s estimate lacked sufficient information about its data and methods to allow it to be replicated and verified by external researchers.” Dealey did not mention that for more than a year following this dubiously low State Department estimate, major news organizations downgraded their reports to say that “tens of thousands” had died in Darfur, a number that sounded uncomfortably similar to Sudanese President Al Bashir’s claim of ten thousand deaths.
Dealey neglected to inform readers that our co-authored September 2006 Science article on “Death in Darfur” is the only peer reviewed scholarly estimate of mortality in Darfur to appear after the studies reviewed by the GAO report. Science is among the most highly regarded journals in the world. He neglected to inform readers that our estimate was of a “range between 170,000 and 255,000 deaths.” The purpose of our Science article was to establish a figure below which no reputable news source would go in its reporting. Eric Reeves, who has worked this issue the longest and hardest, offers higher estimates, which should also be seriously considered.
We said in Science that “although we cannot overcome the limitations in the basic information, on the basis of the surveys available, we conclude that the death toll in Darfur is conservatively estimated to be in the hundreds of thousands rather than the tens of thousands of people.” We then emphasized that “it is possible that the death toll is much higher.” Since the September 2006 Science article was published, nearly all news organizations, including the New York Times, have adopted the 200,000 or greater figure. However, just the month before our Science publication, on August 23, 2006, the New York Times had reported the low ball “tens of thousands” figure.
Sam Dealey makes false and misleading claims in his op ed about genocide exaggeration. Would the New York Times have published his op ed without the false “truthfulness” attack on Save Darfur? We shouldn’t be debating Save Darfur’s truthfulness, we should be acting to end this genocide and to allow its survivors to rebuild their lives. The New York Times was seriously misled by Dealey’s false and misleading claims.
Sam Dealey responds:

I appreciate Dr. Hagan’s constructive criticism of my article. Nevertheless, I do feel his main points must be addressed:
* "[Dealey’s] New York Times op ed falsely claims that Save Darfur was found to have ‘breached standards of truthfulness’ – Clause 7.1 of the British Advertising Standards Association – in reporting the death toll in Darfur."
I do not, in fact, specifically cite clause 7.1 in my op-ed. The ASA found Save Darfur violated two codes: "Division of Opinion" and "Matters of Opinion." I do not think it is in error to sum these up as breaching standards of objectivity and truthfulness. An opinion is an argument, not fact or truth. An egregious opinion is an argument that demands "robust substantiation" — which is precisely what the ASA requested from Save Darfur and obviously did not receive.
* "Sam Dealey further misled New York Times readers by suggesting that a Government Accounting Report highly evaluated an estimate of 131,000 ‘excess’ deaths in Darfur. He neglected to indicate that the first sentence of this GAO report said, ‘The experts we consulted did not consistently rate any of the death estimates as having a high level of accuracy and noted that all of the studies had methodological strengths and shortcomings.’"
In fact, I wrote: "So how many are dead in Darfur? As the G.A.O. study notes, reliable numbers are hard to come by. But the estimate that garnered the highest confidence was the one from the Center for Research on the Epidemiology of Disasters." That’s hardly misleading.
Dr. Hagan is correct to point out that the panel of experts regarded no study to be flawless in its accuracy. But as the remaining 70-odd pages of the report evidence, these experts also found more redeeming qualities in some studies than in others. The clear and convincing leader in this respect was CRED’s.
* "Dealey also neglected to report that the GAO seriously questioned an April 2005 State Department report which estimated that as few as 63,000 died in the Darfur conflict, saying ‘Further, many experts believed that the lower end of State’s estimate was too low and found that published documents describing State’s estimate lacked sufficient information about its data and methods to allow it to be replicated and verified by external researchers.’"
Dr. Hagan is again correct. I did not mention the State Department estimate. Neither did I write about Jan Coebergh’s, Eric Reeves’ nor the World Health Organization’s. But since Dr. Hagan raises the topic, it is worth mentioning that, in the G.A.O. study, the panel of Dr. Hagan’s peers consistently found State’s estimate of 63,000 to 146,000 excess deaths to be more reliable and sound than his own.
* Dealey neglected to inform readers that our co-authored September 2006 Science article on "Death in Darfur" is the only peer reviewed scholarly estimate of mortality in Darfur to appear after the studies reviewed by the GAO report. Science is among the most highly regarded journals in the world. He neglected to inform readers that our estimate was of a "range between 170,000 and 255,000 deaths."
In fact, in clear reference to his Science article, I wrote that "Dr. Hagan revised his estimate sharply downward" — in this instance, by more than one-third.
* Sam Dealey makes false and misleading claims in his op ed about genocide exaggeration.
I wrote that the exaggerated death tolls "hamper aid-delivery groups, discredit American policymakers and diplomats, and harm efforts to respond to future humanitarian crises." I firmly stand behind what I wrote — and from what I heard before and hear now, so too do the aid and diplomatic corps.
Dr. Hagan is right to be upset that his initial study for the Coalition for International Justice was panned by a panel of his peers. Rather than lash out at a journalist, he might ask Save Darfur why it did not do as he did: Back away from what appears to be a highly flawed study.
The New York Times today (August 22, 2007), in the lower left-hand corner of its op-ed page, published a correction to Sam Dealey’s "An Atrocity That Needs No Exaggeration" (published August 12, 2007).
It reads as follows:
Correction
A recent Op-Ed article about the death toll in Darfur incorrectly characterized a ruling by the British Advertising Standards Authority on Save Darfur Coalition advertisements. The authority did not find that the ads, which put the number of dead at 400,000, "breached standards of truthfulness." Rather, it told Save Darfur to present the figure as opinion, not fact.
*It is my opinion that the "exaggeration" in Dealey’s story was his "truthfulness."*
Dr James Smith, CEO of the Aegis Trust, weighs in:
Note also Dr Smith’s article on the Aegis Trust webpage.
Professor Hagan,
I think it is misleading to claim that your article in Science is the only peer-reviewed estimate. Certainly Science is a prestigious journal, but to imply that the GAO report was not a “review” by your academic peers or that a review by Science is somehow more rigorous than the GAO review, sells the quality of the GAO report short.
For the same reasons that the estimates in Science should be taken seriously, the CRED estimates should be held in higher regard than the CIJ and Reeves estimates.