DARE to question things implemented with no evidence

A retrospective of the madness of crowds…


DARE Marks a Decade of Growth and Controversy

Today [Sep. 9, 1993], 5,200 communities in all 50 states have DARE programs, and more than 5.5 million children this year will learn about drugs through police officers using the DARE curriculum. At home, DARE has been called the first test of community-based policing, a law enforcement notion favored by Police Chief Willie L. Williams that involves bringing police officers into contact with the communities they patrol.

A brief history of DARE

DARE was founded in 1983 as a partnership between the Los Angeles Police Department and the L.A. public schools. The idea was simple: Officers would go into schools to talk to kids, “boosting the self-esteem of students so that they can resist the temptation to use drugs,” as the Los Angeles Times put it in a 10-year retrospective on the program in 1993.

The program drew bipartisan praise and spread like wildfire. Politicians realized that by supporting DARE, they could paint themselves as pro-cops and pro-kids: a win-win. President Ronald Reagan proclaimed the first “National DARE Day” in 1988, a tradition that continued well into the Obama administration.

Eventually, the program was in place in up to 75 percent of the nation's school districts, by DARE's own count. At its height, the group boasted an eight-figure budget, with much of that money coming from government sources. Individual state affiliates raised millions more.

But with success came scrutiny. Public health researchers started looking for evidence that the program was meeting its goals of reducing teen drug use. The first wave of studies, published in the early 1990s, didn’t find any.

Studies on effectiveness

1992 – Indiana University

Researchers at Indiana University, commissioned by Indiana school officials in 1992, found that those who completed the D.A.R.E. program subsequently had significantly higher rates of hallucinogenic drug use than those not exposed to the program.

1994 – RTI International

In 1994, three RTI International scientists evaluated eight earlier quantitative studies of DARE's efficacy that met their criteria for rigor. The researchers found that DARE's long-term effect couldn't be determined, because the corresponding studies were "compromised by severe control group attrition or contamination." In the short term, however, the study concluded that "DARE imparts a large amount of information, but has little or no impact on students' drug use," and that many smaller, interactive programs were more effective.

After the 1994 Research Triangle Institute study, an article in the Los Angeles Times reported that the "organization spent $41,000 to try to prevent widespread distribution of the RTI report and started legal action aimed at squelching the study." The director of publications of the American Journal of Public Health told USA Today that "D.A.R.E. has tried to interfere with the publication of this. They tried to intimidate us."

1995 – California Department of Education

In 1995, a report to the California Department of Education by Joel Brown, Ph.D., stated that none of California's drug education programs worked, including D.A.R.E.: "California's drug education programs, D.A.R.E. being the largest of them, simply doesn't work. More than 40 percent of the students told researchers they were 'not at all' influenced by drug educators or programs. Nearly 70 percent reported neutral to negative feelings about those delivering the antidrug message. While only 10 percent of elementary students responded to drug education negatively or indifferently, this figure grew to 33 percent of middle school students and topped 90 percent at the high school level." In some circles, educators and administrators have acknowledged that DARE may in fact have increased students' exposure to and knowledge of unfamiliar drugs and controlled substances, leading to experimentation with and consumption of narcotics at a younger age. Criticism focused on the program's failure and on the misuse of taxpayer dollars for results that were ineffective or negative statewide.

1998 – National Institute of Justice

In 1998, a grant from the National Institute of Justice to the University of Maryland resulted in a report to the NIJ which, among other findings, concluded that "D.A.R.E. does not work to reduce substance use." D.A.R.E. expanded and modified the social competency development area of its curriculum in response to the report. Research by Dr. Dennis Rosenbaum in 1998 found that D.A.R.E. graduates were more likely than others to drink alcohol, smoke tobacco, and use illegal drugs. Psychologist Dr. William Colson asserted in 1998 that D.A.R.E. increased drug awareness so that "as they get a little older, they (students) become very curious about these drugs they've learned about from police officers." The research evidence available in 1998 indicated that officers were unsuccessful in preventing that increased awareness and curiosity from being translated into illegal use; it suggested that, by exposing young, impressionable children to drugs, the program was in fact encouraging and nurturing drug use. Studies funded by the National Institute of Justice in 1998 and by the California Legislative Analyst's Office in 2000 also concluded that the program was ineffective.

1999 – Lynam et al.

A ten-year study was completed by Donald R. Lynam and colleagues in 1999, following one thousand D.A.R.E. graduates in an attempt to measure the effects of the program. After the ten-year period, no measurable effects were noted. The researchers compared levels of alcohol, cigarette, marijuana, and other illegal drug use before the D.A.R.E. program (when the students were in sixth grade) with post-D.A.R.E. levels (when they were 20 years old). Although there were some measurable effects on students' attitudes toward drug use shortly after the program, these effects did not persist over the long term.

2001 – Office of the Surgeon General

In 2001, the Surgeon General of the United States, David Satcher, M.D., Ph.D., placed the D.A.R.E. program in the category of "Ineffective Primary Prevention Programs." The U.S. General Accounting Office concluded in 2003 that the program was counterproductive in some populations, with those who graduated from D.A.R.E. later having higher-than-average rates of drug use (a boomerang effect).

2007 – Perspectives on Psychological Science

In March 2007, the D.A.R.E. program was included on a list of treatments with the potential to cause harm to clients, published in the APS journal Perspectives on Psychological Science.

2008 – Harvard

Carol Weiss, Erin Murphy-Graham, Anthony Petrosino, and Allison G. Gandhi, "The Fairy Godmother—and Her Warts: Making the Dream of Evidence-Based Policy Come True," American Journal of Evaluation, Vol. 29, No. 1, 29–47 (2008).

Evaluators sometimes wish for a Fairy Godmother who would make decision makers pay attention to evaluation findings when choosing programs to implement. The U.S. Department of Education came close to creating such a Fairy Godmother when it required school districts to choose drug abuse prevention programs only if their effectiveness was supported by "scientific" evidence. The experience showed advantages of such a procedure (e.g., reduction in support for D.A.R.E., which evaluation had found wanting) but also shortcomings (limited and in some cases questionable evaluation evidence in support of other programs). Federal procedures for identifying successful programs appeared biased. In addition, the Fairy Godmother discounted the professional judgment of local educators and did little to improve the fit of programs to local conditions. Nevertheless, giving evaluation more clout is a worthwhile way to increase the rationality of decision making. The authors recommend research on procedures used by other agencies to achieve similar aims.

2009 – Texas A&M

"The Social Construction of 'Evidence-Based' Drug Prevention Programs: A Reanalysis of Data from the Drug Abuse Resistance Education (DARE) Program," Evaluation Review, Vol. 33, No.4, 394–414 (2009). Studies by Dave Gorman and Carol Weiss argue that the D.A.R.E. program has been held to a higher standard than other youth drug prevention programs. Gorman writes, "what differentiates D.A.R.E. from many of the programs on evidence-based lists might not be the actual intervention but rather the manner in which data analysis is conducted, reported, and interpreted." Dennis M. Gorman and J. Charles Huber, Jr.

The U.S. Department of Education prohibits its funding from being used to support drug prevention programs that have not demonstrated their effectiveness. Accordingly, in 2004 D.A.R.E. America instituted a major revision of its curriculum, which is currently being evaluated for its effectiveness in reducing drug use.

The U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) has identified alternative start-up regional programs, but none of these has significant longevity, nor have they been subjected to intense scrutiny.

Project D.A.R.E. Outcome Effectiveness Revisited

Objectives. We provide an updated meta-analysis on the effectiveness of Project D.A.R.E. in preventing alcohol, tobacco, and illicit drug use among school-aged youths.

Methods. We used meta-analytic techniques to create an overall effect size for D.A.R.E. outcome evaluations reported in scientific journals.

Results. The overall weighted effect size for the included D.A.R.E. studies was extremely small (correlation coefficient = 0.011; Cohen d = 0.023; 95% confidence interval = −0.04, 0.08) and nonsignificant (z = 0.73, NS).

Conclusions. Our study supports previous findings indicating that D.A.R.E. is ineffective.
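How small is the reported effect? As a rough sanity check on the numbers in the Results above, the short Python sketch below (Python is not used elsewhere in this piece; the variable names are mine) converts the pooled correlation coefficient into Cohen's d with the standard formula d = 2r / sqrt(1 - r^2) and backs an approximate z statistic out of the reported confidence interval, assuming the interval is on the d scale, as its midpoint suggests. Small discrepancies against the published figures are rounding.

    import math

    # Pooled statistics as reported in the abstract above.
    r = 0.011                        # overall weighted correlation coefficient
    ci_low, ci_high = -0.04, 0.08    # reported 95% CI (assumed to be on the d scale)

    # Standard conversion from a correlation effect size to Cohen's d.
    d = 2 * r / math.sqrt(1 - r ** 2)
    print(f"Cohen's d from r = {r}: {d:.3f}")     # ~0.022 vs. 0.023 reported

    # Back out an approximate standard error and z statistic from the CI width.
    se = (ci_high - ci_low) / (2 * 1.96)
    z = d / se
    print(f"approx. SE = {se:.3f}, z = {z:.2f}")  # ~0.72 vs. z = 0.73 reported

    # |z| < 1.96 means the pooled effect is not statistically significant at the
    # 5% level, matching the abstract's "nonsignificant" conclusion.
    print("significant at alpha = 0.05:", abs(z) >= 1.96)

By either measure the pooled effect is statistically indistinguishable from zero: a Cohen's d of roughly 0.02 sits an order of magnitude below even the conventional threshold for a "small" effect (d = 0.2).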