When We Got It Wrong

Facts are facts and will not disappear on account of your likes. — Jawaharlal Nehru

For the last week and a half, I have been at summer camp for social psychologists.

No, that isn’t a made-up thing. And yes, I am an adult. (Well, kind of.)

From 9:00 a.m. to 5:00 p.m. every day, I have been taking classes with two of the foremost professors on psychological interventions: programs that apply social psychology to improve the world around us. But social programs like these aren't always based on rigorous empirical research, and when they're not, they may not work as we expect…

So, let’s look at some social programs that began with good feelings, rather than good facts.


Beginning in the mid-'80s and running through the 2000s, the U.S. government implemented an anti-drug program, Drug Abuse Resistance Education (D.A.R.E.), in 75% of the nation's schools, costing taxpayers three-quarters of a billion dollars annually.

Designed by a team at the Los Angeles Police Department, the program sent uniformed officers to schools across the country to educate students about the dangers and consequences of drug use. And from educators to parents, everyone supported it.

The advice was coming from authority figures. The message was clear and concise with straightforward conclusions. What wasn’t there to like?

. . .Maybe the fact that every piece of evidence showed the program was useless.

Shortly after D.A.R.E.'s creation, researchers (i.e., social psychologists) began running short- and long-term studies on the program, and each one found it to be completely ineffective. In fact, in some cases, it actually made kids more likely to use drugs.

Now, the program directors at D.A.R.E. obviously didn’t like this, and because the government had already earmarked hundreds of millions of dollars for the program, there was a lot of pushback (e.g., the national D.A.R.E. organization bullied scientists into suppressing research and filed lawsuits against newspapers that published unflattering reports).

Amazingly, even after study after study after study (and these are just a few) showed its ineffectiveness, the program still flourished because it “felt right”—which highlights the importance of social psychological work:

Feelings do not equal facts, and only empirical research can disentangle the two—a point made clear in another educational intervention.


Decades of research have already shown that abstinence training in schools is complete bunk (i.e., no reliable study shows that these "abstinence education" programs actually reduce high school sexual activity). However, another attempt to reduce the likelihood of teen sex (or at least teen pregnancies) is to have students care for a "virtual baby."

[Image: what the virtual babies look like.]

Maybe yours was a sack of flour or a decorated egg, but today's dolls are the cutting edge of baby simulation. With the ability to cry, burp, and demand cooing, a pack of ten dolls plus equipment costs about $18,000; however, this price point hasn't stopped schools from using them: over 2,000 schools in Australia and over 65% of US schools use these virtual babies.

Again, the original idea behind this program stems from people's intuition: if students saw how hard it was to care for something like a sack of flour, they would recognize how hard (and unappealing) it would be to care for an actual child.

However (and you knew this “however” was coming), what does the empirical research have to say?

A recent study of over 3,000 students across more than 55 schools showed that students who cared for one of these virtual babies (vs. those who didn't) were actually more likely to get pregnant: whereas only 4.3% of students in the control condition had a teen birth, 7.6% of those in the baby-simulation program did.


So, as we've learned today, any program—even one with the best intentions based on the best intuitions (i.e., hunches)—can result in less-than-optimal outcomes. And were it not for social psychologists intervening on these interventions, we would continue to fund programs that were having the exact opposite effect.

Of course, empirical data (i.e., science) is far from perfect, but it's much better than what "feels right" at getting at what is right.


Psychophilosophy to Ponder: Can you think of some reasons why the baby-simulation program backfired? For example, some of the young women reported becoming attached to the doll. Can you think of other reasons? Relatedly, why might the D.A.R.E. program, where these anti-drug messages came from police officers, have been ineffective or backfired?


Brinkman, S. A., Johnson, S. E., Codde, J. P., Hart, M. B., Straton, J. A., Mittinty, M. N., & Silburn, S. R. (2016). Efficacy of infant simulator programmes to prevent teenage pregnancy: A school-based cluster randomised controlled trial in Western Australia. The Lancet, 388(10057), 2264–2271.

West, S. L., & O'Neal, K. K. (2004). Project DARE outcome effectiveness revisited. American Journal of Public Health, 94(6), 1027–1029.

Author: jdt

Jake writes weekly posts every Wednesday on the intersection of psychology and philosophy. To learn more about him, or to propose a topic you'd like him to cover, go to https://everydaypsychophilosophy.com/contact.
