Facts are facts and will not disappear on account of your likes. — Jawaharlal Nehru
For the last week and a half, I have been at summer camp for social psychologists.
No, that isn’t a made-up thing. And yes, I am an adult. (Well, kind of.)
From 9:00am to 5:00pm every day, I have been taking classes with two of the foremost professors on psychological interventions: programs that apply social psychology to improve the world around them. But social programs like this aren’t always based on rigorous empirical research, and when they’re not, they may not work as we expect…
So, let’s look at some social programs that began with good feelings, rather than good facts.
D.A.R.E.
Beginning in the mid-’80s and running through the 2000s, the U.S. government implemented an anti-drug program (Drug Abuse Resistance Education; D.A.R.E.) across 75% of the nation’s schools, costing taxpayers three-quarters of a billion dollars annually.
Designed by a team at the L.A. police department, the program sent uniformed officers to schools across the country to educate students about the dangers and consequences of drug use. And from educators to parents, everyone supported it.
The advice was coming from authority figures. The message was clear and concise with straightforward conclusions. What was not to like?
. . .Maybe the fact that every piece of evidence showed the program was useless.
Shortly after D.A.R.E.’s creation, researchers (i.e., social psychologists) began running short- and long-term studies on the program, and each one found it completely ineffective. In fact, in some cases, it actually made kids more likely to use drugs.
Now, the program directors at D.A.R.E. obviously didn’t like this, and because the government had already earmarked hundreds of millions of dollars for the program, there was a lot of pushback (e.g., the national D.A.R.E. organization bullied scientists into suppressing research and filed lawsuits against newspapers that published unflattering reports).
Amazingly, even after study after study after study (and these are just a few) showed its ineffectiveness, the program still flourished because it “felt right”—which highlights the importance of social psychological work:
Feelings do not equal facts, and only empirical research can disentangle the two—a point made clear in another educational intervention.
WAAAAAAAA!
Decades and decades of research have already shown that abstinence training in schools is completely bunk (i.e., no reliable study shows that these “abstinence education” programs actually reduce high school sexual activity). However, another attempt to reduce the likelihood of teen sex (or at least teen pregnancies) is to have students care for a “virtual baby.”
Maybe yours was a sack of flour or a decorated egg, but today’s dolls are the cutting edge of baby simulation. They can cry, burp, and demand soothing, and a pack of ten dolls plus equipment costs about $18,000; however, this price point hasn’t stopped schools from using them: over 2,000 schools in Australia and over 65% of US schools use these virtual babies.
Again, the original idea behind this program stems from people’s intuition: if students see how hard it is to care for something like a sack of flour, they will recognize how hard (and unappealing) it would be to care for an actual child.
However (and you knew this “however” was coming), what does the empirical research have to say?
A recent study with over 3,000 students across more than 55 schools showed that students who cared for one of these virtual babies (vs. those who didn’t) were actually more likely to get pregnant: whereas only 4.3% of students in the control condition had a teen baby, 7.6% of those in the baby-simulation program did.
FOR SCIENCE!
So, as we’ve learned today, any program—even those with the best intentions based on the best intuitions (i.e., hunches)—can result in less-than-optimal outcomes. And were it not for social psychologists intervening on these interventions, we would continue to fund programs that achieve the exact opposite of their intention.
Of course, empirical data (i.e., science) is far from perfect, but it’s much better than what “feels right” at actually getting what is right.
Factually,
jdt
Psychophilosophy to Ponder: Can you think of some reasons as to why the baby-simulation program backfired? For example, some of the young women reported becoming attached to the doll. Can you think of other reasons? Relatedly, why might the D.A.R.E. program, where these anti-drug messages come from police officers, have been ineffective or backfired?
Brinkman, S. A., Johnson, S. E., Codde, J. P., Hart, M. B., Straton, J. A., Mittinty, M. N., & Silburn, S. R. (2016). Efficacy of infant simulator programmes to prevent teenage pregnancy: a school-based cluster randomised controlled trial in Western Australia. The Lancet, 388(10057), 2264-2271.
West, S. L., & O’Neal, K. K. (2004). Project DARE outcome effectiveness revisited. American Journal of Public Health, 94(6), 1027-1029.
Anonymous
A very interesting post.
It is funny (not in the ha-ha way) that people in general will go with feelings rather than the (empirical) facts presented.
Case in point: I am affiliated with a casino, and no matter how much I share my (infinite :)) knowledge with friends and relatives on how they should bet in any one particular game, they will ultimately bet the way their “gut” tells them to. In fact, when their “gut” bet gives them a winning outcome, they will soon throw away the fact-based betting chart and play the way that they feel most comfortable with; this reinforces the bad behavior, which usually leads to a phone call asking me for money.
jdt
I’m glad you enjoyed the post(!), and I’m sorry for the tardy response. (I’ve been returning from a two-week fellowship and moving into a new apartment!)
One of the things about feelings vs. facts is that facts can always be “misrepresented” or “discounted,” whereas feelings are impossible to ignore or deny. As you know, there are times when feelings can be accurate, but often, they are biased by a number of factors that fail to take in the whole picture.
And what you say about gut-based betting is spot on! Indeed, your example succinctly captures a theory my adviser came up with, “the self-validation hypothesis.” In this instance, the theory would apply to the “validating” role of emotions in confirming that the choice the person made (e.g., betting on their gut) was the correct one.