
BEworks Conversations with David Pizarro: From Piaget to COVID-19; The Importance of Behavioral Science, Data Collection and Experimentation

Written by Kelly Peters | Jun 29, 2020 3:10:27 PM

Associate Professor, Department of Psychology, Cornell 

Chief Science Officer, BEworks 

Co-host of the Very Bad Wizards Podcast 

 

David Pizarro’s research interests are in moral judgment, and in the influence of emotions on thinking and deciding. He is particularly interested in moral intuitions (especially concerning moral responsibility, and the permissibility or impermissibility of certain acts), and in biases that affect moral judgment.  

 

In this conversation, Kelly and David discuss how we evaluate information, and how behavioral science, data collection, and experimentation are key to understanding the pandemic.


Kelly Peters: We're in an era where the stages of development around things like curiosity and skepticism have us in a place where a large percentage of our society is not exhibiting good cognitive discipline about what they believe to be true, or want to be true. Unfortunately, that's even the position of political leadership.

 

Left to our own devices, we're not in an environment conducive to healthy scientific inquiry. I'd love to get your response to that.

 

David Pizarro: I was in a conversation the other day where somebody pointed out the importance of facts. I thought to myself, why would you ever need to point out the importance of facts? It seems right that factual information is something we would all want. I think there is a deep problem. I think one of the deepest problems that psychologists could contribute to is this very question. What is going on with people who don't seem receptive to what we would think of as reasonable information that comes from the scientific community, or from experts at large?

 

I think the easy thing to do, especially for people who believe themselves to be on the side of science and to be championing the causes that people on that side of the political spectrum believe in, is to say, “Well, some people are stupid, and I'm not.” I think that's exactly the wrong way to approach the problem. What we're seeing is an amplification of blind spots.

 

Importantly, I think nobody is immune to this. This is a problem we all face. What kind of information, or what kinds of things, you end up being biased toward can obviously have a great impact. If I'm unreasonable about what food I eat, who cares; but if I'm unreasonable about my views on climate change, that's a problem.

 

I think we're all deeply unreasonable. I think the problem is magnified by a set of conditions that we're seeing right now. I think the biggest condition is that we have, more than ever, access to 'information'. We have access to a ton of material. You might think we have access to more facts than ever; I can log on to snopes.com and see whether something is true or not. I can find authoritative sources just by googling it. Why would people be less amenable to the truth? I think it is simply because the more information that is out there, the more able we are to find information that is consistent with what we believe. The big irony of this information age is, I think, that it has caused us to believe that we have firm ground to stand on when, in fact, all we're doing is repeating information that's out there that is consistent with what we already believe. That, I think, is a danger for everybody.

 

 

David Pizarro: One of the things I like to ask my students sometimes is “When's the last time you changed your mind about something important to you?” I think that when we reflect on this, hopefully it gives us a little sense of how rare it is. At least in my experience. Maybe I'll ask you, Kelly, when's the last time you changed your mind about something that was important?

 

Kelly Peters: So much of my time right now is spent trying to figure out what's true and what's not true around COVID-19. I'm finding it very difficult to keep up. It's such a fascinating process, because just when you think you understand something and have a hold on it, those facts turn out to have such a short half-life right now. I don't even know whether what I'm standing on is firm enough to support a strong conviction.

 

Having heard that question a few years ago, and having also learned about the statistical technique of Bayesian modeling, which is just such a cool way to look at how we process new information, has at least made me more aware of the fact that we don't change our minds. So, probably, I've changed my mind about the fact that I actually change my mind.
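
To make the Bayesian idea concrete, here is a minimal sketch of belief updating. It is purely illustrative, not something discussed in the conversation, and the hypothesis, prior, and likelihood numbers are all invented:

```python
# Minimal sketch of Bayesian belief updating (all numbers invented for illustration).
# Hypothesis H: "this particular claim about COVID-19 is true."

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) given P(H) and the two likelihoods."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1.0 - prior)
    return numerator / denominator

belief = 0.50                     # start genuinely uncertain
for _ in range(3):                # three pieces of moderately supportive evidence
    belief = bayes_update(belief, p_evidence_given_h=0.8, p_evidence_given_not_h=0.4)
    print(round(belief, 3))       # prints 0.667, 0.8, 0.889; beliefs move, but gradually
```

The point of the sketch is the direction of travel: evidence should shift a belief by an amount that depends on how strongly it discriminates between the hypotheses, rather than leaving the prior untouched.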

 

David Pizarro: COVID is, unfortunately, a very good example because part of the difficulty in acquiring beliefs rationally is knowing what it means to accept something to be true with some degree of probability. I think the most palpable way for me to explain it is ‘What does it mean when there is a 70% chance of rain?’ Well, luckily, that's an easy one. If you're of a certain personality type, you're just going to carry an umbrella. You might carry an umbrella if there's a 10% chance of rain.

 

We have to convert probabilistic information, a degree of uncertainty, into an actionable item. I don't know whether masks are effective; should I wear one? Well, I'm 63% sure that if everyone wears one, it'd be a good thing. Does that mean that you carry one? If it dips below that certainty level, does that mean you don't?
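
One way to make that conversion explicit, purely as an illustration rather than anything proposed in the conversation, is a simple expected-cost threshold: act when the believed probability of a benefit, weighted by the cost of being wrong, outweighs the cost of acting. The probabilities and costs below are invented:

```python
# Illustrative expected-cost rule for turning a probability into an action.
# All probabilities and costs are invented for the sketch.

def should_act(p_benefit, cost_of_acting, cost_if_wrong_to_skip):
    """Act when the expected cost of inaction exceeds the certain cost of acting."""
    return p_benefit * cost_if_wrong_to_skip > cost_of_acting

# Umbrella: carrying one is cheap, getting soaked is annoying -> act even at 10%.
print(should_act(p_benefit=0.10, cost_of_acting=1, cost_if_wrong_to_skip=20))  # True

# Mask: 63% sure it helps, wearing one is cheap, the downside of skipping it is large.
print(should_act(p_benefit=0.63, cost_of_acting=1, cost_if_wrong_to_skip=50))  # True
```

On this kind of account, the asymmetry in costs, not the raw probability, is what makes a 10% chance of rain enough to carry an umbrella and a 63% belief enough to wear a mask.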

 

These are pretty complex decisions that we have to make, when what our biology is really good at is driving behavior. All of this stuff is a way to navigate our environment and keep us safe. Our nervous system is processing information that's going to guide our behavior, and it wasn't built to rationally acquire beliefs in this very fine-grained manner. It was built to tell you ‘Should I walk down that path or not?’ or ‘Should I eat that thing or not?’ and those two don't go well together. It takes a lot of thinking and undoing our natural ways of thinking.

 

 

Kelly Peters: How do you think behavioral science has been doing in terms of its contribution to COVID-19? Are we doing enough? Are we doing the right things? Have we been doing some of the wrong things?

 

David Pizarro: I have some real opinions on this. First, let me say, as a behavioral scientist myself, we're privy to all the ugly errors that we make. We know that the process can be fraught with error, but that doesn't mean that I don't believe that we're making progress and that we're doing good.

 

How we respond to this crisis is something that I'm a little worried about, because a lot of behavioral scientists have jumped on giving advice, quickly and immediately, and this is well intentioned. I study disgust, and a lot of that is about disease and disease avoidance. It seems as if I ought to have something really interesting and important to say about this pandemic, but I think that we've jumped the gun as a field; obviously not everybody. I think that the quickness with which we've sought to give advice based on what we know might have been a mistake.

 

I think that it might have been a mistake because there is a whole lot of speculation on how to apply some of the things we know. Even if we believe that we know those things 100%, it's unclear whether we should apply them in this specific context. Behavioral science, like most sciences, is extremely messy. What we're trying to do, usually, is to isolate variables. We're trying to narrow the question down and present it in a way that we can come up with a particular answer about how the mind works or why humans behave the way they do. That doesn't mean that we're very good at predicting human behavior.

 

That doesn't mean that we ought not use behavioral science. This is a great instance in which the scientific method ought to be applied to figure out how people are responding to this pandemic. Another thing is this: the only good way to know whether something is true or not is to try to verify it with empirical methods, unless you're talking about logical truths. If we want to know how people will respond to a message to wear a mask, we should study how people are responding to the message of wearing a mask. Now, that can be informed by all kinds of studies we've done in the past, but we won't really know about this problem until we study this problem.

  

Kelly Peters: We've had many conversations about the differences between research that has been done in the lab and research that's done in the real world. BEworks was born primarily to overcome that gap and to improve the nature of what we know, through research, about human behavior, because the real world is a much more complex and dynamic place. Being able to find things out in that real world can add such richness to the knowledge that we have. Just to play back what you said, one of the things that concerns you is the jump to advice. You're saying that because what we've tested in the lab is very basic and very controlled, and because context matters, we're not in a position, in a new and evolving science, to extrapolate; even if a finding is strong in the lab, it's still a lab. It's not the real world.

 

Then you talked about the need for empirical evidence. Basically, the need for experiments. Let's get practical here. We have limited time, limited budget, so many unknowns. How would we run experiments?

 

David Pizarro: This is getting into the weeds, but getting into the weeds in exactly the way that I think is important, not only for what we do at BEworks, but for behavioral scientists at large. There are obvious things, like deciding where to prioritize, how to budget, and how to use that budget to collect data.

 

I think one of the most important things that we should have is an infrastructure that allows us to continue collecting data. By that I don't mean creepy ‘Facebook knowing what you want to buy’ data. I mean things like taking regular measures of how people are responding in your county, in your city, in your state. It's sometimes very hard to do a true experiment where we're randomly assigning people to one condition versus another, but what we call quasi-experiments are very possible, where you take a city that instituted a lockdown and one that didn't, matched very similarly, and you try to see what the outcomes are.
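
As a purely illustrative sketch of that kind of matched comparison, and not an analysis discussed in the conversation, a difference-in-differences style calculation on two invented cities might look like this (all names and case counts are made up):

```python
# Illustrative matched-cities comparison (difference-in-differences style).
# City labels and weekly case counts are invented for the sketch.

# Weekly new cases before and after one city institutes a lockdown.
lockdown_city   = {"before": [120, 130, 125], "after": [110, 90, 70]}
comparison_city = {"before": [118, 128, 127], "after": [135, 150, 160]}  # matched, no lockdown

def mean(xs):
    return sum(xs) / len(xs)

# Change within each city, then the difference between those changes.
change_lockdown   = mean(lockdown_city["after"])   - mean(lockdown_city["before"])
change_comparison = mean(comparison_city["after"]) - mean(comparison_city["before"])
estimated_effect  = change_lockdown - change_comparison

print(round(estimated_effect, 1))  # a negative value suggests the lockdown reduced cases
```

The matched comparison city's trend stands in for what the lockdown city would likely have done anyway, which is the point of matching when random assignment isn't possible.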

 

Right now, we're suffering from a failure to even acquire data. The data look really, really different if you assume that we don't have accurate reporting. In many cases, it's not that people are hiding the numbers; it's that we didn't have the infrastructure to collect these data.