From clickbait and sponsored content to misrepresented statistics and cherry-picked data, there’s a slew of not-quite-true to flat-out-false propaganda parading around as fact in this age of information sharing.
Before changing your diet in light of a new study or hitting that “share” button attached to a news story from an unknown source, you may want to ask yourself: Is this bit of information even plausible?
Daniel J. Levitin’s new book, “A Field Guide to Lies: Critical Thinking in the Information Age,” provides readers with valuable tools for separating the credible from the incredible as they sift through the barrage of material thrown at them each day.
Levitin, 58, is a neuroscientist, cognitive psychologist, accomplished music producer and author of three consecutive best-selling books including “This Is Your Brain on Music.” He’s dean of the College of Social Sciences at Minerva Schools at KGI and a two-time graduate of the University of Oregon, earning his master’s and doctoral degrees in Eugene after graduating from Stanford with honors.
He will be at Powell’s City of Books, 1005 W Burnside St., at 7:30 p.m. Sunday, Sept. 11, 2016, for a book signing and discussion of his latest title, released by Dutton publishing.
Emily Green: Is a book like “A Field Guide to Lies” increasingly important now that misinformation is so prevalent on the internet?
Daniel J. Levitin: I think critical thinking is increasingly important, and my book is my attempt to convey that. But I think what’s happened is educators, parents and grandparents, for so many generations, have focused on teaching kids what they need to know in terms of facts. You know, “In 1492 Columbus sailed the ocean blue. The sun is 93 million miles away and it takes 8 1/2 minutes for light to reach us from there.” All this stuff we had to memorize: “9 times 5 is 45.” But we’ve fallen behind in teaching kids how to think and to be critical thinkers. I think it’s more urgent now than ever because there’s so much misinformation. We have to be our own deciders of what’s true and what’s not, because there’s nobody on the internet doing it for us.
E.G.: If you had one piece of advice for someone who is about to click “share” on a piece of news or information – what should they think about before they decide to spread it?
D.J.L.: I would take a close look at it and see if the claim is even plausible. If there are numbers involved, do the numbers work out? I’m not talking about dealing with a difficult equation, but just basic knowledge.
You can test plausibility when someone makes a claim about gas mileage or financial matters or rate of return. There was a famous headline that said the cost of something went down by 1,200 percent. Whatever the cost of something is, if it goes down by 100 percent, it’s at zero. So if it goes down 200 percent, somebody is now paying you what you used to pay for the product. If it goes down by 1,200 percent, that just doesn’t seem plausible; that’s somebody misunderstanding percentages. The second thing is: I would do a quick search to see if it’s a hoax. Snopes.com and other sites are helpful there. And then be willing to retract it later if it turns out you were wrong.
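The arithmetic behind that headline check takes only a few lines. As a quick illustration (the $50 price is my made-up number, not Levitin’s), here is why a drop of more than 100 percent can’t describe a cost:

```python
def price_after_decrease(price, percent_drop):
    """Price after falling by the given percentage."""
    return price * (1 - percent_drop / 100)

print(price_after_decrease(50, 100))   # a 100 percent drop takes a $50 price to 0.0
print(price_after_decrease(50, 1200))  # a "1,200 percent" drop yields -550.0: implausible
```

Any claimed decrease above 100 percent produces a negative price, which is the tip-off that someone has misunderstood percentages.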
E.G.: One of the things you brought up in your book that really got me, because I have one of these stories, and I think a lot of people do, is that scenario when you’re overseas on vacation and you run into someone you know. It seems to be this incredible coincidence. Can you explain the error in thinking this way?
D.J.L.: This is a brief excursion into probability theory, and you have to be careful in how you express surprise at an outcome. In particular, you have to be careful in how you describe the event.
So, you’re overseas, you run into somebody that you haven’t seen since college. You’re surprised to see them, and you think, “Gee, what are the odds that I would run into Tony standing in front of the Mona Lisa in the Louvre?” That seems astronomical, and it is. The odds that you would run into Tony on that particular day in that particular place are very long unless the two of you had coordinated your trips.
But then you have to back up and say: What if I hadn’t run into Tony in front of the Mona Lisa, but anywhere in the Louvre? Would I be just as surprised? Well, I guess I would be. What if I hadn’t run into him at the Louvre, but at a restaurant or a café or Charles de Gaulle Airport? Would I be just as surprised? Yeah – I think I would be. For that matter, what if it wasn’t Tony? What if I ran into anybody I’ve ever known and haven’t seen in a while, anywhere in the world other than where I usually live? Would I still be surprised? Well, yes, I would.
When you start looking at it that way, it’s fairly likely that you’re going to run into people somewhere, at some time. You just don’t know when. And of course when it happens, you’re surprised, but you’d be equally surprised wherever it happened and whoever it involved, usually.
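Levitin’s point – that some coincidence of this kind is likely even when any particular one is rare – can be sketched with toy numbers. All the figures below are my own illustrative assumptions, not from the book:

```python
# Toy model with made-up numbers: suppose on any given travel day the chance
# of crossing paths with one particular old acquaintance is 1 in 100,000,
# but you know 500 such people and spend 20 days a year abroad over 30 years.
p_single = 1 / 100_000
opportunities = 500 * 20 * 30          # person-day chances for a coincidence
p_at_least_one = 1 - (1 - p_single) ** opportunities
print(round(p_at_least_one, 2))        # roughly 0.95: some coincidence is near-certain
```

The individual event stays astronomically unlikely; it’s the sheer number of opportunities that makes “running into somebody, somewhere, sometime” almost inevitable.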
E.G.: In the process of writing this book, have you learned anything about the ways in which you, yourself, may have been duped by misinformation in the past?
D.J.L.: Yes. I never really looked into what the word “homeopathic” meant. And then I saw a talk by James Randi, the magician and founder of the James Randi Educational Foundation. He explained what it means for something to be a “homeopathic remedy,” such as you’d find at a natural food store or even a lot of legitimate drug stores. What it means is that the substance that’s supposed to act on your body has been diluted so many times – hundreds of thousands of times – that there’s no trace of it left anymore, and the water it’s delivered in is supposed to have some sort of cellular or molecular memory of the thing that had at one point been in there. This is just the most blatant pseudoscience and nonsense. Within the homeopathic community, the belief is that the more you dilute the substance, the stronger it gets. Randi used to joke that the way to overdose on a homeopathic remedy is to take none of it – because that’s the strongest possible formulation.
E.G.: You talk about statistics in your book. Can you give me some examples of ways in which statistics can be very unreliable?
D.J.L.: The obvious one is averages. We don’t think of them this way, but averages are distortions – you’re taking a whole bunch of numbers and trying to narrow them down to a single number, and all manner of things can go wrong there.
This happens even to some reputable and well-meaning and rigorous sources. The Pew Research Center, which ranks very high in my book as a rigorous research source, recently reported something that doesn’t make a lot of sense.
They said that Americans read an average – and by that they meant the mean – of 12 books per year. That I’m willing to believe, but if you think about it, what does that really tell you?
There are two parts to the Pew story. The first part is: if the mean is 12, it’s possible that you have a whole bunch of people reading 24 books per year and an equal number of people reading zero books, and that would give you a mean of 12 – but the number 12 isn’t a value that represents what any one person has done. There’s nobody reading 12; they’re either reading 24 or zero. So you’ve got this huge spread, and the average has distorted that.
The second problem I have with the Pew report is that it says the typical American has read four books, and that that’s the median. Well, the median, in statistical language, is the value in the center of the distribution: half your values are larger and half are smaller. So in the example I just gave, suppose there’s a whole bunch of people who read 24 books per year, another bunch who read zero, and then one person who read 12. That person is the median, because half the people are above him or her and the other half are below. But now the median isn’t a number that really represents the typical person at all – it’s just one person out of a whole bunch. So we have to be careful when we look at these averages.
The next time you read an average, you should ask yourself: Is it possible that this is distorting the underlying picture? And which average are they using – the mean, the median, or the mode, the value that appears most often, which is sometimes very far from the other two?
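The spread Levitin describes is easy to reproduce. Assuming, for illustration, ten readers at 24 books, ten at zero, and one at 12 (my concrete counts, matching his scenario):

```python
from statistics import mean, median, mode

books = [24] * 10 + [0] * 10 + [12]  # huge spread: heavy readers, non-readers, one in between
print(mean(books))    # 12 -- yet only one person actually reads 12 books
print(median(books))  # 12 -- the lone middle value, not a typical reader
print(mode(books))    # 24 on Python 3.8+; with a 10-10 tie, mode() returns the first value seen
```

All three “averages” summarize the same data, and none of them describes what most people in the group actually do – which is exactly the distortion Levitin is warning about.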
E.G.: During this presidential election cycle, is there a particular fallacy being thrown around by either of the candidates that you think is exceptionally deceptive?
D.J.L.: I’m not a political scientist, and I don’t want to go outside my area of expertise. I really hate it when people who are trained in one thing think that extends to expertise in another domain, and I talk about that in the book – I think that’s a problem. But what I will say, as an ordinary citizen – not speaking as a neuroscientist or as the author of a book on critical thinking – my impression is that an interesting feature of this election is that a whole lot of people decided some time ago who they wanted to support, and didn’t want to hear anything negative about their candidate, and didn’t want to hear anything positive about the opponent, and that’s created an election in which there’s not a lot of conversation across the aisle, and not a lot of open-mindedness, and not a lot of people willing to change their minds.
The bedrock of scientific reasoning is what we call Bayesian reasoning, after the Rev. Thomas Bayes. That’s just a fancy way of saying that when new information comes in, you update your understanding of the world.
Science is based on that, medicine is based on that; the reason we redesign freeways and airplanes and cancer drugs is because new information comes in. There are a whole lot of us in this country who aren’t willing to look at new information, and I think that’s a problem.
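Bayes’ rule itself is one line of arithmetic. A standard textbook illustration (the numbers are mine, not from the interview): updating a 1 percent prior belief after a positive test with a 90 percent hit rate and a 5 percent false-positive rate:

```python
# Illustrative Bayes update with assumed numbers: a condition with 1% prevalence,
# a test that catches 90% of true cases and false-alarms on 5% of healthy ones.
prior = 0.01
p_pos_given_cond = 0.90
p_pos_given_no_cond = 0.05

# Bayes' rule: P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
evidence = p_pos_given_cond * prior + p_pos_given_no_cond * (1 - prior)
posterior = p_pos_given_cond * prior / evidence
print(round(posterior, 3))  # 0.154 -- new evidence moves a 1% belief to about 15%
```

The point is the mechanism, not the numbers: the belief after the evidence is neither the old 1 percent nor a leap to certainty, but a principled revision in between.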