During the 2016 Primaries, when I expressed my support for a particular candidate, I was told to do my research. The woman I was speaking with didn’t ask why I supported this candidate, but automatically assumed that the only possible explanation could be ignorance, and that as soon as I did my research there was no way I would support this candidate anymore. I have seen people, when presented with a litany of evidence in support of their opponent’s claim, dismiss the evidence with those same three little words — do your research.
“Do your research” has become the most popular (non)argument in debate. Any time someone doesn’t like what someone else is saying, they can bow out with that one demeaning rebuttal. It says so many things without saying them. You clearly don’t know what you’re talking about. You haven’t even bothered to educate yourself before entering into this conversation. You don’t know as much as I do, so come back when you do.
The first and most obvious reason we need to drop this argument from our repertoire is that it assumes a lot and can make us look foolish. If the other person is well-informed, or has just presented you with evidence in support of their claim, telling them to do their research doesn’t even make sense.
The second reason is that people don’t know how to do their research. On the face of it, the unlimited access to information the Internet provides us with seems like a good thing. You can find information on literally anything. The problem is, people don’t know what to do with the information that’s out there.
Take studies for instance.
“Studies Show…” or “A New Study Indicates…” are lead-ins to what we assume will be intelligent, scientific information. But people don’t seem to understand how studies actually work. The findings of an individual study don’t necessarily mean anything until they’re replicated; studies that show correlation do not necessarily prove causation; and many studies are simply flawed. All of these are important things to bear in mind.
“A New Study Shows That…” makes for an interesting headline, but if the article (a summary of the study that the publication thinks its readers will find interesting) is the beginning and end of your knowledge on the subject, you are no expert. The excitement and sensationalism of the possibilities the study raises make for good publication fodder, but if you don’t follow up to see what actually comes of the study, don’t assume you know what you’re talking about. That new study might receive critical reviews from others in the field, or its results may fail to replicate.
If you read about a study that piques your interest, dig deeper to see whether further studies have supported the original finding. If it’s a new study, watch for further developments. Just remember this one critical point: individual studies have “proven” (or “disproven”) just about anything you can think of, so while a single study’s results may be interesting, they are not proof of anything just yet.
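To make the replication point concrete, here is a minimal simulation (the setup and numbers are my own illustration, not from any study mentioned in this article). If “significant” means a result that would occur by chance only about 5% of the time, then running many studies of an effect that doesn’t exist will still produce “positive” findings in roughly 5% of them — which is exactly why a single unreplicated study proves so little.

```python
import random

random.seed(42)

def fake_study(n=100):
    """Run one 'study' comparing two groups drawn from the SAME fair coin.

    There is no real effect: both groups are identical by construction.
    We call the result 'significant' if the observed proportions differ
    by more than ~2 standard errors (the usual p < 0.05 style cutoff).
    """
    a = sum(random.random() < 0.5 for _ in range(n)) / n
    b = sum(random.random() < 0.5 for _ in range(n)) / n
    # standard error of a difference of two proportions, with p = 0.5
    se = (2 * 0.5 * 0.5 / n) ** 0.5
    return abs(a - b) > 1.96 * se

studies = 2000
false_positives = sum(fake_study() for _ in range(studies))
rate = false_positives / studies
print(f"{false_positives} of {studies} no-effect studies looked 'significant' (~{rate:.1%})")
```

Run it and you should see something close to a 5% hit rate: dozens of “significant” findings generated from pure noise. A headline writer only needs one of them.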
Correlation vs. Causation
As I mentioned above, studies often show us correlation between things, but causation is much harder to prove. This does not seem to be a concept that many people are familiar with (I wasn’t for a long time), so allow me to elaborate.
I once read a list of reasons to be happier, the author telling us that “studies show” that people who are happier have better marriages, better work relationships, less stress, etc. The author did not cite which studies these were, so I can only assume, but I’m going to go out on a limb and say that she mistook the findings of correlation for causation.
In this author’s mind, the fact that happier people reaped this list of benefits meant that being happy caused these wonderful side effects. This may have been true, but without some very specific controls in place, it’s very likely that all we know is that being happy is correlated with the other items on the list. The difference? Any of the other items on the list (or some combination of them, or something not even on the list) could have equally been the cause.
For example, being happy might make you a better spouse and co-worker, and make you more Zen in the face of stress. Or being a Zen person who is not easily stressed might make you a better spouse and co-worker, and an all-around happier person. Or being in a healthy, stable marriage might make you happy, Zen, and easy to get along with at work. Or there could be some genetic disposition that causes its lucky bearers to be happy and Zen and easy to get along with, none of these things actually causing the others.
The point is that showing two or more things are connected in some provable way does not necessarily mean we understand that connection, and in fact we often don’t. The next time you read about a study showing a link between two things, watch for this correlation-versus-causation distinction, and you may be shocked at how little we truly understand about how things are connected.
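The Zen scenario above can be simulated in a few lines (a toy model of my own, not drawn from any real study). Give a population a hidden “Zen disposition,” let that disposition — and nothing else — nudge both happiness and marriage quality, and the two traits come out strongly correlated even though neither causes the other:

```python
import random

random.seed(7)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

n = 1000
# A hidden 'Zen disposition' drives BOTH traits; neither causes the other.
zen = [random.gauss(0, 1) for _ in range(n)]
happiness = [z + random.gauss(0, 1) for z in zen]
marriage_quality = [z + random.gauss(0, 1) for z in zen]

r = pearson(happiness, marriage_quality)
print(f"correlation between happiness and marriage quality: r = {r:.2f}")
```

A study that measured only happiness and marriage quality would find a solid correlation here and might be tempted to conclude that one causes the other — but by construction, the only causal arrow in this little world runs from the unmeasured third variable to both.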
Don’t Believe Everything That’s On the Internet
I wish I didn’t even have to say this, but this still seems to be a point that many people don’t grasp. Just because something is on the Internet doesn’t mean it’s true. The people I referenced earlier who turn up their noses at whatever proof you put in front of them will often point you to the obscure “truth-telling” websites that their own research brought them to. The fact that the support for their argument is published online bolsters their confidence, but let me be clear — it shouldn’t.
Anyone can build a website, keep a blog, or otherwise put whatever opinion they want into cyberspace. (Case in point: this article. No one reviewed or approved it before I posted it; I just wrote it and put it out there. Bam! Of course, if I said anything beyond the pale of what Medium deems acceptable it could always be reported, but if this were my own website or online publication, even that gateway would not exist.) So how do you know what to believe online? Here are a few tips.
1. Go to the source. Remember that sensationalized story I was talking about earlier that starts with “A new study shows…”? Well, look up the study. I have yet to encounter a legitimate study that is not published online, where you can read all about who conducted the study, when and where, what controls were used, how the study was carried out, etc. If, as in the case of the list of reasons to be happy, there is no source cited, take that as a red flag.
2. Account for context. If you’re getting your information from a website or publication that’s obscure or that you’ve never heard of, remember that you don’t know its standards for vetting information. It may have loose standards for what sort of verification it requires before publishing, or none at all. And if it’s a website with a mission (say, all-natural healing), know that whatever information you find there is going to carry a strong bias, and you certainly can’t expect to get the whole story.
3. Read between the lines. If an article is using sensational language (devastating, shocking, underhanded) understand that the author of this article is trying to cause you to feel a certain way. It’s not that there’s necessarily anything wrong with that, as long as you as the reader understand that there is a bias that exists on the author’s part — this information is not being presented in a neutral way, and so there may be another side to the story that’s worth checking out.
4. Remember confirmation bias. Studies show (see what I did there?) that we have a tendency to look for confirmation of what we believe instead of looking for evidence to the contrary. It’s a natural human tendency and nothing to be embarrassed about, but it’s certainly something to keep in mind. You can find junk science and anecdotal evidence to support whatever you want, so don’t let yourself get too excited at the first sign of confirmation.
These are just a few tips to help you elevate “do your research” from a petty argument into a useful tool. Have other tips for making research more powerful? Share them in the comments!