

Philosophy | Skepticism in Health and Medical Science - Series

Part 2: Strategies to Improve Your Ability to Seek the Truth

Contributor Bio

Alex Tarnava is the CEO of Drink HRW and the primary inventor of the open-cup hydrogen tablets. Alex runs the clinical outreach program for our company, working with over a dozen universities to coordinate research. Alex has also published research of his own, which you can find on his ResearchGate. Additionally, he has been interviewed for many prominent publications, such as Entrepreneur and Forbes, and on many popular podcasts. You can find all of his interviews and articles on his media page.


A thing is not proved just because no one has ever questioned it. What has never been gone into impartially has never been properly gone into. Hence skepticism is the first step toward truth. It must be applied generally, because it is the touchstone.

- Denis Diderot

Many understand the importance of skepticism in the pursuit of truth and conclude that we should maintain unwavering skepticism toward anything new, or anything we initially view as suspect. This leads to significant differences in the skepticism applied to different ideas, schools of thought and situations. To paraphrase the above quote from Diderot, “Anything that has not been gone into impartially has never been properly gone into.” This is true both for concepts we wish to be true and for those we wish to be false. It also applies to ideas and concepts we react to instinctively, whether that initial reaction is positive or negative.

Daniel Kahneman details a heuristic he has named the “familiarity heuristic,” in which people become more accepting of a notion the more familiar they become with it, and the more often they have seen it, regardless of the actual evidence and logic behind it. This leads to many nonsensical ideas ingraining themselves into society, and it has also led to demonstrably false scientific findings being supported, and fought for, for decades after they were shown to be flawed and incorrect. A famous example is the refusal of many in the scientific community to accept Leonard Hayflick’s research on cell division, the “Hayflick Limit,” which clearly refuted the findings of the French Nobel laureate Alexis Carrel. Despite overwhelming evidence, many older academics fought tooth and nail to invalidate Hayflick’s work, and this view only began changing with the passing of the torch to the next generation of scientists, who had begun their careers learning about Hayflick’s work.

As Max Planck famously said, “Science advances one funeral at a time.” I would go even further and say that in the hard sciences (not technology), cultures with higher levels of regard for their elders, mentors and supervisors cannot profoundly change consensus for two full generations. New legions of scientists feel obligated to defend their mentors’ positions, at least publicly, and it is not until they train yet another generation of scientists that old, wrong ideas can truly die.

Jumping to Conclusions and Defending Initial Positions

We tend to process situations and ideas with our emotions first, and then use our cognitive capacity to justify our emotional response. We also tend to feel much better when we are right than when we are wrong, meaning our initial, potentially incorrect, emotional response has a considerable amount riding on it being correct, in terms of protecting our ego and emotional state. Further, we tend to feel more strongly and confidently about subjects of which we have only a working understanding. This is called the Dunning-Kruger effect (explained below), and it can lead to quick and poorly thought-out positions, even from highly educated individuals with expertise in a similar field. Ironically, related but inexact expertise can lead to more dangerously wrong and overconfident initial conclusions, defended all the more fervently, because the expert’s knowledge base is substantial yet lacking for the particular discussion.

Many tend to think of the Dunning-Kruger effect as applicable only to idiots. This could not be further from the truth. It is a very interesting phenomenon showing that experts tend to be incredibly cautious about subjects they are highly educated on. Typically, when discussing new findings, I find that experts respond that they are not sure and need more time to think. However, when discussing findings with other highly intelligent and educated academics, physicians, etc., with a high level of general knowledge in the area but without true expertise in the specific area, opinions tend to be put out quickly, confidently, even unwaveringly. Academics reading this can likely attest that at a very specialized conference or symposium, speakers are hesitant to answer any questions that are not exactly about their research, and if they do, there will be caveats aplenty. This is because they know not only that they are not an expert on this question, but that there will likely be one, or many, in the audience better suited to answer it.

This is in stark contrast to the quick and firm opinions many give when questioned about a field they are unfamiliar with but have knowledge adjacent to, such as questions about a specific molecule, out of context, posed to a medical doctor or a biochemist who has never actually researched or familiarized themselves with that particular molecule or the research on it. Unfortunately, the media relies on “go to” expert contacts to comment on many topics outside their fields, and these types of perceived experts tend to give better sound bites. A strong, firm opinion makes for better reading than a cautious and fair scientific assessment. Harry Truman once quipped, “Will someone give me a one-handed economist,” a commentary on economists’ typically cautious “On the other hand…” statements; the media likewise wants “one-handed experts” who come out with strong statements. Often, they pit two purported experts against each other to give two dichotomous views.

False Equivalence in Experts

False equivalencies are always dangerous. A false equivalency occurs when one position is weighted equally to another, regardless of the order of magnitude of the events or ideas being compared. For example, consider the statement, “The Deepwater Horizon oil spill was no more catastrophic than when your neighbor spilled a jug of motor oil on their driveway. Both were accidental spills involving oil.” When boiled down to an example this ridiculous, the flaw is difficult to deny. Unfortunately, most false equivalencies are far more subtle than this.

False equivalencies are particularly dangerous when they come to us as two conflicting opinions, both supposedly from experts. This can be seen when the opinions of chiropractors citing supposed anecdotes about vaccine injuries are given equal weight to those of prominent immunologists and virologists citing hundreds of thousands of replicated studies, or when a naturopath citing personal philosophy is given equal footing with a plant or molecular geneticist discussing thorough research conducted over decades. I will dive into this in far more detail in the future, but the former have even started to believe in a sort of molecular homeopathy, in which isolated molecules, containing no genes, can somehow be imbued with the supposed negative qualities of the source material (with no credible evidence to support said negative qualities). This is mind-numbing; yet because of the “expertise” the media grants them, the lay population is led to believe they have valid points. It is worth noting that it is infinitely easier to scare someone with no evidence than to calm them with actual evidence.

This goes further into humanity’s predisposition to weigh experience more heavily than evidence. Often, an anecdote from a friend’s relative, or from a celebrity, is weighted equal to, or above, the combined evidence and advice of all the experts in the world. This is often not just a false equivalency, but a tremendously dangerous false superiority. This is not to say that anecdotes are worthless. I wrote an open letter regarding testimonials about a year ago. Testimonial evidence can lead to important research. It can give hints and clues about what may be going on, but until it is studied in a well-controlled manner, conclusions should not and cannot be drawn. Unfortunately, some amongst us believe testimonials are all we need, while others, many researchers included, believe they are completely worthless and not even deserving of further attention. The latter position can become a tool to discredit scientists if testimonials end up leading to something meaningful. If a testimonial is common, we should ask ourselves the following important questions: 1) Has it ever been studied? 2) If it was studied and the results were good, was the study properly controlled, blinded and replicated? 3) If the results were bad, did the protocols used involve practices and circumstances similar to those where benefits are anecdotally observed? This last point is critical, and many skeptics fail to consider it. Differences, even nuances, between the real-world use that finds benefit and the study design can lead to very different results. Was the dose/duration appropriate? Was the experiment designed to address a stress or condition similar to the one in which the population has anecdotally seen results? Unfortunately, even one or two early studies with negative findings can completely nerf a budding idea. We need to ask these questions, always. Of the media, of researchers, and of those who aim to sell us on ideas. Confrontation can lead either to growth or to a hardening of positions. Seek to offer constructive criticism while maintaining respect. Hard questions, asked tactfully, are far more likely to yield a positive outcome, both for the person asking the question and for the one being questioned.

When the media utilizes divergent experts, perhaps under a false equivalency, as an aspiring thinker, you must assess whether either “expert” is qualified to make the statement. Is it a black vs. white argument, or is it complex? Are there valid points on either or both sides, and have those points neglected any key pieces of information and context, perhaps due to these individuals’ lack of knowledge on the subject (the Dunning-Kruger effect)? Finally, what are the real experts saying? By real experts, of course, I mean the scientists and doctors that specialize in the exact area.
We cannot ever presume that what is reported to us is true. Not only do all media companies have competing interests regarding political sway, they also rely on audience data about which positions their viewers, readers or listeners respond to. As such, tone and conclusion are often crafted with that in mind. Additionally, the current trend in journalism demands more content, delivered faster and more sensationalized. This leads to lower-quality journalism and research. Often, writers will seek out what has been done by their peers and regurgitate it. This is incredibly dangerous, as a seminal piece in the news, if factually incorrect, can lead to numerous articles with the same tone and conclusions. The larger the mound of “bad reporting” grows, the harder it is to insert truth. PR specialists fight hard to make media coverage positive for their clients, and “negative news” sells more, meaning some reporting is overwhelmingly, unfairly positive, while much is overwhelmingly, unfairly negative. This is not balance, but a fundamental lack of calibration across most topics of conversation.
Finally, in regard to false equivalencies, there is often one side that is far more ‘correct’ than the other. As an aspiring thinker, it is critical not to disregard every position one side makes just because they are wrong most of the time. Our opposition makes us stronger, but all too often, a thorough walloping of an intellectual or ideological adversary leads to a weakening of one’s ability to seek the truth. If side A is correct in 90% of a debate, with side B making clear and demonstrable errors on the 90% where they are wrong, it is easy to dismiss the remaining 10%. Each individual point and argument should be assessed on its own merits, regardless of the previous outcomes of an individual’s arguments.
We cannot presume truth, or ridiculousness, based solely on an individual’s past track record. This is a double-edged sword. If we discount what an outlier says simply because that individual is usually wrong, we risk empowering them and their followers to believe there is a conspiracy to discredit their work and thoughts. If this is demonstrated in 10% of the arguments, many, including the outlier, may begin to believe the other 90% was also correct and that there is simply a conspiracy against them. Simultaneously, if the false conclusions of an otherwise highly accurate expert are ignored, or even supported based on their reputation, a danger emerges of creating an overconfident expert who loses the ability to properly self-assess their positions and work through criticisms. I have previously discussed this in a past series regarding health “experts” on both sides of the fence, and it is worth noting again:

“Canadian-American political science writer and University of Pennsylvania Professor Philip Tetlock, who has studied the positions and predictions of experts in the social sciences quite extensively, noted that experts in areas such as political science and economics are no better than attentive readers of the New York Times at following and predicting emerging situations. He goes on to argue that the more famous the “forecaster,” the more flamboyant the forecast. He writes that ‘Experts in demand were more overconfident than their colleagues who eked out existences far from the limelight.’”

Critical vs. Analytical Thinking

Without careful and thoughtful analysis, criticism is not productive and it can be harmful. In fact, critical thinking alone, which relies on taking outside knowledge into account to evaluate a situation, can be a recipe for succumbing to the Dunning-Kruger Effect. Analytical thought is needed in order to break components down and evaluate each specific point, idea, or piece of purported evidence. Conversely, utilizing analytical thinking without critical thinking is a sure-fire path towards confirmation bias. If one breaks down each point, without a healthy dose of skepticism when evaluating the information, false conclusions can be drawn which pervert the overall conclusions. In the pursuit of truth, both critical AND analytical thinking need to be given equal footing.

Truth seeking should be indifferent to any specific outcome. The goal of critical thought is not to prove something incorrect; it is to remove any assumptions while evaluating the evidence at hand. Evidence must be deemed neither true nor false until properly assessed, a sort of Schrödinger’s cat of truth, in that the assessor must proceed under the assumption that the idea or evidence being assessed is simultaneously true and false. Each addition of new evidence must be further analyzed, again removing any supposition of expected results. In fact, once all evidence is assessed, the only reasonable conclusions are “likely true” or “likely false,” with the caveat that this can change with new evidence. In science, ideas cannot be proven true; they can only fail to be falsified, meaning they are “probably true.” When there is monumental empirical evidence of an inability to falsify, with supporting evidence fitting the hypothesis into other reasonably known outcomes, a hypothesis can move to an accepted theory or law. New evidence can always dismantle this, provided it is of sufficient quality. The more monumental an overturn of established knowledge, the higher the quality of the opposing evidence must be. Likewise, ideas that have been falsified may be shown to be correct, at least in very specific situations, based on new evidence. This new evidence contradicting previous falsifications must also be sufficient in quality and replicability.

It is a natural and understandable tendency to immediately reject evidence which contradicts established knowledge. This learned heuristic affords mental capacity to be utilized in matters viewed as more urgent or important. This heuristic must be addressed and subdued, as it is critical to give new and contradictory initial evidence an emotionally and intellectually neutral analysis, assessing it as any other piece of evidence. Once concluded as an individual piece of work or thought using critical and analytical thinking skills, the new information can then be weighed in relation to, and in accordance with, the entire body of evidence in order to formulate an overall conclusion.

Many dedicate themselves to high-paced, hit-and-run “debunking,” a practice that, when done in an emotionally charged manner and without careful analysis, is both intellectually dangerous and dishonest. Effective skepticism through the lens of truth seeking is not about creating a playbook of rebuttals and traps in order to ridicule and discredit your adversary. Rather, it is questioning everything in the pursuit of truth. Too many skeptics allow their emotions, both in support of their friends and in detestation of their targets (often before understanding the opponents’ position), to predetermine their positions.

Confirmation Bias

“Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or strengthens one's prior personal beliefs or hypotheses.”

Confirmation bias is perhaps the most dangerous cognitive bias within humanity, and it ravages our society. Confirmation bias leads to the “echo chambers” we see form, in which we follow and converse only with those we agree with, which amplifies and intensifies positions as time goes on. We compile evidence for our point of view, weighing it strongly and assessing it with little to no criticism, and we ignore, or attempt to refute with the help of our like-minded allies, the positions of those who contradict us. These echo chambers lead to the dehumanization of the opposition. We see this in politics, and we see it in health communication. When all of the information we gather and assess acts as a form of confirmation bias, and our social circle, the media or influencers we follow, etc., are all in agreement, it allows for the conclusion that those opposing us are incredibly stupid, evil, corrupt, or dishonest.

Ideally, we should apply equal amounts of skepticism to ideas, beliefs and hypotheses we agree with as to those we oppose. In reality, it is impossible to remove all emotion, and so we need to compensate by becoming more critical of that with which we initially agree than of that with which we initially disagree. Just as we tend to weigh our own work and losses more heavily than those of others, a key reason both parties in a transaction often feel they “lost” and were treated unfairly, we tend to be more critical of positions that oppose our bias and to weigh evidence supporting our case far more heavily. Good negotiators and successful business people learn one of two approaches: to bully their way to what they want, or to find fair compromises in which everyone wins. It would be great to say that the latter always sees more long-term success, but we know this to be untrue. As a society, we set up systems to hold the former accountable for their bullying and greed, both personally and morally. Likewise, the fairness of an opinion or position is not necessarily an indicator of its success. Just as we try to hold unfair business practices accountable, we need to hold unfair assessments of evidence for personal gain accountable. Many among us are adamant we could never bully to win in business.

Why, then, do we allow ourselves to use confirmation bias and echo chambers to bully and “win” in discussions regarding thought and truth? Do we value monetary gain more than truth? Perhaps, but more likely, monetary gain is more tangible, whereas affronts to the truth are more abstract. Hold the opinions of those you like and trust accountable, dolloping out healthy servings of criticism and analysis. Follow positions that differ from and contradict your own. Above all else, hold your own thoughts and feelings on a subject accountable, spending time each day to think about them in detail, searching for flaws. This self-analysis is necessary in the pursuit of truth, both in finding the truths around us and the truths within ourselves.

Adversarial Allies

We need sounding boards: friends and colleagues with high levels of critical and analytical thinking skill to bounce thoughts off of. Look for people who have strong opinions, but opinions that do not fit into any “mold,” and ensure you are not mistaking agreement with their positions for actual high-level critical and analytical thinking. If you do not have friends like this, or in addition to them if you do, use adversaries as sounding boards. Do not present your emerging thoughts only to those who already believe what your thoughts are expressing; this is how you create an echo chamber. Look for those who oppose what you are saying and hear them out. Pose inquisitive questions rather than rebuttals and retorts. Find out why they think the way they do and what evidence they have to support it, and reflect upon how that may contradict your own views. Seek to understand your opposition so that you can understand yourself. Sometimes, in understanding the opposition, our positions change. Other times, it allows us to “steel man” their argument, a phrase popularized by Sam Harris as the opposite of a “straw man” argument, in which we refute a position not actually held or stated by the adversary, or one so incomplete as to be unfair.

Language, Social Stature, Physical Stature

One heuristic thoroughly entrenched within us is the proclivity to unfairly assess an idea or position, whether positively or negatively, based on determinations other than the idea itself. We tend to unfairly malign ideas expressed in language we deem improper or lower class, regardless of the inherent truth of the message. Conversely, we are often dazzled by high-level displays of linguistic competence, and weigh the words as unfairly correct, without assessing the inherent truth of the statements. We need to strive to find the meaning of the words, disregarding the exact language utilized. Do certain words trigger certain emotions within us? Do we have an inherent bias to trust a certain accent, or to distrust another? Personally, I have experienced dislike for individuals on TV for no apparent reason. Perhaps it is the shape of their face, or the tone of their voice; I have also had an immediate impulse to like others for no reason. This is human nature. A good practice is to rewind and transcribe the words that were said. Wait until your emotions, however slight, have calmed, and carefully read the words that came out of the personality’s mouth. Do you still disagree or agree? By reframing in this way, it is possible to eliminate an initial emotion attached to a person based on voice, language, or physical appearance. It takes practice. An additional tool is to insert a synonym for a “trigger” word into the text to see if your emotions regarding the position change.

We tend to allow these biases to form regarding social and physical stature as well. We often conflate physical appearance (beauty, height, stature, attire), social upbringing and career position with the truth of a statement or idea. Clearly, physical features and dress have nothing to do with an individual’s intelligence, thoughtfulness or qualifications, but we expect certain stereotypical aspects nevertheless, and when we are surprised in this regard, we tend to become more critical. Is a lawyer’s advice any less valid if you run into them at the beach, where they are wearing shorts and flip flops, as opposed to sitting with them in their office while they are in a suit and tie? Likewise, we tend to take the opinion of someone in a field we deem “respectable” as more accurate or trustworthy than an opinion from a field we view as less successful, even if the opinion concerns knowledge or information that has no relevance to either career. This is not to say that someone without proper credentials cannot have a high level of knowledge in an outside field; I have carved my path through this exact ability. It is to say that someone’s perceived stature has no relevance in assessing their position. Statistically, someone with an M.D. is going to be correct on matters of health vastly more often than a construction worker, as an example. This does not mean that in every single argument the individual holding the M.D. will be correct and the construction worker incorrect. There is a certain percentage of the time when the construction worker may know something the M.D. does not. Therefore, it is imperative to assess every piece of information on its merits, including the health advice of the construction worker compared to that of the medical doctor, while being careful not to weigh them equally, which would be the previously discussed false equivalency.

Being a Contrarian Is a Double-Edged Sword

By definition, to pursue change in the world we must maintain some level of contrarian thinking. As I will discuss later in this series, appealing to popularity is a dangerous and nonsensical fallacy. One thought experiment that promotes contrarianism is “The Emperor’s New Clothes.” Many readers have likely read this before, and if you haven’t, it is short and worth your time to do so. This short story is a great example of why you should always question things, no matter how popular a position is. However, it does not mean that simply because a thing has been questioned by you, or others, the contrary and “unpopular” view is correct. Thinking this way leads to conspiracy theories that have no basis in reality. Many desire greatly to be critical and skeptical thinkers, and so will accept any contrarian position, no matter how ludicrous, in order to maintain a self-illusion of intellectual superiority, whether conscious or subconscious. In this way, seeking to be a contrarian, especially by default, can lead us away from the truth.

Contrarianism is a tool that we must apply carefully, not liberally. We should apply contrarian thinking only after we have assessed the evidence critically and analytically. We must never shut ourselves off to arguments against our contrarian position, for it may have been arrived at through mistake or misunderstanding, not revelation. Our contrary views may well be correct, but if significant resistance arises toward them, the chances are they are not. This is why we must ensure we have self-reflected on our desire to maintain the position, and constantly re-evaluate it based on all new evidence.

Once our contrarian views are assessed, I recommend a piece of advice I read in Christopher Hitchens’ “Letters to a Young Contrarian” years ago. To paraphrase and summarize: wake up every morning and review a list of the issues you see as most wrong with the world, and ensure you are still angry about them and have not grown complacent. This is the only way to utilize your contrarian views for positive change and growth. If you are no longer upset about an issue, ask yourself why not. Has your position changed due to complacency, or due to new evidence? If it is based on new evidence, this is fantastic, and an indication that your mind is overcoming your own confirmation bias. Continue on.

Why You Should Strive to Improve Your Ability to Think

Critical and analytical thought in pursuit of the truth take tremendous effort and practice. Most of these strategies go against our emotional desires and our instinct for self-preservation, which includes our fragile egos. So why subject ourselves to it? In short, because our lives will be better for it. Edzard Ernst has studied the real-world benefits of critical thinking skills in reducing the number of negative life events; from his site:

Critical thinking predicts a wide range of life events. In a series of studies, conducted in the U.S. and abroad, my colleagues and I have found that critical thinkers experience fewer bad things in life. We asked people to complete an inventory of life events and take a critical thinking assessment (the Halpern Critical Thinking Assessment). The critical thinking assessment measures five components of critical thinking skills including verbal reasoning, argument analysis, hypothesis testing, probability and uncertainty, decision-making, and problem-solving. The inventory of negative life events captures different domains of life such as academic (e.g., I forgot about an exam), health (e.g., I contracted a sexually transmitted infection because I did not wear a condom), legal (e.g., I was arrested for driving under the influence), interpersonal (e.g., I cheated on my romantic partner who I had been with for over a year), financial (e.g., I have over $5,000 of credit card debt), etc. Repeatedly, we found that critical thinkers experience fewer negative life events. This is an important finding because there is plenty of evidence that critical thinking can be taught and improved.

In fact, he recently published a paper with colleagues showing that critical thinking outperformed intelligence in predicting real-life decision making. We may not be able to improve our base intelligence, but we can improve our critical and analytical thinking to supplement our work ethic and determination. Learn to think critically and analytically; your life will be better for it.

Question your allies. Question the media reports you hear, whether it comes from a source you like or not, and question the influencers, gurus and experts you follow. And, of course, always question yourself.

Next week, I will dive into the positions of Mainstream Health Skeptics.