I have already talked about how our minds use heuristics to solve problems when faced with cognitive limitations. These heuristics are shortcuts, and shortcuts are bound to be wrong sometimes. The 1970s marked the beginning of the heuristics and biases tradition, which took over the study of human decision-making.
Today, cognitive biases are being questioned by many researchers. Some believe they are a vague, overrated concept that unfairly dominates the field of reasoning studies. To explain this position, I will take a look at confirmation bias.
The notion of confirmation bias is simple to understand. In 1960, cognitive psychologist P. C. Wason coined the term. He used it to describe how, when we search for information, we tend to favor information that confirms our preconceptions, hypotheses or personal beliefs. It is also known as myside bias and affects inductive reasoning (the type of reasoning in which the premises lead to a probable conclusion, not a certain one).
The concept of confirmation bias has since expanded: it is supposed to affect not only the search for information but also its interpretation and recall. In fact, confirmation bias has become an umbrella term that covers any instance in which beliefs or preconceptions influence information processing.
Confirmation bias seems unrelated to intelligence, and it has been theorized that confidence levels might be at play. Individuals with low confidence might tend to avoid information that contradicts their beliefs, while those with higher confidence tend to seek out antagonistic opinions to sharpen their arguments.
Examples of confirmation bias
Given the preceding definition, the instances in which we fall into this cognitive bias are endless. From politics to belief in pseudoscience, to rushed medical diagnoses, to police investigations, to almost any everyday social interaction: confirmation bias could be affecting us constantly.
It is quite common to disregard any evidence that doesn't support our political point of view by calling it flawed or inaccurate. It is just as common for scientific innovations or anomalous experimental results to be heavily criticized by the scientific community.
Someone who believes in paranormal psychic activity is predisposed to remember vividly every time that event A was followed by event B. This same person will ignore or even forget all those instances in which event A wasn’t followed by event B.
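The arithmetic behind this selective memory can be sketched in a short simulation. All the numbers here are invented for illustration: I assume the two events are actually unrelated (B follows A half the time) and that the biased observer recalls only 20% of the disconfirming cases.

```python
import random

random.seed(42)

# 1,000 occurrences of event A; B follows purely by chance (p = 0.5),
# so there is no real connection between the two events.
followed_by_b = [random.random() < 0.5 for _ in range(1000)]

hits = sum(followed_by_b)           # A was followed by B
misses = len(followed_by_b) - hits  # A was not followed by B

# An unbiased observer tallies every case.
unbiased_rate = hits / (hits + misses)

# A biased observer vividly remembers every hit but recalls only
# 20% of the misses (an assumed forgetting rate for illustration).
recalled_misses = sum(1 for b in followed_by_b
                      if not b and random.random() < 0.2)
biased_rate = hits / (hits + recalled_misses)

print(f"actual A->B rate:     {unbiased_rate:.2f}")
print(f"remembered A->B rate: {biased_rate:.2f}")
```

Running this, the actual rate hovers around 0.50 while the "remembered" rate climbs above 0.80: forgetting the misses alone is enough to turn a coin flip into apparently compelling evidence of a paranormal connection.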
We can find countless accounts of myside bias. But the question is: are they the result of a systematic cognitive bias that affects human reasoning equally, or are they something else?
The problem with cognitive biases
As I said before, cognitive biases are not universal. They depend on individual differences such as intelligence, previous training, or certain personality traits. Training on the experimental tasks used in cognitive bias studies, or simply knowing that cognitive biases exist, makes us less likely to fall prey to them.
Moreover, most studies that support their existence rely on highly subjective tasks. In the case of confirmation bias, for instance, there is little evidence that it affects estimates of numerical results. This is why I think we should be using the concept of motivated reasoning instead of confirmation bias.
I find it problematic that the effect of cognitive biases cannot be identified across a wide range of contexts, seems task-specific, and appears restricted to certain people. As a result, we cannot affirm that cognitive biases are systematic patterns of irrationality that govern human judgment.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129–140.
Stanovich, K. E., & West, R. F. (2007). Natural myside bias is independent of cognitive ability. Thinking and Reasoning, 13(3), 225–247.
Stanovich, K. E., West, R. F., & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22(4), 259–264.