Bayesian Problem-Solving: Avoiding Logical Fallacies
How do we respond to new information?
For instance, if I were to tell you that a person is timid, has a vast book collection, and reads a lot, and then ask you whether this person is more likely to be a teacher or a librarian, which would you pick?
Most, I gather, would pick librarian; the characteristics given seem better aligned with that occupation. However, statistically speaking, there are many more teachers than librarians. In fact, the American Library Association cites 166,164 librarians in the United States as of 2014, and the National Center for Education Statistics reports 3.7 million teachers in the United States as of 2020: over 20 teachers for each librarian. Therefore, even if only 5% of teachers fit this hypothetical description, while 100% of librarians fit it, the person is still roughly as likely to be a teacher as a librarian.
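The arithmetic behind this claim can be checked directly. The sketch below uses the counts cited above and the hypothetical 5% and 100% figures from the text:

```python
# Base rates cited above
teachers = 3_700_000       # NCES, 2020
librarians = 166_164       # ALA, 2014

# Hypothetical fractions of each group fitting the description
p_fit_given_teacher = 0.05
p_fit_given_librarian = 1.00

# Expected number of people fitting the description in each group
fitting_teachers = teachers * p_fit_given_teacher        # 185,000
fitting_librarians = librarians * p_fit_given_librarian  # 166,164

# Probability the person is a librarian, given that they fit the description
p_librarian = fitting_librarians / (fitting_librarians + fitting_teachers)
print(round(p_librarian, 3))  # → 0.473
```

Even with the deck stacked entirely in the librarian's favour, the sheer number of teachers drags the answer back toward a coin flip.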
The fallacy demonstrated here is a failure to properly draw on previous experience; when taking in new information, we must realize that this information does not exist in a vacuum but sits against the backdrop of reality. As such, in order to come to rational conclusions, it is our responsibility to integrate the input information with our previous knowledge and experience. This is nothing new; it is perhaps jargon for applying the common adage, “don’t judge a book by its cover”, to problem-solving.
It is also a natural implication of Bayes’ Theorem, first formalized in probability theory by the 18th-century English mathematician and philosopher Thomas Bayes (d. 1761). I will spare you the mathematics, but the theorem simply describes the probability of an event given prior knowledge of relevant conditions. It is one of the most important ideas in mathematics and has widespread applications, such as computing the accuracy and precision (and thus efficacy) of medical testing.
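The medical-testing application makes the base-rate effect concrete. The figures below are purely hypothetical, chosen for illustration:

```python
# Bayes' Theorem:
#   P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
# All numbers below are hypothetical, for illustration only.
prevalence = 0.01    # 1% of the population has the condition
sensitivity = 0.95   # P(positive | disease)
specificity = 0.95   # P(negative | no disease)

# Total probability of a positive result: true positives + false positives
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Probability of actually having the condition, given a positive test
p_disease_given_positive = sensitivity * prevalence / p_positive
print(round(p_disease_given_positive, 3))  # → 0.161
```

Even a test that is 95% accurate in both directions yields a result that is wrong most of the time, because the prior (the 1% prevalence) dominates. Neglecting that prior is exactly the teacher/librarian mistake in a lab coat.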
We can hardly be blamed for this built-in logical fallacy. We’ve grown up with the idea that every piece of information required to solve a word problem can be found within the problem itself. Such a teaching style naturally results in a struggle to synthesize previous information with the information at hand. We have thus been conditioned not to reference prior knowledge.
The world does not help us either — most information in the real world is scattered. It is a rarity to find all the resources one needs to whip up a solution laid out in one neat place, and one can often get away with a reasonably accurate solution even with little to draw from. In the interest of time, it is thus tempting to make do with the information at hand; seeking out additional information can take great patience and scrutiny. However, as we’ve seen, without such an investigation one may come to incorrect conclusions.
There exists, in some circles, staunch opposition to Bayesian interpretation. This largely arises out of the recognition that we may be applying Bayesian interpretation automatically, and more often than we’d care to admit, albeit substituting preconceived notions for the reservoir of knowledge that we should be drawing from. Experience and knowledge are built as we construct a mental model of the world, which is, unfortunately, the same way in which preconceived notions are built. We may thus confuse such notions with knowledge or experience when we look within. To refer back to the earlier example, our mental models of the typical librarian and the typical teacher are composed of our limited interactions with a handful of individuals of each occupation and stereotypical accounts of each occupation through media. Thus, when we think of a librarian, we think timid, bibliophile, and so on, and these are not necessarily the same attributes we assign to a teacher.
In recognition of this, the modern education system has conjured up situations in which we are in fact explicitly taught to judge things at face value; to not draw on previous knowledge and experience. Take, for instance, the case of a professor grading a student, which generalizes readily to situations of supposedly objective judgement. Professors are trained not to let their preconceived notions alter their judgement, because these notions give rise to individual bias, and bias clouds judgement. In doing so, they idealize a mythical notion of objectivity, which we intuitively know does not exist; individual biases invariably creep into judgements.
Thus, even though subjective judgements are ultimately the result of imperfect knowledge and imbalanced world views, they are all we have got. As such, individual bias is a necessary evil that comes with Bayesian interpretation, and Bayesian interpretation remains the best system at our disposal for coming to sound conclusions.
As a caveat, however, we must recognize that our subjective appraisals can never be separated from our identity and belief system. Nor should they be. This is one of the mechanisms by which we generate unique ideas and separate ourselves from the rest of the animal kingdom. To attempt to separate ourselves from these past opinions is disingenuous and ultimately unfruitful; we are a product of our environment. Fortunately, our diversity of experience yields a diversity of bias, and quantity, rather than quality, can correct for it. A diverse set of professors, for instance, would, absent systematic bias, allocate a fair grade.
The implications of this problem are more far-reaching than mere problem-solving: it also leaves our beliefs fragile. New information should modify our world view, not dictate it. Otherwise, beliefs that should be strongly held, as they shape one’s identity, become arbitrary and flimsy. Whereas new information should ordinarily serve only as branches on your tree of knowledge, it may, absent critical reasoning, inadvertently be rammed in as roots of its own. This deconsecrates the foundations of your knowledge and opens you up to security bugs: whereas a dangerous idea would otherwise require a shaking of your foundations, uncritical reasoning allows such ideas to infiltrate readily.
The intellectual, then, is the one endowed with the ability to look beyond the information at hand, to the information acquired before and to other sources. He recognizes the limitations of the input information and of his own knowledge and experience, and thus does not limit himself to these sources. Finally, he averages his judgement with those of others, allowing his answer to regress to the mean.