The Dirty Dozen of Ignorance


Harri Jalonen
21.6.2021

The notion that real-life phenomena can be rationally explained arose during the Enlightenment. The belief in the accumulation of knowledge has been immense. Knowledge has been linked to progress, and its absence has been seen as a problem that needs to be addressed. The idea that “knowledge is power”, formulated by the philosopher Francis Bacon at the end of the 16th century, accurately sums up the expectations placed on knowledge at the time. Although Bacon himself reportedly referred to humanity's ability to understand and control its surrounding environment, the belief in the instrumental role of knowledge has also been adopted by the social sciences.

Knowledge and its management have been used to explain the survival of individuals, organisations and societies alike. Numerous studies argue that an organisation’s success largely depends on its ability to distinguish relevant information from irrelevant information (see e.g. Grant 1996). The utilisation of information is above all justified by higher-quality decision-making (see e.g. Choo 2007). Research seems to support Aristotle’s view that one who knows what is necessary and can provide it can manage both a household and a state (Xenophon 1960).

The IRWIN project explores the nature of information resilience and examines its emergence and consequences. Information resilience is a worthwhile goal, but it will not be reached by itself. For many reasons, goals are not always reached despite the best of efforts and a lot of hard work. Below is a compilation of possible explanations for this, which I have named the Dirty Dozen.

1. Bounded rationality (Simon 1947) materialises in decision-making situations where the optimal choice is constrained by incomplete information about the alternatives and their consequences, by the decision maker’s limited cognitive abilities and skills, and by case-specific factors such as time and place.
2. Groupthink (Janis 1972) is a social phenomenon that usually manifests itself in one-sided thinking, uncritical examination of information, various misconceptions and unnecessary self-censorship.
3. The symbolic use of information (Feldman & March 1981) means initiating data collection either without a clear understanding of what the information will be used for or in the knowledge that the information is not intended to be used for its stated purpose.
4. Skilled incompetence (Argyris 1986) manifests as effective action without reflection, resulting in choices that lead to undesirable consequences.
5. Blame avoidance (Weaver 1986) refers to behaviour in which information is not created, sought, shared or applied to improve matters but to avoid blame.
6. Pluralistic ignorance (Miller & McFarland 1991) is a psychological phenomenon that manifests as flawed reasoning: members of a group erroneously come to believe that their thoughts, views and behaviour differ from those of the other group members.
7. Avoidance of new information (Johnson 1996) refers to information behaviour that reinforces preconceived notions, typical in situations where the amount of information available exceeds an individual’s processing capacity or where an individual seeks to avoid information they consider sensitive.
8. Cognitive dissonance (Wilson 1997) refers to the discomfort an individual feels when faced with information that is inconsistent with their existing cognition.
9. Populist ignorance (Shamir & Shamir 1997) is born, reinforced and spread among people and materialises in thought patterns accepted within communities; these patterns feed a one-sided or erroneous picture of the true state of things and phenomena.
10. Confirmation bias (Nickerson 1998) directs us to see what we already know. The human mind is not interested in truth but in balance, which is why information that threatens one’s worldview is quickly rejected.
11. Strategic ignorance (McGoey 2012) refers to the ritual-like use of information, in which the public’s attention is diverted from a difficult topic by providing an abundance of information on an unrelated subject.
12. Functional stupidity (Alvesson & Spicer 2014) manifests as an inability or unwillingness to question received information, a negligent attitude towards the arguments made, and a generally narrow-minded approach to the matter.

The above list is nowhere near complete. What can be said, however, is that the idea of the right information, in the right place at the right time, represents an idealistic view of the security of supply of information. Besides being concerned with information and knowledge, the security of supply of information is largely concerned with confronting and managing ignorance. It is not so much about a lack of factual information as about disagreements and conflicts over who has legitimate information and what kind of ignorance is seen as useful. It may even be that ignorance cannot be removed by adding information, only framed in a different way.

Harri Jalonen, Professor, University of Vaasa
