One of the most troublesome sources of conflict I experience in meaningful relationships is grounded in internal processes that clash implicitly; that is, the root of the conflict is felt but not seen.
Some people acquire knowledge and make decisions about their world based on a familiar ideology — i.e., by “just knowing” or by relying on an external authority for knowledge. Others put thoughts or questions through a more formalized process of examination and reasoning before they’re confident in arriving at a conclusion or making a claim. (I suspect this process preference correlates with one or more dimensions of the Myers-Briggs Type Indicator.)
When we make a claim about external reality, disclose a decision, or withdraw, however momentarily, to engage in an internal examination, the process we're adopting is usually covert: neither identified openly nor self-evident to our audience, and not open to their participation. We tend to feel discomfort when another person in a discussion adopts a pace and a process different from ours. Before we've even started, we're out of alignment if the other person's way of acquiring knowledge feels unnatural to us.
This domain of study falls under epistemology, the theory of knowledge. Epistemology questions what knowledge is, how it can be acquired, and the extent to which knowledge that’s relevant to any given subject or entity can be acquired.
The scope of this article applies to acquiring impersonal information of a factual, public, scientific, or political nature and to observations about external reality. It does not apply to personal communication and connection or to someone's subjective experience of meaning, which are internal states not subject to objective verification. To understand another person's experience, I rely on intuition and simple listening. (I've written volumes about that realm of experience, and I developed a workshop to refine one's perception of and connection with a romantic partner: Deepening Sensual Connection.)
Part of my motivation in writing this article is to arrive at a place in myself where I can respect and validate people who rely more on ideological argument than on evidence-based reasoning in constructing their perceptions and conclusions about the world. Scientific methods were developed primarily to eliminate ideological or personal-interest bias from the results of the discovery process. Naturally, then, as a science-trained professional, I've tended to adopt an attitude not only of skepticism but also of veiled contempt for people or "news" sources that exhibit a strong bias toward ideology at the expense of scientific knowledge. Clearly, I have some work to do to arrive at acceptance of ideology-subscribing individuals while remaining skeptical of their claims until they rise to my standards for truth.
A friend recently sent me a YouTube video with a well-credentialed talking head ranting about a coming economic collapse due to quantitative easing, quoting the usual anti-Fed memes. I responded that it wasn’t persuasive for me because the speaker didn’t adopt an evidence-based reasoning strategy to make his arguments. The argument was purely narrative and ideological, counting on the listener to swallow the speaker’s predigested conclusions without the effort of masticating verifiable evidence through their own reasoning process. (I’m making a process observation here, not rendering a value judgment about the content.)
I guess I pissed her off by saying, "Sorry, this isn't persuasive. I'm a science guy." In an ill-advised moment of flippancy, I used shorthand for this: my method of acquiring knowledge and making decisions is the claim-evidence-reasoning (CER) model adopted primarily in science and jurisprudence. (See the list of CER resources below.) In a court of law or a peer-reviewed journal, making claims without any evidence or reasoning to support them (other than references to one's own or others' previous claims) amounts to an ideological argument. It hopes the listener will skip the process of logical inference from verifiable evidence and move to a conclusion based on a sense of familiarity, an engaging narrative, or the "it's just obvious, everybody knows this" leap we all know so well. The ideological argument begs the question and entices us to abdicate our own judgment to a "higher authority": what (their) experts say, what the Bible says, what the Koran says, what a widely read author says, what The President says, and so on.
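For readers who like things concrete, the CER structure can be sketched as a tiny data model. This is purely illustrative; the names and the checks are mine, not part of any official CER rubric, and real evaluation of an argument is of course far subtler than an empty-field test:

```python
from dataclasses import dataclass, field

@dataclass
class Argument:
    claim: str                                    # the assertion being made
    evidence: list = field(default_factory=list)  # verifiable observations or data
    reasoning: str = ""                           # the logic linking evidence to claim

def is_ideological(arg: Argument) -> bool:
    """In CER terms, an argument with no evidence or no reasoning asks the
    listener to accept the claim on authority, familiarity, or narrative."""
    return not arg.evidence or not arg.reasoning.strip()

# A bare claim fails the CER test:
rant = Argument(claim="Quantitative easing will cause economic collapse")
print(is_ideological(rant))  # True

# The same claim with cited data and an explicit inference passes the
# structural test (whether the evidence and logic hold up is another matter):
cer = Argument(
    claim="Quantitative easing will cause economic collapse",
    evidence=["Fed balance-sheet data", "historical inflation series"],
    reasoning="If the data show X, and X implies Y, then Y follows.",
)
print(is_ideological(cer))  # False
```

The point of the sketch is only that CER is a checklist of parts, and an ideological argument is one with the middle parts missing.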
When relying on ideological methods of acquiring knowledge, people often prefer anecdotal evidence rather than objective evidence that’s verifiable in common practice by scientifically respectable means. The current political “debate” about the Affordable Care Act provides an abundance of evidence, anecdotes, and examples that demonstrate how we can talk past each other without understanding. The resulting chasm divides good people and makes civil discourse difficult.
We could just as easily pick humans' contribution to global climate change, progressive taxation, women's healthcare rights, immigration reform, the bankruptcy of Detroit, or any of a very long list of other subjects on which we will find it difficult to converse with someone whose knowledge-acquisition process differs from ours.
Ironically, the key provisions of the Affordable Care Act, including the individual mandate, were derived almost entirely from a market-based approach proposed by ideologues on the political right, such as the Heritage Foundation, and put into practice with great pride in Massachusetts by Republican governor Mitt Romney. This demonstrates one weakness of an ideological argument: it can flip without warning or explanation. When our perceived identity feels threatened, it's easy to reverse an ideologically based position without feeling even a little bit slippery, as long as others in our self-identified cultural group welcome us to the flip side in a tacit alliance to reduce our collective cognitive dissonance.
My theory is that our knowledge-acquisition process preference derives from how we’ve learned to respond to cognitive dissonance. Per Wikipedia:
“Leon Festinger’s theory of cognitive dissonance focuses on how humans strive for internal consistency. When inconsistency (dissonance) is experienced, individuals largely become psychologically distressed. His basic hypotheses are listed below:
“The existence of dissonance, being psychologically uncomfortable, will motivate the person to try to reduce the dissonance and achieve consonance.”
“When dissonance is present, in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance.”
I was raised in The South (Tennessee and North Carolina) in the 1960s and 70s. In those days, there was an undercurrent of good-ole-boy peer pressure to adopt the common narrative and core beliefs of my peer group, which included thinking of African Americans as unworthy of the same respect and dignity afforded to God-fearin' white folks. I never took the social-belonging bait of adopting those beliefs and was sometimes vocal in identifying the moral inconsistencies I saw around me. The beliefs being sold in that attempted exchange were, I felt, entirely inconsistent with the Christian teachings to which I subscribed at the time, and bringing that to people's attention intensified the cognitive dissonance they were already experiencing. The choice they faced was either to embrace the conflicting data and resolve the dissonance by adopting a new belief more in harmony with Christian principles, or to resist the conflicting data, in many cases by suppressing the data or their awareness of it or both. Less psychological effort was expended by distancing me and denying the conflict than by reevaluating their racist beliefs, which were culturally supported by the majority in those days. So I felt mostly peripheral to the social culture of The South during my childhood and adolescence. (Those of you who know me now will not be surprised to hear that I was an irritating teenager who occasionally took the moral high ground with his peers. 😉)
Shortly after graduating from college, I escaped to Minnesota for work and found a very refreshing I’m-OK-You’re-OK sentiment that tolerated individuals who entertained unique ideas (as long as they weren’t anti-social or hypocritical). It felt more loving and affirmative to me in Minnesota. I thrived there and built a pioneering business in software. In hindsight and from a distance, I think the good-ole-boy culture in The South was pushing an ideological package that was always foreign to my knowledge-acquisition process and to my ethical grounding.
In a related article, I intend to review research in neuroscience that helps us understand how brain structures, biochemistry, and belief-formation processes operate in knowledge acquisition. For a wonderfully rich work on this subject written by a self-described libertarian, see The Believing Brain: From Ghosts and Gods to Politics and Conspiracies—How We Construct Beliefs and Reinforce Them as Truths, by Michael Shermer.
In US courts, in a trial by judge, an ideological argument will lose against a claim-evidence-reasoning-based argument almost 100% of the time. In a peer-reviewed journal, an ideology-based article won't get past the junior editorial clerk at the front of the funnel. Ideological arguments are quicker and easier to make; hence the temptation to choose them over the more laborious track of claim-evidence-reasoning. It's human nature to take the shortcut and to hang out with folks who'll accept our narrative without resistance. Ideological narrative may advance a relationship, but it's less likely to advance knowledge acquisition in a rigorous way.
In case you’re interested, let me offer some resources (listed below) that flesh out my preferred process for inference about reality. In my observation of the US culture (and human culture in general), emotional conflicts and political polarization often derive from the fact that one party relies on a form of evidence-based reasoning as their standard for knowledge (whether they know it or not) while the other party relies on ideological argument derived from others’ claims (whether they know it or not).
This is certainly the basis for any tension between me and one of my close friends. We genuinely love and trust each other to have each other’s best interests at heart. And we process differently, which is why we make a GREAT team in business. I just have to enforce a boundary with him that, if he and I can’t solve a particular problem between the two of us, we won’t discuss it because it’s irrelevant to our relationship. We don’t start from a common point of reality, and we don’t agree on a process for inference. Yet we enjoy each other’s company and have a deep sense of mutual trust. I have a similar relationship with a past girlfriend, a conservative raised in the Seventh Day Adventist tradition — deep trust and mutual appreciation without agreement on how to source verifiable evidence and conclusions.
Even broaching this topic with some test subjects (whom I suspected of leaning more toward an ideology-based worldview than an evidence-and-reasoning-based one) has raised prickles. So, at least for now and in this realm, I'm going to hold off on opening this discussion with someone until 1) I've assessed their openness to an evidence-and-reasoning-based thought experiment and 2) I've unlearned my covert, self-protective attitude of moral superiority in these matters.
Apparently, ideologues think of a compelling narrative as "reasoning." Identifying an ideological argument as not being evidence-and-reasoning based has not, in my experience, advanced the conversation toward mutual understanding or a meeting of minds.
Perhaps I’ll start wearing a hat advertising “Seeking Epistemologists for Intimate Conversation”. Or browse on EpistemologicalMatch.com for companionship.
Claim-Evidence-Reasoning (CER) as a model for acquiring knowledge and making decisions
1. YouTube video → http://www.youtube.com/watch?v=fkpZfpNWjWY
2. PowerPoint deck → http://cosmos.bgsu.edu/nwoSymposium/2013%20stuff/PresenterMaterial/Claim%20Evidence%20Reasoning%20(Elizabeth%20Buckholtz)%20NWO%20Symposium,%202013.pdf
3. Mind map → http://www.pearltrees.com/#/N-u=1_1165847&N-fa=7753286&N-s=1_8859370&N-f=1_8859370&N-p=85964067
4. A CER assessment tool → http://tulsa.curriculum.schooldesk.net/Portals/Tulsa/Curriculum/docs/Science/Writing%20in%20Science/CER%20Rubric.pdf
5. How-to article by secondary education specialists → http://primaryconnections.org.au/professional-reading/images/Inquiry-Scientific-Explanations-Helping-evidence-reasoning.pdf
6. A dirt-simple process by which to examine the value of a troublesome thought or belief → http://DanWebb.com/TheWork