GoogleLitTrips Reading List's insight:
11 December 2016
How does one teach informational reading in times when the soon-to-be most powerful person in the world is waging a full-scale war on trust in information?
Regardless of our personal beliefs, when contemplating the quality of our current public discourse, it is clear that too much of what is believed is not to be believed.
Refining our informational reading skills is more important than ever in times when mistrust of information itself is alarmingly rampant, even as those skills are dismissed as irrelevant by the very people who might be expected to model respect for rational thinking.
That aside, I found this list fascinating: easy-to-understand explanations for identifying thinking patterns that may rest upon unrecognized and unquestioned baseline biases.
I am reminded of the quote on the banner above my black, then green, then white board for nearly 30 years.
"We look at the world and see what we have learned to believe is there." ~Aaron Siskind
I think Siskind was accurate whether or not "what we have learned to believe is there" is well founded.
I remember being taught a basic list of logical fallacies. (A downloadable poster can be found at: https://yourlogicalfallacyis.com/poster)
Ad hominem? Tu quoque? Anecdotal? Begging the question? False cause?
(Wow! Seen any of these recently?)
Yet, at the time, like many "late bloomers," I was a bit of a know-it-all. Well, to be more accurate, I was convinced that I knew enough about the things I had already determined to be important, and I had already mastered the art of ignorantly dismissing what I eventually came to appreciate was far more important than being a class clown or learning all I could about girls from James Bond.
Studying that list, written in its scholarly academic language, was more boring than engaging.
Yet, in reading this scooped article on cognitive biases, written in real-world English and in a fashion that is easily personalized rather than easily dismissed, I could not help but wonder if it might be a much more engaging way to invite students to give serious thought to the impact of their own unrecognized personal biases.
In class, I would have students read this article once as individuals, with instructions to cite real-world examples of each identified bias.
Then I'd have them share their examples in small groups for a few minutes, discussing, for example, these excerpts from the article...
"5. Confirmation bias. Confirmation bias is the tendency to seek out information that supports our pre-existing beliefs. In other words, we form an opinion first and then seek out evidence to back it up, rather than basing our opinions on facts."
Whether our political leanings are conservative or progressive, isn't it true that we are often immediately more receptive to information or news that falls in line with our pre-existing beliefs, and less receptive, even immediately skeptical, toward information or news that challenges them?
How much of our recent public discourse appealed to pre-existing biases, and how much of it encouraged an honest reconsideration of our own pre-existing points of view?
Ever give students an essay assignment meant to encourage them to research the pros and cons of a particular opinion, only to have them begin with a pre-existing opinion and then spend the rest of their effort cherry-picking arguments that supported it?
Do we emphasize enough the importance of requiring a concession paragraph, in which students must concede that there are opinions contrary to their own that are nevertheless worth considering?
A related bias gets a bit close to home for some educators. I recently had the "opportunity" to witness an example of this one...
"9. The halo effect. The halo effect occurs when someone creates a strong first impression and that impression sticks. This is extremely noticeable in grading. For example, often teachers grade a student’s first paper, and if it’s good, are prone to continue giving them high marks on future papers even if their performance doesn’t warrant it. The same thing happens at work and in personal relationships."
There are thirteen cognitive biases in this article. Each comes with simple examples that can serve as bridges to anyone's personal experience.
Dunno, this might just stick more easily than trying to figure out whether I might be guilty of "tu quoque" thinking. And, for the record, if you paid any attention to our recent political campaign, you probably saw, whether you fell for it or not, hundreds of examples of "tu quoque" thinking.
If only I could believe that it wasn't intentional.
brought to you by GLT Global ED | Google Lit Trips, an educational nonprofit