Increasing the quality of our thinking
A mental model is an explanation of someone’s thought process about how something works in the real world. It is a representation of the surrounding world, the relationships between its various parts and a person’s intuitive perception about his or her own acts and their consequences.
If used responsibly, mental models can inform marketing, product design, and technology. If left unchecked, they can harden into foibles: minor weaknesses or eccentricities in our character.
Artificial intelligence and predictive data will continue to advance, enabling exponential growth. In AI and machine learning programs, discrimination is caused by data. This “algorithmic bias” occurs when AI and computing systems act not with objective fairness, but according to the prejudices of the people who formulated, cleaned, and structured their data. This is not inherently harmful – human bias can be as simple as preferring red to blue – but warning signs have started to appear.
A research team at the University of California Berkeley distinguished pre-existing biases in training data from the technical biases that arise from the tools and algorithms that power these AI systems and from the emergent biases that result from human interactions with them.
Ultimately, the solutions we embrace (whether technical or process-oriented) are only as good as the data they are trained to analyze. How we assess problems carries our pre-existing (human) biases, which affect us at both an individual and a societal level. This kind of bias was found in COMPAS, a risk assessment software that courtroom judges used to forecast which criminals were most likely to re-offend. When the news organization ProPublica compared COMPAS risk assessments for 10,000 people arrested in one Florida county with data showing which of them went on to re-offend, it discovered that when the algorithm was right, its decision making was fair. But when the algorithm was wrong, people of color who did not re-offend were almost twice as likely to have been labeled higher risk.
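The disparity ProPublica reported is a gap in false positive rates: among people who did not re-offend, how often each group was wrongly labeled high risk. A minimal sketch of that calculation, using small hypothetical data invented purely to illustrate (not the actual COMPAS dataset):

```python
# Hypothetical records: (group, predicted_high_risk, re_offended).
# Invented for illustration only; not real COMPAS data.
records = [
    ("A", True,  False),
    ("A", True,  True),
    ("A", False, False),
    ("A", True,  False),
    ("B", True,  False),
    ("B", False, False),
    ("B", False, True),
    ("B", False, False),
]

def false_positive_rate(group):
    """Among people in `group` who did NOT re-offend, the share
    the model wrongly labeled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    wrongly_flagged = [r for r in non_reoffenders if r[1]]
    return len(wrongly_flagged) / len(non_reoffenders)

for g in ("A", "B"):
    print(f"group {g}: false positive rate = {false_positive_rate(g):.2f}")
```

In this toy data, group A's false positive rate (0.67) is twice group B's (0.33): the model is wrong about both groups, but its errors fall on group A far more often. Comparing error rates per group, rather than overall accuracy, is what surfaces this kind of bias.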
Mental models are how we understand the world. Not only do they shape what we think and how we understand, they shape the connections and opportunities we see. Mental models help make the complex simple: they explain why we consider some things more relevant than others, and how we reason.
A mental model is just that…a model. It’s a tool that enables us to make an abstract representation of a complex issue. Models help our brains filter the details of the world so we can focus on the relevant details of an issue.
A path toward better thinking
The quality of our thinking is proportional to the models we are aware of and our ability to apply them correctly in a given situation. The more models you know, the bigger your toolbox. The more models you apply, the more likely you are to see reality with greater clarity and make better decisions. When it comes to improving your ability to make decisions, variety (and volume) matters.
Most of us, however, are specialists. Instead of a latticework of mental models, we have a few from our own discipline: a few “rules of thumb.” Each specialist sees something different.
When you look at a forest, do you focus on:
the ecosystem? You might be a botanist.
the impact of climate change? You might be an environmentalist.
the state of the tree growth? You might be a forestry engineer.
the value of the land? You might be a business person.
None of these perspectives is wrong. And none of them sees the forest in its entirety. That is the value of cross-disciplinary thinking. Understanding the basics of the other perspectives leads to a more well-rounded understanding of the forest, allowing for better initial decisions about managing it. That’s latticework.
By putting these disciplines together in our heads, we gain greater proximity to the problem at hand by seeing it in three dimensions. If we consider the problem from only one angle, we’ve got a blind spot. And blind spots can kill you.
A Network of Mental Models for “good humaning”
Building your repertoire of mental models will help you make better decisions. Once you know a few, you will start to make connections between them, helping you create a networked understanding of how you operate as a human being. I’ve collected and summarized the ones I’ve found the most useful. You can use them almost like a deck of cards.
One of the reasons I refer to them as “Foibles” is because these biases are universal to us all. They are what make us human. Succumbing to them clouds our view of the world and contributes to making costly mistakes in our relationships, our businesses, and as a society.
I refer to “good humaning” because between learning and integration lies “the journey”, “the struggle”, “the gap.” Part of our work is learning and re-learning what it means to be a good human or to do “humaning” well, by making better decisions in our relationships, business, and society at large.
Remember: Developing this level of self-awareness about how you and others operate is a lifelong project. Stick with it, and you’ll find that you see reality more clearly, make better decisions more consistently, and bring greater presence to those you love and care for.
Mental Models Explained
The Map is not the Territory metaphorically illustrates the differences between belief and reality. The phrase was coined by Alfred Korzybski. Our perception of the world is generated by our brain and can be thought of as a ‘map’ of reality written in neural patterns. Reality exists outside our minds, but we can construct models of this ‘territory’ based on what we glimpse through our senses.
Higher Order Thinking moves from the easier, safer anticipation of the immediate results of our actions to thinking farther ahead and more holistically. First-order thinking ensures we get the same results as everyone else. Second-order thinking requires us to consider not only our actions and their immediate consequences, but also the long game. Failing to think through long-term effects can invite crisis and disaster.
Inversion, also known as reverse thinking, is a common method used in creative ideation sessions. Instead of following the ‘normal, logical’ direction of a challenge, you turn it around (or turn around an important element of it) and look for opposite ideas.