The Scout Mindset

This book was suggested to me by a great friend of mine, a man who strives each day to follow a stoic and rational way of thinking, to live an intentional life avoiding (or minimizing) cognitive biases and all the fallacies we can easily get trapped in. So, I read it; it took me just a couple of days, since it's highly readable (compared, for example, with books like "Antifragile" by Nassim Taleb, probably well articulated, but way too verbose). It should be an easy read for everyone, since I found all the concepts clearly explained, easy to digest even by a complete layperson who has never heard of concepts such as "echo chamber" or "biases".
What I liked most is what a Buddhist might call the "observant but nonjudgmental mind" (I could also add "connect with your inner child": a curious mind, educated to relax and deal gently with emotions). Yes, it may surprise you, but I can also find links between rationality and (some parts of some) religions; if you want to read more: Religion and Science (and I should remember to also post some long considerations on "Irriducibile", the recent book by Federico Faggin, a well-known Italian scientist/inventor, on related topics). So, coming back to this book, I found it interesting how it highlights the "psychological side" along with the more "technical" part.

What I didn't like much, instead, was finding the same bias (ironically, in a book that also speaks about biases) that I found in some "self-help" books (which in my opinion don't help much, quite the contrary): the survivorship bias. I mean: we all know the economic success of Bezos and Musk (I stress: the economic success, because we don't know their overall lives and satisfaction in personal areas), but you can't just build a cause-and-effect analysis from their stories: how many others tried the same way and failed? I'm not saying it was just luck, but luck (in other words: conditions, mostly non-repeatable) played a role with a certain weight in achieving their goals. Sure, they have a good approach to big business, but even if you try to replicate their "conditio sine qua non", it won't guarantee that you will survive (hence the name of the survivorship bias). So please, next time, alongside the 0.001% that achieved impressive results, show me the list of the ones who tried to do the same and failed: this would be a much more honest narrative, rather than just remembering that the expectation (in a statistical sense) of starting a new business is huge and that, as Taleb would say, this is an asymmetric risk for good, where you can lose -100% but gain +100,000% (see the Taleb distribution for more).
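To make the survivorship-bias point concrete, here is a minimal sketch in Python (my own illustration, not from the book) of such an asymmetric bet: lose 100% with probability 0.999, gain +100,000% (a 1000x return) with probability 0.001. The theoretical expected value per attempt is +0.001x, barely positive, yet if you look only at the survivors, the average return looks spectacular:

```python
import random

# Simulate N ventures with the asymmetric payoff described above:
# lose 100% with probability 0.999, gain 1000x (+100,000%) with probability 0.001.
random.seed(42)
N = 100_000
returns = [1000.0 if random.random() < 0.001 else -1.0 for _ in range(N)]

ev = sum(returns) / N                      # sample mean; theoretical EV is +0.001x
survivors = [r for r in returns if r > 0]  # the only ones the success stories show

print(f"Average return per attempt:   {ev:+.3f}x")
print(f"Share of survivors:           {len(survivors) / N:.3%}")
print(f"Average return for survivors: {sum(survivors) / len(survivors):,.0f}x")
```

The honest narrative is the first number; the self-help narrative is the last one.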
I’ll now “summarize” the book, but, as always, remember:

  • this is here under "fair use" only and I really, strongly suggest you buy the book: the author deserves it, and I can tell you that you will spend pleasant hours in the company of her thoughts;
  • the following concepts are mixed with my own thoughts and considerations, so please don't rely on this if you're searching for an "uncontaminated" summary – but I'll try to make it clear when it's my "expansion": I'll try to remember to add it between parentheses;
  • a summary is not enough: there's a huge difference between reading/listening to a summary/review and actually reading the whole book, as I explained better here, where I wrote about why summaries don't work well.

0. Intro

Asking the right questions, or even "asking" in general, is encouraged neither in school nor at work (as also said in the intro of the course "Intelligence tools for the digital age"). A scout mindset is fundamental to recognize when we are wrong and to overcome our capacity to fool ourselves and hide the truth from ourselves (see also "The Elephant in the Brain" by Kevin Simler and Robin Hanson); we even avoid thinking about problems (it's a matter of "efficiency", a time/energy saver; you can also read about this shocking experiment – shocking also because it involves electroshock). The author found herself wasting interviews (she's a podcaster) since she was trying to convince the interviewee that she was correct rather than understanding the other's point of view (but for that you also need to develop/improve your ability to actively listen with empathy). We often think that seeing the world realistically leads to depression ("Ignorance is bliss" or "being conscious is a torment", as Epica's Simone Simons sings; see also "Trop intelligent pour être heureux?" by Jeanne Siaud-Facchin), but often it's the opposite: self-awareness and the scout mindset can be emotionally rewarding.

1. The Case for Scout Mindset

1.1 Two types of thinking

In 1894, a cleaning lady (actually, a French spy) in the German embassy in France found evidence of a possible leak of information passed by someone from France to Germany, so they speculated it could have been an army officer, Dreyfus, the only Jewish one on the general staff. Instead of carefully searching for evidence (and counter-evidence), they went straight into confirmation bias to reinforce their first hypothesis: this was an example of directionally motivated reasoning, the one that causes us to ask ourselves "Can I believe this?" when something agrees with our expectations and instead "Must I believe this?", so as to doubt, when we don't like a result, searching for reasons to discard it (this is also present in an impressive number of research studies; I was struck by the examples provided in the "Neuroscience and Neuroimaging Specialization" by Johns Hopkins University, including cherry-picking, manipulating/forging data, beginning with the results/conclusions in mind and many others). It's quite common to find reasoning described in militaristic language (after all, strategies, tactics, operations… are concepts widely used today in almost every field, but they started in the military). Arguments are possibly a form of attack or defense (or, from my previous experience as a military officer, I can say blue, red or even white, like a neutral observer… and I strongly believe that knowledge can almost always be used in these 3 forms). Over time, our views become reinforced (that's why it's much harder to change something radical like religious beliefs in the elderly), especially about stereotypes. Of course, in the Dreyfus case, there were also other suspects, e.g. an officer with drinking problems (that's why security clearance also takes similar issues into account), but the anti-Semitic sentiment was stronger. Colonel Picquart felt that the evidence was not strong enough, so he decided to "search for the truth", and when someone asked why he insisted so much on reopening the case, he simply replied "Because it was my duty" (something useful to remember: every time there's a strong accusation like in the Johnny Depp case, with huge polarization, searching for the truth helps everyone in our strange period). In the story just told, we can identify 2 ways of thinking:

  • like a soldier fighting off threatening evidence;
  • like a scout guided by accuracy-motivated reasoning, forming a map of the strategic landscape.
    So, being a scout means wanting your map to be as accurate as possible (this is like child development: when we don't know the world and ourselves, we naturally create models, then with every new piece of evidence we expand/modify/discard the old ones or create new models to better understand the world, and we adapt the maps to reality, not vice versa). It also helps in knowing when it's time to give up and cut losses or instead to continue (see the sunk-cost fallacy and "The Dip" – we perform a kind of continuous testing to check/adjust direction, and I have experience in sailing, so I do know what I'm talking about). Finding out that we're wrong doesn't mean "being defeated", but only that our map was not correct/accurate enough and we can simply correct it. We don't always fall 100% into one of these two archetypes; we fluctuate between mindsets from day to day and from context to context.
Try to be a scout rather than a soldier.
Image created by me with DALL-E

1.2 What the soldier is protecting

Some people just want to change things before even trying to understand the status quo (and I can confirm I met a lot of politicians, senior officers and many others who ordered a lot of changes before even understanding where they had landed, assuming that all their predecessors were wrong). This is known as Chesterton's fence: if you just remove a fence because you can't see why it's there, probably the idiot was not the one who put it there, but rather you, who are missing the reason why the fence was there (for example, to prevent accidents or to keep animals from escaping). A soldier mindset can help us avoid negative emotions like fear, stress and regret; we can deceive ourselves as in Aesop's "The Fox and the Grapes", which is sometimes useful to indulge in a negative situation, thinking for example that our problem is actually a blessing (compare this with the Catholic assumption that suffering is good for us, a God-sent trial); we convince ourselves that "it's fate, it's out of my control" (so more like believing in a fixed astrology that assigns us a certain unchangeable sign, rather than thinking about the "homo faber", since thinking, acting and being in control require effort and emotional fatigue – you can see more on the circle of control in a previous post, where "News of the world" threaten our intentionality, adding the popular Serenity Prayer: we should have the serenity to accept what cannot be changed, the courage to change what can be changed, and the wisdom to know the one from the other). We hide behind comfortable stereotypes, old sayings and a fixed mindset, finding ways to protect our ego (see "Ego Is the Enemy"). When something is a mix (e.g.: in failure/success in finding a partner, there's genetics but also a chance to improve), we focus more on the "predestined"/fixed part. Of course, "irrational optimism" is not such a good deal (e.g.: if you're overconfident in jumping from one building to the next, especially if you never tried; more practical examples at work are found in "The Stupidity Paradox"), since it's not true that if you are determined to succeed, you'll succeed for sure (I like the "homo faber" mindset, but I am also aware of context and external factors).
Some people (I think of politicians, influencers and dictators' staff, like the minister of propaganda during the Nazi period) persuade themselves first in order to convince others, to sound more authentic. Senator Lyndon Johnson said "What convinces is conviction" (in my own words, I used to say that some people believe their own lies, a self-directed version of "Repeat a lie enough times and you can convince others it's true"… and this is more or less what some "PUAs" keep teaching in order to "instill charisma"). We are social animals where (virtue) signaling does matter (that's also the reason why you can easily recognize on social networks members of some communities, like vegans, pro-LGBT people, feminists and believers of some ideologies). And usually these are used as a socially acceptable way to hide your real interest: for example, you can say you don't want a new construction near your house, not because you don't want your property to decrease in value or because you don't want an obstacle in front of your window, but only because you care for the environment (this kind of hypocrisy is well described in a famous George Carlin sketch, "Saving the Planet" – here for an animated version). Sometimes they are genuinely unaware of their hypocrisy, a way the brain uses to protect us (see again [[The Elephant in the Brain]]; what I hope is that people become not only aware of that, but even reach the next level, understanding that some people are self-aware and others aren't).
In some religious communities, losing faith can mean losing family and all social support (actually, there's a precise name for it: ostracism, a social ban that was considered the worst punishment, leading to death outside the city walls), but generally speaking we can't state an always-valid rule about following others or not, it depends on the context: if all the kids are jumping out of the window, will you follow them? Well, it depends: maybe they're jumping to escape from a fire (or the contrary, remember the famous "Eat shit: millions of flies can't be wrong!"). Be also aware of the "tall poppy syndrome": in some communities, too much ambition and self-regard are strongly discouraged, you are not allowed to grow more than others (it's up to you to stay or not).
Some schools try to teach students about cognitive biases (and in fact some students do become aware that they can have biases), but those students still fail to consider and prepare for the rebuttals to their own arguments.

1.3 Why truth is more valuable than we realize

Be aware that allowing yourself to have doubts can lead you to reject your community's point of view and to decide to live a less traditional life (ideologies demand blind faith: no doubts, just dogmas to be trusted and followed by irrational sheep). Even when we want (in theory) to receive honest feedback, we sometimes merely pretend to be more rational/scientific, and we end up gaming metrics, Goodhart's-law style. Being irrational means unconsciously choosing just enough irrationality to achieve our goal, switching to rationality only when the comfort of denial is sufficiently low or the chance of fixing the problem is sufficiently high, as explained by Bryan Caplan in the "rational irrationality" hypothesis (so, let's say that if we lower this threshold, we may act in advance, before reaching that discomfort / cognitive dissonance and before the problem grows too much; of course it requires much more effort, but it will pay off in the long term).
We overvalue immediate rewards and fail to properly evaluate the future: this is called the present bias (related to the Stanford marshmallow experiment, not to be confused with the mindfulness concept of staying in the "here and now"). In other words: we tend to reach out for immediate gratification instead of visualizing the long-term consequences (for good and for bad), justifying ourselves by saying "it wasn't my fault" (you can see a practical example in the schema I drew in [[Weight management]], where appetite is the result of physiological stimuli combined with stronger emotional inputs, so we can have the feeling that "it's stronger than our willpower" and easily tell ourselves that it's pure determinism, "we're made this way"). (If you really want to understand more, I wrote more in Dopamine Nation and in How (yet another book on) Atomic Habits made me think.)
The good news is that we can develop the scout mindset starting in the morning (see the book "Make Your Bed" by Admiral William McRaven, or even the concept expressed by Marcus Aurelius: we're not built to stay in bed, so get up!) and reinforcing general habits of thought (while we expand our knowledge, design a better map and develop skills to generate more accurate maps faster). A lot of people prefer instead to view themselves through rose-colored glasses (hey, this is actually the opposite of the meaning behind the name of this blog, "Different glasses", meant to be used to understand reality better, not to see it altered and blurred – rather the opposite: to better face the truth! Searching for an equilibrium: kindness with yourself, but not too much self-indulgence).
We also overestimate social costs (see "The Courage To Be Disliked" for the Adlerian psychology on the theme and "Loneliness" for the social-psychology and anthropological aspects), sacrificing a lot of happiness to avoid relatively small social costs (and always remember: we are not at the center of the universe, others think about us much less than we imagine; but at the same time please don't engage in anti-social behaviors! I'm just saying: don't miss the train just because you're afraid of looking weird when you run).

2. Developing Self-Awareness

2.1 Signs of a Scout

There are forums like "Am I the asshole?" on Reddit, where it's interesting to see when/how some people see only their own point of view and search for external points of view when they're stuck in a dispute with partners, friends, colleagues and so on (this is even more evident when you face completely different cultures, since they have a different background and a different system of values; that's why, for example, there are intelligence specialists who teach you how to interpret and behave when you deal with counterparts, and people in general, in other countries, to reduce friction and misunderstanding, as well as to enhance empathy toward others). Viewing ourselves as rational can backfire (consider the words of Paul Watzlawick: "The belief that one's own view of reality is the only reality is the most dangerous of all delusions" – it's not a coincidence I chose these words for the homepage of this blog): there's actually the paradox of more educated and intelligent people holding their views more tightly (they can convince themselves that, being people who spent a lot of time studying, they should be less wrong; see also "The Ignorance of the Learned" by William Hazlitt, even more true when school teaches only notions and not how to think). That's not all: some tests used to measure how "intelligent" or, better, "open-minded" you are ask about topics where holding conservative beliefs is often seen as being "rigid", so if you favor the death penalty and you're against abortion or socialism, they classify you as "rigid" instead of just conservative, under the prejudice that progressives are better people (and there are many psychologists with these biases, to the point that a professor like Jordan Peterson is considered, at best, "controversial", since he's "not aligned" with a certain mainstream politics); that's because researchers in psychology identify as liberals over conservatives at a ratio of 14:1.
There are 5 signs to help you understand you are following a scout mindset:

  1. Do you tell other people when you realize they were right? (Intellectual honesty, without the problem of Fonzie, who couldn't pronounce "I was wrong" – but in our society it's seen as a weakness, especially for politicians.)
  2. How do you react to personal criticism? (Most people ignore it or justify themselves, but here it's crucial to remember that the criticism is not of ourselves as a whole person, but of a single act/behavior, for example as a worker on a specific set of tasks or as a student in a certain field – hence also the importance of diversifying what you do!)
  3. Do you ever prove yourself wrong? (Especially when you think you've found something that aligns with one of the things your ideology repeats in its propaganda.) In the example provided by the author, journalist Bethany Brookshire, who has a PhD, claimed to have noticed that some men called her "Ms" instead of "Dr", but then it turned out it was not the case, just focused cherry-picking, so she acknowledged she was wrong (this happens every time we torture the data: "If you torture the data long enough, it will confess to anything").
  4. Do you take precautions to avoid fooling yourself? For example, trying to work with blind data analysis (or, as I suggest for controversial topics: just label the sides "group A" and "group B" when you perform an analysis involving, say, Russia/Ukraine or Israel/Palestine).
  5. Do you have any good critics? But avoid criticism from people with prejudice, like those who are not interested in what you say because they just focus on the fact that you are in the military or work for some industry they judge controversial.

2.2 Noticing bias

Some politicians reply in completely opposite ways depending on whether they're attacking one of the others or defending one of their own (like in a stupid debate I listened to: if you see a young female prime minister dancing in a disco, it's OK, it's an expression of youth and female empowerment, but if you see a man from a conservative party doing exactly the same, it's disgusting – I would also add that in theory we should be "less indulgent" with the ones who start this useless game, since they consider themselves better than the conservatives). Let's consider this example: "If you get sued and you win the case, should the person who sued you pay your legal costs?" – 85% answered yes; "If you sue someone and you lose the case, should you pay their costs?" – only 44%, about half compared to the mirror question before, answered yes. What changed is the side you're on: as the defendant, it seems right to be compensated after a false accusation; on the accusing side, you may argue that such a rule discourages victims from suing, since they could lose money. In these cases, it's recommended to run a thought experiment (for example, a role-playing game, pretending to be the counterpart – though I've met people who have real difficulty imagining being on the other side or putting themselves in a given situation). This is known, for example, as the double standard test: am I judging other people's behavior by a standard I wouldn't apply to myself?
Another thought experiment is the outsider test (it reminds me of the different colored thinking hats used in project management). In a few words, it's like imagining you're observing the situation from scratch, "seeing things with new eyes", something that can help minimize anchoring and the sunk-cost fallacy – like asking "what would someone do if they started right now, walking in my shoes?".
The conformity test is the one that comes to mind every time we try to mirror a person we like (there are a lot of comic sketches, like restaurant scenes in which the lover tries to match the other's tastes – and there really are a lot of experiments about conformity, showing social pressure even in cases where we don't know the others but they still influence our judgment; see the Asch experiments). One way to try to overcome this social pressure is to imagine the opposite situation, e.g.: if having children were something very few people did, would I still have this desire to have children? (I can extend this to "if he/she were not my boss, would I have accepted the party invitation?")
The selective skeptic test is basically asking: if the article or study I am reading now had been written by the opposite party, with the same cherry-picking, would I react the same way?
The status quo bias test can help us every time we feel stuck in a situation and can't decide whether to continue on a path or not (see also: "The Dip"), or when we aren't sure we want to change our current job or location (see also "Thinking, Fast and Slow" on loss aversion and the sunk-cost fallacy, with examples like: I have a ticket for the concert, will I go even if it's raining and it will cost me yet more money to get there?). So, a way to move past the impasse is asking ourselves: if my current situation were not the status quo, would I actively choose it? (Be aware that here we're not considering all the overhead and friction related to the move, so for the thought experiment we must ignore them! It applies to a wide range of decisions, from relationships to a career change, to relocation, to switching university faculty and so on.)

Thought experiment: if you are the blue team, think like the counterpart red team (this is actually how a serious security specialist works).
Image created by me with DALL-E

2.3 How sure are you?

(I would add something at the beginning here: remember that we must make a distinction: being (un)certain in our fields of competence is quite different from being (un)certain about "generic topics" in which we may not really be interested at all. For the first case, you can go deeper by searching for the Dunning-Kruger effect and the impostor syndrome – I already wrote about it in (How to) prepare for a new job.)
We like feeling certain (there's a lot of psychology here, with mechanisms developed to cope with uncertainty outside our locus of control, minimizing anxiety, starting with civilizations that attributed catastrophes to the gods or to someone to blame… and it still happens); certainty is simple (and it also helps simple minds understand reality with a lot of approximations). The author proposes a test to track how far we are "from reality", showing on a graph the difference between "it actually happens X% of the time" and "how often I'm right when I claim X% confidence", so we can see the gap compared to a perfect 1:1 calibration (what I would suggest to the author is to split this by field, since we may be very accurate in a field we've mastered, or maybe even the contrary; this way we can better know how "wrong" we are compared to how sure we feel – ideally we could compile and update a kind of "confusion matrix", but about ourselves!).
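Here is a minimal sketch in Python (my own illustration; the book describes the exercise, not this code) of how you could track your own calibration: log each prediction with the confidence you stated and whether it came true, then compare stated confidence with observed frequency per bucket:

```python
from collections import defaultdict

# Hypothetical log of predictions: (stated confidence, did it come true?)
predictions = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True),
    (0.7, True), (0.7, False), (0.7, True),
    (0.5, False), (0.5, True),
]

buckets = defaultdict(list)
for confidence, came_true in predictions:
    buckets[confidence].append(came_true)

for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    observed = sum(outcomes) / len(outcomes)  # how often it actually happened
    # Perfect calibration: observed frequency matches stated confidence (1:1)
    print(f"stated {confidence:.0%} -> observed {observed:.0%} "
          f"over {len(outcomes)} predictions")
```

Plotting stated versus observed values gives exactly the calibration curve the author describes; the per-field split I suggested would just be a second key in the dictionary.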
Robert Kurzban found that we have 2 ways to think/communicate: one directed outward, like a company's press secretary, and one, more honest, used within the company (it reminds me of the "press secretary in our mind" from "The Elephant in the Brain", but also that things are different when we have skin in the game, for example if managers were really paid with their own products rather than in stock options; plus, we may tell others our stuff is good while denying our children access to it, like Big Tech CEOs do with their children who want to access social media and the Internet in general). What can help us decide, even in a quantitative way, is the Equivalent Bet Test (see the sketch below), e.g.: if you had to choose between a box with a 25% chance to win a prize and a bet on your statement being right, for the same prize, which would you choose? And now, between a box with a 10% chance to win and your claim being true? And so on.
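A minimal sketch (my own illustration) of how the Equivalent Bet Test pins down a number: treat your choices between the box and the bet as a bisection, and the point where you become indifferent is your implied probability. The `ask` callback here is a hypothetical stand-in for the real introspective question:

```python
def equivalent_bet(ask) -> float:
    """ask(p) returns True if you'd rather take a box with probability p
    of winning the prize than bet on your own claim being true."""
    low, high = 0.0, 1.0
    for _ in range(10):          # ~0.1% resolution after 10 halvings
        p = (low + high) / 2
        if ask(p):
            high = p             # box preferred: your confidence is below p
        else:
            low = p              # bet preferred: your confidence is above p
    return (low + high) / 2

# Example: simulate someone whose hidden confidence in their claim is 62%
implied = equivalent_bet(lambda p: p > 0.62)
print(f"Implied confidence: {implied:.1%}")   # ~62%
```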

3. Thriving without Illusions

3.1 Coping with reality

Steven Callahan survived after his boat sank not only thanks to his technical skills and his rationing of food supplies, but also thanks to his mindset, repeating to himself like a mantra: "You're doing the best you can. You can only do the best you can" (actually, there's a study showing that talking to ourselves is good and probably a sign of intelligence; in general, this is what I always suggest to myself and others: do the best you can, in the moment and in the context you are in – I learnt this during scout camps and obviously during years spent as a military officer in training and on missions). Remember that small mistakes are sometimes OK, but repeated ones are not (see also Atomic Habits); don't ignore symptoms/signs, and keep despair at bay (see also "The Happiness Trap").
There are honest and self-deceptive ways of coping. A small amount of self-justification is OK, as Tavris and Aronson say in "Mistakes Were Made (but Not by Me)" (here I can also add that this is related to our culture: in English we say "I broke the vase", while in Spanish "the vase broke itself", changing the point of view, hence the locus, hence our perception – that's why languages and cultures deeply shape the models we build during development). In "Thinking, Fast and Slow", Kahneman explains that it's easier to bounce back from failure if you can blame anyone but yourself (but this actually reinforces arrogance in some managers; in general, we are still facing one of the most common dilemmas: how much responsibility we must attribute to ourselves for what happens). Honest coping strategies are, for example: counting our blessings, noticing how far we've come, remembering we can't do more than our best (see, for example, the final scene of Schindler's List).
Make a plan (and stick to it, see Meno procrastinazione, più produttività: divide the work into small tasks and start for at least 2 minutes, think about what will happen if you do it or if you don't, why it's important to you, and all the rest), notice silver linings (search for the good in every situation), and focus on a different goal when the situation changes: for example, if you were a programmer but now you're a manager, you may want to shift to "people skills" rather than trying to prove to others you're the best programmer. Studies on self-deception are controversial, with positive psychology trying to correlate positive illusions and positive beliefs; what's worse is that some researchers just assume that everyone must be stressed and angry, so if you answer that you're calm and that it's extremely rare for you to be upset, to them you're "in denial" (this reminds me of some feminazis who assume that all men have issues with anger, so if you try to say "not all men", you're "denying yourself the truth", even if you're a Buddhist monk).

3.2 Motivation without Self-deception

In the self-belief model of success, if you think you can do it and visualize your success, you'll be motivated, otherwise you won't (it reminds me of the famous "Whether you think you can or you think you can't, you're right" – but actually, thinking about possible failures means preparing to minimize them and to counteract properly if you face them). Visualizing is a popular trend, just have a look at Pinterest (and also at "The Secret" and all the stupid "Law of Attraction" stuff), but what will really help us is a realistic, accurate picture of our odds (and I can also add prototyping, as in Designing Your Life). Ask yourself "Are you sure this is the only career you could be excited about?" (I was almost moved to tears here, remembering the 17-year-old me deciding whether the military academy was the best option… and actually I prepared other plans in case of failure; even afterwards, I kept checking whether I was on a "sub-optimal" path, like a Viterbi algorithm in progress). Be flexible when opportunities arise and the situation changes. Here the author mentions the possible way of thinking of Elon Musk, who said: "If something is important enough you should try. Even if the probable outcome is failure." Basically, you can calculate (estimate) the expected value: if you throw a die where you lose $10 on 5 faces but win $110 if you get a "6", it may be worth trying: (−10 − 10 − 10 − 10 − 10 + 110) × 1/6 = +$10, a positive expectation (Taleb goes further here, when losses are limited, like −100%, and the win is even more than +10,000%). Then, in Musk's case, there's also diversification and repeating the attempt several times (important: not everyone may want to become the richest person on earth, so it actually depends on your goal… and this can't be applied to "serious" relationships – that is actually the tactic proposed by some PUAs: try serially with everyone).
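The dice arithmetic above, as a tiny runnable check (using exact fractions, so there's no floating-point noise):

```python
from fractions import Fraction

# One payoff per die face: lose $10 on five faces, win $110 on a "6"
payoffs = [-10, -10, -10, -10, -10, 110]
ev = sum(Fraction(p) for p in payoffs) * Fraction(1, len(payoffs))

print(f"Expected value per throw: {float(ev):+.2f} $")   # +10.00 $
```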

3.3 Influence without Overconfidence

We have epistemic confidence (certainty about what's true) and social confidence (self-assurance). (Be careful again here: even if some people love those who show false modesty and seem "uncomfortable with life" – if you want to build a character like Woody Allen without going too far into Eeyore mode – people usually like arrogant, aggressive models who fake humbleness and humility, VIPs who show extreme self-confidence.) A study by Anderson et al. (2012) clearly shows that perceived competence is higher when the speaker talks a lot, with a confident tone of voice, much more than when they provide useful information (you can see politicians here), while epistemic confidence (actually stating how certain they are about results and how competent they are) is a feature given very little weight (it's not surprising: that's why we are surrounded by good-looking, smiling people trying to sell you something; after all, we're mostly monkeys searching for the alpha leader, whom we often identify in the style of Tony Robbins).
There are two kinds of uncertainty: uncertainty due to our ignorance or inexperience (for which we are responsible and on which we can work) and uncertainty due to reality being messy and unpredictable. How can we better communicate that something is genuinely uncertain, and that it's not just us not knowing the topic?

  1. Show that uncertainty is justified (some audiences can't cope with complexity and uncertainty and demand "certain answers", as politicians and the public demanded during the last pandemic)
  2. Give informed estimates (like: in previous cases, it went this way X% of the time; it can also be fine sometimes to say: since we didn't have hard data, we estimated/simulated that…)
  3. Have a plan (even when there is data, remembering that people don't like uncertainty, you can give them a plan or some suggestions to help them feel more in control, like a plan to reduce the probability of developing cancer – see the many studies in which people like to find correlations, or how often placebos work; some psychotherapies involve "assignments" that are sometimes just a way to "distract", for example in strategic therapy, and the same works for some children)

You don't need to promise success to be inspiring (also: "Be the change you want to see" or "Do it and others will follow" – I found this true every time someone said my lifestyle is an inspiration to them).
Instead of trying to convince others, be an example.
Image created by me with DALL-E

4. Changing your Mind

4.1 How to be wrong

Some people are really good at superforecasting (there's a specific book on the topic, even if sometimes it's more likely survivorship bias: some think a monkey could probably have performed like Warren Buffett, by pure random luck). What makes superforecasters strong is that they are great at being wrong (check, change your mind, check again, …). This sometimes happens to us too: during a job interview, we update our guess of being hired, like: I replied perfectly, so maybe there's a 30% chance; then they don't reply promptly, so you update your guess to 20%, and so on. Keep your mind open (don't search for the "absolute" and stay away from ideology: it will help you stay vigilant and responsive to the truth, without filters). When thinking about others or external situations, don't just imagine what you would do in that situation; consider the context and the other person's background.
(There's more than better output: there's also a better outcome.) When a forecaster recognizes he was wrong, it helps him make better forecasts; when an investor recognizes he was wrong, it helps him make better investments. Try to get rid of confirmation bias, recency bias and all the other biases (stick to data and context): these are domain-general lessons that you can apply in every field.
Knowing principles is not enough, you must practice and internalize them.
You don't have to "admit a mistake" (to yourself and others); rather, the scout mindset tells you that you're just updating the map: this is Bayesian updating, a correct way to revise a probability after learning new information (the same happens in child development, when a baby corrects its models of reality; after all, scientific progress follows more or less the same path). Software engineer Devon Zuegel encourages readers to view her blog posts not as her permanent opinions, but instead as a "stream of thoughts, caught in the middle of updates".
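A minimal sketch (my own illustration) of the Bayes' rule behind the interview example above: starting from a 30% prior, a piece of evidence like "no prompt reply" lowers the estimate. The two likelihood values are assumptions I chose just to reproduce the 30% → 20% update from the text:

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

prior = 0.30  # confidence of being hired, after a good interview
# Assumed likelihoods: a slow reply happens 40% of the time if they
# want to hire you, 70% of the time if they don't.
posterior = bayes_update(prior, p_evidence_if_true=0.40,
                         p_evidence_if_false=0.70)
print(f"Updated chance of being hired: {posterior:.0%}")   # ~20%
```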
(As a corollary) if you are never changing your mind, you're doing something wrong (be open to updates and to the progress of knowledge; even the Dalai Lama said he would revisit some Buddhist principles if they were found to be in contrast with science – but so far, no major "collisions"). It's a good sign if we don't think the same things we thought at 5 years old, but at the same time this is not a virtue if you are a politician changing your opinion right after the elections.

4.2 Lean into confusion

Our ability to change our minds depends on how we react when the world confounds our expectations (see also "Deviate" by Beau Lotto). When we think "Nobody likes me" or "They only invited me out of pity" (see also the Charlie Brown syndrome, searching for confirmation), we fall into a loop of motivated reasoning, until the mismatch with reality goes too far. We try to avoid what disrupts our model, but it can instead be a chance to update our map: Darwin was driven crazy by the peacock's tail, since he couldn't find a rational evolutionary explanation for the feature, but by ignoring it he could have missed an important piece of the puzzle: sexual selection is also an important driver in evolution (the same is true for us, even nowadays: dancing and other "useless" abilities are actually a proxy for health, coordination, self-confidence and other qualities a partner searches for). Sometimes we are really too attached to our models (especially when we are outside our bubble), and this can be a great obstacle when we try to understand others, like why a country decides to attack another one (we prefer the simplicity of saying that "Putin went crazy" instead of thinking about the reasons that could have moved him to invade Ukraine – I was shocked by some new ways of thinking I learnt in intelligence and strategy courses; it was like opening a door to completely new worlds). Top negotiators emphasize: don't write off the other side as crazy (and when analyzing societies, don't be tempted to exclude outliers; on the contrary, they can be the easiest way to understand what lies below, expressed strongly in their behavior). Sometimes you need to completely change your point of view, performing a paradigm shift (often followed by an a-ha moment). One of the main causes of bad decisions, according to "Sources of Power" by Gary Klein, is the "de minimus error": the attempt to minimize the inconsistency between observations and theory (if someone promises you great rewards but you fail, you may blame yourself, saying "probably I'm not trying hard enough"). Of course, it's not that we should abandon a paradigm just because we find something "unexpected": ask yourself if the new evidence stretches your theory a little or a lot (you may really have encountered an exception); don't dismiss observations against your model, but get curious about them.

4.3 Escape your echo chamber

Escaping our "bubble" is not easy, and sometimes it can even be counterproductive: a Michigan magazine attempted an experiment in 2017 in which people with very different views exchanged their "media diets" for one week; the result: they found the other side even more biased than they had thought. A larger experiment in 2018 showed that people who followed Twitter bots exposing them to the other side became even more convinced of their own point of view: conservatives became even more conservative (liberals slightly more liberal). Solution: we should listen to people who make it easier to be open to their arguments, not harder (so: no shocks, but more "proximity area", with something in common with us, e.g. the same point of view in other areas, a similar job or interests and so on). An example is a space on Reddit called FeMRADebates, where people try to argue in an honest and decent way (so without misogynist redpilled incels or misandric feminazis, even if data shows there's clearly a double standard toward men, probably due to a lack of empathy, for example with divorced fathers; see the 2016 documentary by a repentant feminist, "The Red Pill").
Some politicians tried to build a "team of rivals" (as opposed to the useless "yes men"; in theory, we may want Pinocchio's talking cricket or the little angel and devil on the shoulders, like in some cartoons), but they often failed, especially when the rivals were way too far from the politician's point of view.
We misunderstand each other's views (especially when there's an implicit difference in background and a different meaning for the same word or gesture), and sometimes we dismiss a good argument that we mistake at first for a bad one (this is often true when someone uses a buzzword that meets the current hype, like "quantum thinking", but then it turns out the author is not one of the many idiots but a well-respected scientist who uses the terminology in a different way than you think).
Our beliefs are interdependent: changing one requires changing others (so, the key here is again flexibility – the problem is that the beliefs inside some ideologies are often strongly linked, so trying to change a single aspect of the belief system is like trying to destroy a monolith).

5. Rethinking identity

5.1 How beliefs become identities

Etiquette says that you're not supposed to make conversation about politics or religion: the reason is that religious and political beliefs shape us to the point that we identify with their associated values (I can also add here that it's like football teams: we blindly support the one that was "believed in" in the family or social context we were born into; the same is often true for people who believe in magical thinking like astrology, who think people are immutably classified by stereotypes based on the period they were born in). Some members, proud of being raised in certain groups, "feel sorry" for the others, as if they were the predestined ones or the ones "who know better". The incredible thing is that even in scientific fields there are "currents" and "movements"; there's even a "conflict" (including goliardic hymns) between "frequentists" and "Bayesians". Generally speaking, you can easily spot these "proud members", e.g. the "proud vegan" ((virtue) signaling), through one or more of these red flags:

  1. They use sentences like "I believe that women are changing the world" (you can replace women with whatever group is "trendy" right now to show that you're a liberal), since "I believe" is a statement about you, about what you think, so that you look generous and compassionate.
  2. When you feel the urge to step in and defend a group or belief system against perceived criticism, chances are good that your identity is involved.
  3. Defiant language like “proud”, “standing up”, “fearless”
  4. A righteous tone, like statements ending with “period. full stop. end of story” or “You don’t support it? You. Are. Part. Of. The. Problem.”
  5. Gatekeeping against people pretending to be part of a group even if they don't really understand it (like wannabe scientists joining fan clubs and sharing simplistic jokes because it's cool; in a word: posers)
  6. Schadenfreude: you celebrate when you read news of something bad happening to the other side, or a study revealing they are wrong about something
  7. Use of epithets (this is part of the sociology of groups, including dress and jargon)
  8. Having to defend your view: the more time you have spent arguing your position in public, the more you feel you have to keep defending it.

5.2 Hold your identity lightly

As Paul Graham wrote: "Keep your identity small": "the more labels you have for yourself, the dumber they make you" (you feel the pressure to meet certain expectations). Instead of "vegan", think of yourself as "a person who eats a vegan diet" (the same goes for people affected by some diseases). If you vote for or believe in something, try to see yourself as a "person who agrees with most ideas that are part of movement X". Avoid the "someone is wrong on the Internet" stimulus that drags you into unproductive conversations. Honestly promise yourself to walk away when you find something against your values.
Could you pass an ideological Turing test? This is suggested by Bryan Caplan: could you explain the other side's views convincingly if you had to simulate being a member of the other party? (There are actually games in which an opponent has to play the part of a member of a community while the real members have to identify the impostor, with everyone talking about everything.) Ideally, you can find a real opponent in person and ask whether you sound "real", like one of them.
It may sound like a paradox, but holding an identity too strongly prevents you from persuading others: you take everything for granted and can't explain clearly why someone should follow you or why your group is better.
Rather than on identity, focus on impact: acquire knowledge, listen to experts and propose real collaborations with them and with institutions.

When needed, get rid of all the labels and "patches" sewn onto your identity.
Image created by me with DALL-E

5.3 Scout identity

If you pride yourself on being a scout, it becomes easier to resist the temptation to mock someone who disagrees with you, because you can remind yourself, "I'm not the kind of person who takes cheap shots" (this behavior is gold: if you have something to say, resist the urge to comment on social media and take the time to elaborate and write a proper essay on it – or nothing at all).
We can't deny our human nature: we are also the result of the influence of the people who surround us, so the best thing you can do to change your thinking is to change the people you surround yourself with. If, for instance, you are in an environment that encourages fast responses without thinking, it's probably not good for you. You can choose what kind of people you attract.
You can also “test yourself” in communities and sites like ChangeAView.com.
Don't strive to be or appear certain: "I don't need to appear certain – because if I can't be sure of the answer, no one can be" (Robert Nozick).
Some scout habits:

  1. Ask yourself what kind of bias could be affecting your judgement in that situation
  2. Ask yourself how sure you really are
  3. Make a concrete plan for how you will deal with it when you catch yourself in motivated reasoning
  4. Find an author who holds different views, but that is also someone you find reasonable or with whom you share some common ground
  5. Next time you find someone being “irrational”, get curious about why their behavior might make sense to them
  6. Look for opportunities to update at least a little bit
  7. If you have a disagreement and find yourself wrong, reach out to that person to let them know how you’ve updated
  8. Attempt an ideological Turing test of the other side

Lastly, don't think you're "immune" just because you tell yourself you have a scout mindset (actually, sometimes it's even worse, since when we think we're safe, we may underestimate our biases).

6. A few last words

I really enjoyed this book, even if I found it "too basic", but I can live with this issue: once you've studied a subject from many angles, after a few books and courses it gets harder and harder to acquire some new "shocking information", some "revelation" you were not aware of before. It was nevertheless enjoyable, since it's often the case that others can elaborate a different point of view even when starting from the same information, because everyone has a different background and a different mind.
In an increasingly complex world, full of interdependent systems, the scout mindset is far more useful than it would have been in the past; I consider it another superpower, alongside the ability to do Deep Work.
