Tuesday muse


My old Navy buddy sent me this article (scroll down). The source isn't cited**, but it's wise insight into human nature. 

Its main thrust, and the point where I realized some weeks ago that I had been foolishly presuming myself, is that we humans aren't alarmed by statistics such as Covid-19 death forecasts. If vast numbers will die, it's still a minor percentage of the population, and it's Them who will die, not us, so we don't much worry about it. And, as we are now seeing, we grow impatient with social distancing, staying at home, and being out of work, pressing to end this nonsense and get back to work, even if it does mean that a few more of Them may get sick and die; after all, it's just a statistic. 

Airplane crashes are only a statistic. We board an airliner for our flight, knowing this plane will not be the one that crashes. And it never occurs to me that the story in the Local News this evening will be about me. Bad news is what we read about happening to other people.

A little off track, but remember the story of John and Bill? Two baseball fanatics hope there'll be baseball in heaven, and they jokingly promise each other that whoever dies first will come back and tell the survivor whether it's true. So John dies. One evening a few months later, Bill is relaxing at home when suddenly his dead buddy John appears in front of him.

Bill: "John! how're things up there?" 

John: "It's great, Bill! You and I were right, we DO have baseball! We have games almost every day, and I'm pitcher on our team!"

Bill: "Hey, that's great, I can't wait!! I hope I get to play!" 

John: "Terrific!! Oh, that reminds me: we have a game tomorrow evening and you're my catcher!"

What really rang my bell is the concomitant idea that our obsession with being seen as "on the team" and not an outsider impacts our judgment, and possibly the outcomes. In the face of consensus, when everyone else is clicking heels and saluting "Jawohl, mein Führer," we are afraid to speak up with a dissenting view, no matter how well founded; only the most confident and courageous do so. Thus, General Mattis is no longer Secretary of Defense. Colin Powell, opposing the U.S. plan to invade Iraq, warned President Bush, "You'll own this place, you know," with overwhelming responsibility in the unthought-out aftermath; having proved himself not on the team, General Powell wasn't Secretary of State much longer. I remember this in the Navy "can do" attitude I experienced as pervasive and essential if you wanted a good fitness report, and yes, I participated. The immediate 850Strong stiff upper lip, right from the afternoon of 10 October 2018, when people needed time to rage and grieve. The current harebrained mentality to "reopen America" before the health experts judge it medically safe and wise.  

I grant that, rather than defeatism, a positive attitude is essential to accomplishing goals, including recovery of any kind, and certainly warfare where one's order of battle warrants confidence; but it can also bring on what the article terms "functional stupidity," where everyone feels pressed to buy into the group's consensus rather than risk being judged negatively, and certainly not weak, for holding an outlying viewpoint. As the article says, the Bay of Pigs debacle happened because no one stood up to JFK; everybody got on board, even uncomfortably. Stupid decisions are taken when we let ourselves get swept up in the moment, or are afraid to risk speaking up with the voice of reason.

Not that any sort of pessimism would have helped us after the hurricane; we needed the determination, and still do, because town and county are still a mess. But people need to be allowed, emotionally and mindfully, to face What Is and begin healing, without fear of being scorned as weak or as pessimists. We need to face facts instead of all the smiley faces, "Pack up your troubles in your old kit bag and smile, smile, smile." Eleven years ago this month a beloved family member died of cancer after long years of brave treatment. A month or so before the inevitable end, she phoned me to cry, "I'm so scared, and I'm tired of putting up a brave front, and I'm discouraged with everyone wearing a smiley face and no one willing to talk about the truth." I'm not sure I helped her, though I know she wanted me to and hoped I would. But I do understand. I recall my own discouragement during my heart episode a decade ago, when I needed someone to listen and talk about what I was feeling, rather than uncomfortably brushing me off with "Oh, we don't want to talk about that" and hurrying on with falsetto sunshine when I was in the shadow, needing someone to visit me there. 

On the other hand, my personal experiences of darkness, like everything else I have had to face in life that at the time I'd rather not have done, have helped me immeasurably in being a more understanding and compassionate priest and pastor to people who are hurting. 

Anyway, here's the article. The yellow highlighting is not mine, though it's apt.


+++++++++++++++++++++

** Michael Marshall, "Why we find it difficult to recognise a crisis", BBC Future, Psychology, 14 April 2020.


Why we find it difficult to recognize a crisis
The current pandemic has affected some countries more than others, partly because they have been slow to react to the crisis. That, it turns out, is a very human response.

The coronavirus pandemic is upon us, and for many people it feels like it came out of nowhere.
The UK saw its first reported cases at the end of January, by which time the virus was already spreading around the world. But it was not until the middle of March that UK Prime Minister Boris Johnson “advised” people to avoid non-essential travel and socializing, and only on 23 March did he order the country into lockdown. The slow UK response came in for widespread criticism from public health experts.
In the US, President Donald Trump has overseen a chaotic response. The country has had a dire shortage of testing kits, so its government does not know how many people have had the disease. President Trump also repeatedly downplayed the dangers of the disease (although despite what you may have read he did not call it a hoax). He also incorrectly compared it to seasonal flu, and falsely claimed the US response was more comprehensive than any other country's.
How did two of the most advanced countries in the world, with technology and expertise to spare, fail to recognize the crisis as it unfolded? A final answer will only come with hindsight and public inquiries, but there are many known psychological processes that cause individuals and organizations to miss the signs of a coming emergency – even when it is staring them in the face.
In 1980, psychologist Neil Weinstein published the first study of what has come to be known as “optimism bias”. He found that people are “unrealistically optimistic” about their own future prospects.
Weinstein asked over 200 students to rate their chances of experiencing different life events: either positive things like owning their own home or having a gifted child, or negative things like developing cancer or getting divorced. The students also rated the chances of other people in the group experiencing the same events.

Most of the students thought they had better than average prospects, for example saying they were less likely to get cancer than everyone else, and more likely to own their own homes.
“That’s been known and demonstrated in many different ways,” says Tali Sharot of University College London in the UK.
Sharot says the root of the bias may be the way we learn new information. In a 2011 study, her team found that people are quicker to update their beliefs in response to information that is better than expected, compared to information that is worse than expected.

It is easy to imagine how the optimism bias could affect our beliefs about Covid-19. If experts were to say that the lockdown would be eased in two weeks, people would quickly update their beliefs, says Sharot. But if the experts instead said it would last for longer than promised, people would update their beliefs less. “They say ‘I don’t really believe it’, ‘things change’, and so on,” she says. “As a consequence, you then generate these biased beliefs.”
Indeed, there is already evidence that the bias is at work, possibly explaining why so many people have failed to adopt precautions like social distancing.
For example, in a study that has not yet been peer-reviewed, Toby Wise of University College London in the UK and his colleagues surveyed 1,591 Americans on their beliefs and actions regarding the virus. While the volunteers’ awareness grew over time and they started taking protective measures, they underestimated their personal risk of infection, relative to the average person.

Similarly, Benjamin Kuper-Smith of the University Medical Center Hamburg-Eppendorf and his colleagues surveyed people in the UK, US and Germany. Their volunteers not only underestimated their risk of getting infected, they lowballed the chances of them passing the virus to others.
“We are now conducting a large study, and our pilot data shows the same thing,” says Sharot. In her pilot, “not a single person said they were more likely to get the virus”.
People are also susceptible to a subtler mistake, dubbed “outcome bias”.
“A very obvious example is if you have two airplanes that nearly collide, but don’t,” says Robin Dillon-Merrill at Georgetown University in Washington DC. She says one possible response is, “Wow, that was really close, next time that could have happened” – which might prompt changes to current practice. However, often people do not respond like that. Instead they say, “Wow, I’m a fabulous pilot and my flight skills avoided that entirely”. This is outcome bias: the fact things turned out OK can cause us to underestimate how close they came to going badly wrong.

Outcome bias was described by Jonathan Baron and John Hershey in 1988. They gave volunteers descriptions of decisions other people had taken in uncertain situations, such as gambling. The volunteers were asked to rate the other people’s decision-making and reasoning. They rated the other people more highly if the outcomes were favorable – even though chance played a large role in the outcomes. In other words, the fact the decisions happened to work out caused the volunteers to overrate the reasoning that went into making them.
The Covid-19 epidemic is a clear instance of governments and organizations not having learned from near misses. In the past 20 years there have been two outbreaks of diseases caused by coronaviruses, the group to which the new virus belongs. The Sars outbreak of 2003 and 2004 killed at least 774 people before it was contained, while the ongoing Mers outbreak which began in 2012 has killed 858. Covid-19 has already far surpassed both, at more than 76,500 deaths at the time of writing.
“I don’t think that we’re experiencing anything like a near-miss at the moment, unfortunately,” says Dillon-Merrill. “This is not a near-miss, this is an absolute hit.”

Even if people are presented with clear evidence that a crisis is unfolding, they may deny the reality of it. Many psychological factors contribute to denial, but a crucial one is confirmation bias. If people are motivated to believe something, they may only seek out evidence which supports their point of view, and ignore or dismiss anything that contradicts it.  
Dillon-Merrill points to a recent story from the Los Angeles Times. On 8 March, a woman celebrated her 70th birthday with a party in southern California. A week later it emerged that one partygoer had tested positive for Covid-19. Many other attendees soon tested positive.
Guests had been advised not to attend if they were ill, but that was not enough. “A lot of people who are transferring the virus don’t have symptoms,” says Dillon-Merrill. But everyone rationalized that away. “With the confirmation bias, you find the data that supports your position,” she says. “What you really want to see is: ‘I’m healthy. I really want to have my party’. We assume nobody’s showing symptoms and nobody’s coughing on anybody else, why can’t we have our party?”

The story also illustrates another problem: in uncertain situations, we look to each other for guidance, but our neighbors are not always the best guides. "I think a very strong influence of all of this is social norms," says Dillon-Merrill. "Because the information is not clear, it's changing or it's uncertain, people are looking for clues and cues, and are tending to do what they see is the social norm."
This may explain so-called panic buying of unnecessary items like bottled water. If you see others stocking up on bottled water, you may do it too. Our tendency to conform can be beneficial, but in this case it is hurting us.
At the level of government and other big organizations, this tendency to conformity can manifest as “groupthink”. Intelligent and experienced decision-makers sometimes stop discussing the various options openly and instead uncritically accept whatever plan they think everyone else is settling on.

Groupthink was first described by psychologist Irving Janis in the early 1970s, most notably in his book Victims of Groupthink. Janis studied President John F Kennedy’s decision-making in two international incidents: the failed Bay of Pigs invasion of Cuba in 1961, and the Cuban Missile Crisis of 1962. The Bay of Pigs was a major US foreign policy failure, and Janis found that Kennedy’s advisers were reluctant to contradict him or each other. “With the Cuban Missile Crisis, a lot more of the meetings happened without him in the room, where they were forced to come up with alternative ideas, and that forced people to weigh the pros and cons of ideas,” says Dillon-Merrill. “The simplest idea to overcome groupthink is a better-structured process for making decisions.”
There is a related concept called "functional stupidity", described by Mats Alvesson at Lund University in Sweden and Andre Spicer at City University of London in the UK. The pair found that organizations often hire clever and talented people, but then create cultures and decision-making processes that do not encourage them to raise concerns or make suggestions. Instead, everyone is encouraged to emphasize positive interpretations of events, leading to "self-reinforcing stupidity".