THE COGNITIVE BIASES TRICKING YOUR BRAIN

Science suggests we’re hardwired to delude ourselves. Can we do anything about it?

By Ben Yagoda, The Atlantic, September 2018

I am staring at a photograph of myself that shows me 20 years older than I am now. I have not stepped into the twilight zone. Rather, I am trying to rid myself of some measure of my present bias, which is the tendency people have, when considering a trade-off between two future moments, to more heavily weight the one closer to the present. A great many academic studies have shown this bias—also known as hyperbolic discounting—to be robust and persistent.

Most of them have focused on money. When asked whether they would prefer to have, say, $150 today or $180 in one month, people tend to choose the $150. Giving up a 20 percent return on investment is a bad move—which is easy to recognize when the question is thrust away from the present. Asked whether they would take $150 a year from now or $180 in 13 months, people are overwhelmingly willing to wait an extra month for the extra $30.
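The pattern behind these experiments is often modeled as hyperbolic discounting: a reward’s perceived value is divided by (1 + k × delay), so value drops steeply for short delays and flattens out for long ones. A minimal sketch, with an illustrative (not empirically fitted) discount rate, shows the preference reversal directly:

```python
def hyperbolic_value(amount, delay_months, k=0.5):
    """Perceived present value under hyperbolic discounting: V = A / (1 + k*D).
    The rate k = 0.5 is an arbitrary illustrative choice."""
    return amount / (1 + k * delay_months)

# Choice framed at the present: $150 today vs. $180 in one month.
now, later = hyperbolic_value(150, 0), hyperbolic_value(180, 1)      # 150.0 vs. 120.0

# Same trade-off pushed a year out: $150 in 12 months vs. $180 in 13.
far, farther = hyperbolic_value(150, 12), hyperbolic_value(180, 13)  # ~21.4 vs. 24.0

# now > later (take the $150 today), yet far < farther (wait for the $180):
# the identical $30-for-one-month trade-off flips with its distance from now.
```

An exponential discounter (value multiplied by a fixed factor per month of delay) would never flip: whichever option wins at a year’s distance also wins today. The hyperbolic curve is what makes the near-term choice inconsistent with the far-term one.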

Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.

That state of affairs led a scholar named Hal Hershfield to play around with photographs. Hershfield is a marketing professor at UCLA whose research starts from the idea that people are “estranged” from their future self. As a result, he explained in a 2011 paper, “saving is like a choice between spending money today or giving it to a stranger years from now.” The paper described an attempt by Hershfield and several colleagues to modify that state of mind in their students. They had the students observe, for a minute or so, virtual-reality avatars showing what they would look like at age 70. Then they asked the students what they would do if they unexpectedly came into $1,000. The students who had looked their older self in the eye said they would put an average of $172 into a retirement account. That’s more than double the amount that would have been invested by members of the control group, who were willing to sock away an average of only $80.

I am already old—in my early 60s, if you must know—so Hershfield furnished me not only with an image of myself in my 80s (complete with age spots, an exorbitantly asymmetrical face, and wrinkles as deep as a Manhattan pothole) but also with an image of my daughter as she’ll look decades from now. What this did, he explained, was make me ask myself, How will I feel toward the end of my life if my offspring are not taken care of?

When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”).

Some of the 185 are dubious or trivial. The IKEA effect, for instance, is defined as “the tendency for people to place a disproportionately high value on objects that they partially assembled themselves.” And others closely resemble one another to the point of redundancy. But a solid group of 100 or so biases has been repeatedly shown to exist, and can make a hash of our lives.

The gambler’s fallacy makes us absolutely certain that, if a coin has landed heads up five times in a row, it’s more likely to land tails up the sixth time. In fact, the odds are still 50-50. Optimism bias leads us to consistently underestimate the costs and the duration of basically every project we undertake. Availability bias makes us think that, say, traveling by plane is more dangerous than traveling by car. (Images of plane crashes are more vivid and dramatic in our memory and imagination, and hence more available to our consciousness.)
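The 50-50 claim is easy to verify empirically: simulate many coin sequences and look only at the flip that follows a run of five heads. A quick sketch, seeded for reproducibility:

```python
import random

random.seed(0)
next_flips = []
for _ in range(200_000):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):              # keep only sequences opening with five heads
        next_flips.append(flips[5]) # record the sixth flip

p_heads = sum(next_flips) / len(next_flips)
# p_heads lands close to 0.5: the streak tells the sixth flip nothing
```

Roughly one sequence in 32 survives the filter, leaving several thousand samples, and the estimate sits tightly around one half regardless of the streak that preceded it.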

The anchoring effect is our tendency to rely too heavily on the first piece of information offered, particularly if that information is presented in numeric form, when making decisions, estimates, or predictions. This is the reason negotiators start with a number that is deliberately too low or too high: They know that number will “anchor” the subsequent dealings. A striking illustration of anchoring is an experiment in which participants observed a roulette-style wheel that stopped on either 10 or 65, then were asked to guess what percentage of United Nations countries is African. The ones who saw the wheel stop on 10 guessed 25 percent, on average; the ones who saw the wheel stop on 65 guessed 45 percent. (The correct percentage at the time of the experiment was about 28 percent.)

The effects of biases do not play out just on an individual level. Last year, President Donald Trump decided to send more troops to Afghanistan, and thereby walked right into the sunk-cost fallacy. He said, “Our nation must seek an honorable and enduring outcome worthy of the tremendous sacrifices that have been made, especially the sacrifices of lives.” Sunk-cost thinking tells us to stick with a bad investment because of the money we have already lost on it; to finish an unappetizing restaurant meal because, after all, we’re paying for it; to prosecute an unwinnable war because of the investment of blood and treasure. In all cases, this way of thinking is rubbish.

If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view. Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.

Confirmation bias plays out in lots of other circumstances, sometimes with terrible consequences. To quote the 2005 report to the president on the lead-up to the Iraq War: “When confronted with evidence that indicated Iraq did not have [weapons of mass destruction], analysts tended to discount such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it.”

The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman, social scientists who started their careers in Israel and eventually moved to the United States. They were the researchers who conducted the African-countries-in-the-UN experiment. Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman. Lewis’s earlier book Moneyball was really about how his hero, the baseball executive Billy Beane, countered the cognitive biases of old-school scouts—notably fundamental attribution error, whereby, when assessing someone’s behavior, we put too much weight on his or her personal attributes and too little on external factors, many of which can be measured with statistics.

Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions. In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.

Most books and articles about cognitive bias contain a brief passage, typically toward the end, similar to this one in Thinking, Fast and Slow: “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”

Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length. Here’s the key: Even after we have measured the lines and found them to be equal, and have had the neurological basis of the illusion explained to us, we still perceive one line to be shorter than the other.

At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception. But that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”

Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves. Instead, it has been devoted to changing behavior, in the form of incentives or “nudges.” For example, while present bias has so far proved intractable, employers have been able to nudge employees into contributing to retirement plans by making saving the default option; you have to actively take steps in order to not participate. That is, laziness or inertia can be more powerful than bias. Procedures can also be organized in a way that dissuades or prevents people from acting on biased thoughts. A well-known example: the checklists for doctors and nurses put forward by Atul Gawande in his book The Checklist Manifesto.

Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative. These experiments are based on the reactions and responses of randomly chosen subjects, many of them college undergraduates: people, that is, who care about the $20 they are being paid to participate, not about modifying or even learning about their behavior and thinking. But what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?

Naturally, I wrote to Daniel Kahneman, who at 84 still holds an appointment at the Woodrow Wilson School of Public and International Affairs, at Princeton, but spends most of his time in Manhattan. He answered swiftly and agreed to meet. “I should,” he said, “at least try to talk you out of your project.”

I met with Kahneman at a Le Pain Quotidien in Lower Manhattan. He is tall, soft-spoken, and affable, with a pronounced accent and a wry smile. Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”

In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion. “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.

The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can. And “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,” an idea and term thought up by Gary Klein, a cognitive psychologist. A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.

“My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition. Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.

“That’s my story. I really hope I don’t have to stick to it.”

As it happened, right around the same time I was communicating and meeting with Kahneman, he was exchanging emails with Richard E. Nisbett, a social psychologist at the University of Michigan. The two men had been professionally connected for decades. Nisbett was instrumental in disseminating Kahneman and Tversky’s work, in a 1980 book called Human Inference: Strategies and Shortcomings of Social Judgment. And in Thinking, Fast and Slow, Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)

But over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy. He had emailed Kahneman in part because he had been working on a memoir, and wanted to discuss a conversation he’d had with Kahneman and Tversky at a long-ago conference. Nisbett had the distinct impression that Kahneman and Tversky had been angry—that they’d thought what he had been saying and doing was an implicit criticism of them. Kahneman recalled the interaction, emailing back: “Yes, I remember we were (somewhat) annoyed by your work on the ease of training statistical intuitions (angry is much too strong).”

When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high. When he talks with students who haven’t taken Introduction to Statistics, roughly half give erroneous reasons such as “the pitchers get used to the batters,” “the batters get tired as the season wears on,” and so on. And about half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable. When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
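Nisbett’s baseball answer can be reproduced in a few lines: give every simulated hitter the same underlying skill (the .270 figure and roster size below are illustrative assumptions, not league data) and compare a handful of early-season at bats with a full season.

```python
import random

random.seed(1)
TRUE_SKILL = 0.270                    # identical underlying hit probability

def batting_avg(at_bats):
    hits = sum(random.random() < TRUE_SKILL for _ in range(at_bats))
    return hits / at_bats

early = [batting_avg(20) for _ in range(300)]    # ~20 at bats in April
full = [batting_avg(550) for _ in range(300)]    # a full season

early_450 = sum(avg >= 0.450 for avg in early)   # a handful of flukish hot starts
full_450 = sum(avg >= 0.450 for avg in full)     # none survive 550 at bats
```

Small samples scatter widely around .270 while large ones regress tightly to it, which is all the “mystery” of the vanishing .450 hitters amounts to.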

Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.

I spoke with Nisbett by phone and asked him about his disagreement with Kahneman. He still sounded a bit uncertain. “Danny seemed to be convinced that what I was showing was trivial,” he said. “To him it was clear: Training was hopeless for all kinds of judgments. But we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”

Nisbett writes in his 2015 book, Mindware: Tools for Smart Thinking, “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”

In one of his emails to Nisbett, Kahneman had suggested that the difference between them was to a significant extent a result of temperament: pessimist versus optimist. In a response, Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”

An example of an easy problem is the .450 hitter early in a baseball season. An example of a hard one is “the Linda problem,” which was the basis of one of Kahneman and Tversky’s early articles. Simplified, the experiment presented subjects with the characteristics of a fictional woman, “Linda,” including her commitment to social justice, college major in philosophy, participation in antinuclear demonstrations, and so on. Then the subjects were asked which was more likely: (a) that Linda was a bank teller, or (b) that she was a bank teller and active in the feminist movement. The correct answer is (a), because it is always more likely that one condition will be satisfied in a situation than that the condition plus a second one will be satisfied. But because of the conjunction fallacy (the assumption that multiple specific conditions are more probable than a single general one) and the representativeness heuristic (our strong desire to apply stereotypes), more than 80 percent of undergraduates surveyed answered (b).
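The arithmetic behind the correct answer is just the conjunction rule, P(A and B) ≤ P(A), and it holds no matter what probabilities you plug in. A toy simulation with made-up attribute frequencies:

```python
import random

random.seed(2)
teller = teller_and_feminist = 0
# The 5% and 60% figures are arbitrary; any values give the same ordering.
for _ in range(100_000):
    is_teller = random.random() < 0.05
    is_feminist = random.random() < 0.60
    teller += is_teller
    teller_and_feminist += is_teller and is_feminist

# teller_and_feminist can never exceed teller: every feminist bank teller
# is, first of all, a bank teller.
```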

Nisbett justifiably asks how often in real life we need to make a judgment like the one called for in the Linda problem. I cannot think of any applicable scenarios in my life. It is a bit of a logical parlor trick.

Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.

The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.

He addresses the logical fallacy of confirmation bias, explaining that people’s tendency, when testing a hypothesis they’re inclined to believe, is to seek examples confirming it. But Nisbett points out that no matter how many such examples we gather, we can never prove the proposition. The right thing to do is to look for cases that would disprove it.

And he approaches base-rate neglect by means of his own strategy for choosing which movies to see. His decision is never dependent on ads, or a particular review, or whether a film sounds like something he would enjoy. Instead, he says, “I live by base rates. I don’t read a book or see a movie unless it’s highly recommended by people I trust.

“Most people think they’re not like other people. But they are.”

When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads. It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases. For example:

This one is a version of the classic Wason selection task: four cards lie before you, each with a letter on one side and a number on the other, showing A, B, 4, and 7. Which cards must you turn over to test the rule that every card with an A on one side has a 4 on the other? Because of confirmation bias, many people who haven’t been trained pick the cards that could confirm the rule. But the only thing you can hope to do in this situation is disprove the rule, and the only way to do that is to turn over the cards displaying the letter A (the rule is disproved if a number other than 4 is on the other side) and the number 7 (the rule is disproved if an A is on the other side).
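Assuming the standard four-card setup of this task (cards showing A, B, 4, and 7, each with a letter on one side and a number on the other; the exact setup is an assumption, since the survey item itself isn’t reproduced here), a brute-force enumeration of the possible hidden sides confirms that only two of the four turns can ever expose a violation:

```python
letters = ["A", "B"]
numbers = [4, 7]
visible = ["A", "B", 4, 7]  # each card: a letter on one side, a number on the other

def violates(letter, number):
    # The rule "every card with an A on one side has a 4 on the other"
    # is broken only by a card pairing an A with a number other than 4.
    return letter == "A" and number != 4

revealing = set()
for face in visible:
    # Enumerate every possible hidden side for this card.
    hidden_options = numbers if face in letters else letters
    for back in hidden_options:
        letter, number = (face, back) if face in letters else (back, face)
        if violates(letter, number):
            revealing.add(face)  # turning this card could expose a violation

# revealing == {"A", 7}: only the A card and the 7 card can disprove the rule
```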

I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”

Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases. For one thing, I hadn’t been tested beforehand, so I might just be a comparatively unbiased guy. For another, many of the test questions, including the one above, seemed somewhat remote from scenarios one might encounter in day-to-day life. They seemed to be “hard” problems, not unlike the one about Linda the bank teller. Further, I had been, as Kahneman would say, “cued.” In contrast to the Michigan seniors, I knew exactly why I was being asked these questions, and approached them accordingly.

For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”

Nisbett’s Coursera course and Hal Hershfield’s close encounters with one’s older self are hardly the only de-biasing methods out there. The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”

Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits and so-called experts who show up on TV. In Tetlock’s book Superforecasting: The Art and Science of Prediction (co-written with Dan Gardner), and in the commercial venture he and Mellers co-founded, Good Judgment, they share the superforecasters’ secret sauce.

One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics. Tetlock explains, “At a wedding, someone sidles up to you and says, ‘How long do you give them?’ If you’re shocked because you’ve seen the devotion they show each other, you’ve been sucked into the inside view.” Something like 40 percent of marriages end in divorce, and that statistic is far more predictive of the fate of any particular marriage than a mutually adoring gaze. Not that you want to share that insight at the reception.

The recent de-biasing interventions that scholars in the field have deemed the most promising are a handful of video games. Their genesis was in the Iraq War and the catastrophic weapons-of-mass-destruction blunder that led to it, which left the intelligence community reeling. In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).

Six teams set out to develop such games, and two of them completed the process. The team that has gotten the most attention was led by Carey K. Morewedge, now a professor at Boston University. Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.

After taking the test, I played the game, which has the production value of a late-2000s PlayStation 3 first-person offering, with large-chested women and men, all of whom wear form-fitting clothes and navigate the landscape a bit tentatively. The player adopts the persona of a neighbor of a woman named Terry Hughes, who, in the first part of the game, has mysteriously gone missing. In the second, she has reemerged and needs your help to look into some skulduggery at her company. Along the way, you’re asked to make judgments and predictions—some having to do with the story and some about unrelated issues—which are designed to call your biases into play. You’re given immediate feedback on your answers.

For example, as you’re searching Terry’s apartment, the building superintendent knocks on the door and asks you, apropos of nothing, about Mary, another tenant, whom he describes as “not a jock.” He says 70 percent of the tenants go to Rocky’s Gym, 10 percent go to Entropy Fitness, and 20 percent just stay at home and watch Netflix. Which gym, he asks, do you think Mary probably goes to? A wrong answer, reached thanks to base-rate neglect (a form of the representativeness heuristic) is “None. Mary is a couch potato.” The right answer—based on the data the super has helpfully provided—is Rocky’s Gym. When the participants in the study were tested immediately after playing the game or watching the video and then a couple of months later, everybody improved, but the game players improved more than the video watchers.

When I spoke with Morewedge, he said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias,” he told me. “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”

I took the test again soon after playing the game, with mixed results. I showed notable improvement in confirmation bias, fundamental attribution error, and the representativeness heuristic, and improved slightly in bias blind spot and anchoring bias. My lowest initial score—44.8 percent—was in projection bias. It actually dropped a bit after I played the game. (I really need to stop assuming that everybody thinks like me.) But even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”

I had taken Nisbett’s and Morewedge’s tests on a computer screen, not on paper, but the point remains. It’s one thing for the effects of training to show up in the form of improved results on a test—when you’re on your guard, maybe even looking for tricks—and quite another for the effects to show up in the form of real-life behavior. Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.

I am neither as much of a pessimist as Daniel Kahneman nor as much of an optimist as Richard Nisbett. Since immersing myself in the field, I have noticed a few changes in my behavior. For example, one hot day recently, I decided to buy a bottle of water in a vending machine for $2. The bottle didn’t come out; upon inspection, I realized that the mechanism holding the bottle in place was broken. However, right next to it was another row of water bottles, and clearly the mechanism in that row was in order. My instinct was to not buy a bottle from the “good” row, because $4 for a bottle of water is too much. But all of my training in cognitive biases told me that was faulty thinking. I would be spending $2 for the water—a price I was willing to pay, as had already been established. So I put the money in and got the water, which I happily drank.

In the future, I will monitor my thoughts and reactions as best I can. Let’s say I’m looking to hire a research assistant. Candidate A has sterling references and experience but appears tongue-tied and can’t look me in the eye; Candidate B loves to talk NBA basketball—my favorite topic!—but his recommendations are mediocre at best. Will I have what it takes to overcome fundamental attribution error and hire Candidate A?

Or let’s say there is an officeholder I despise for reasons of temperament, behavior, and ideology. And let’s further say that under this person’s administration, the national economy is performing well. Will I be able to dislodge my powerful confirmation bias and allow the possibility that the person deserves some credit?

As for the matter that Hal Hershfield brought up in the first place—estate planning—I have always been the proverbial ant, storing up my food for winter while the grasshoppers sing and play. In other words, I have always maxed out contributions to 401(k)s, Roth IRAs, Simplified Employee Pensions, 403(b)s, 457(b)s, and pretty much every alphabet-soup savings choice presented to me. But as good a saver as I am, I am that bad a procrastinator. Months ago, my financial adviser offered to evaluate, for free, my will, which was put together a couple of decades ago and surely needs revising. There’s something about drawing up a will that creates a perfect storm of biases, from the ambiguity effect (“the tendency to avoid options for which missing information makes the probability seem ‘unknown,’ ” as Wikipedia defines it) to normalcy bias (“the refusal to plan for, or react to, a disaster which has never happened before”), all of them culminating in the ostrich effect (do I really need to explain?). My adviser sent me a prepaid FedEx envelope, which has been lying on the floor of my office gathering dust. It is still there. As hindsight bias tells me, I knew that would happen.

Unconscious Bias

What to Expect from Governor Whitmer's Implicit Bias Training Directive

Here are some readings on the topic for your research.

Being Conscious about the Unconscious
Are these unconscious biases hardwired into our brains as an evolutionary response, or do they emerge from assimilating information that we see around us? 

Implicit Bias
Thoughts and feelings are “implicit” if we are unaware of them or mistaken about their nature. We have a bias when, rather than being neutral, we have a preference for (or aversion to) a person or group of people. Thus, we use the term “implicit bias” to describe when we have attitudes towards people or associate stereotypes with them without our conscious knowledge. A fairly commonplace example of this is seen in studies that show that white people will frequently associate criminality with black people without even realizing they’re doing it.

What Is Unconscious Bias?
The tendency of humans to act in ways prompted by a range of assumptions and biases they are not aware of. This can include decisions or actions we are not consciously aware of, as well as hidden influences on decisions and actions we believe are rational and based on objective, unbiased evidence and experience. Unconscious bias can be present in organizations and groups, as well as influence the behaviors and decisions of individuals.

The Bias That Causes Catastrophe
The outcome bias erodes your sense of risk and makes you blind to error, explaining everything from fatal plane crashes to the Columbia shuttle disaster and the Deepwater Horizon oil spill.

More Books on the Topic

The Names

Visitors leave remembrances on names at the National September 11 Memorial Plaza at the World Trade Center site.

A poem tribute by Poet Laureate Billy Collins to the names on the Ground Zero memorial, the names read out on every anniversary, usually by the families of those who were lost.

Opinion: We best remember 9/11 by moving beyond it

Opinion by E.J. Dionne Jr., Columnist, The Washington Post

The primary lesson we should take from the events of Sept. 11, 2001, is to be wary of lessons we think we have learned from traumatic events. Trauma can undermine the clear thinking and calm deliberation big decisions require.

The trauma the nation felt then was amplified by the contrast between our experience of sudden vulnerability and a mood shaped by a long period of relative peace and nearly a decade of roaring prosperity.

Our nation had been on a high of national self-confidence after the collapse of the Soviet Union encouraged talk of a “unipolar world” in which the United States confronted no serious competitors.

And the actual suffering was excruciating. We still mourn the thousands killed at the World Trade Center, at the Pentagon and on Flight 93 when brave passengers, at the cost of their lives, overcame their hijackers to stop another targeted crash. We remember the firefighters, police officers and other first responders who died or suffered grievous, lasting health problems to save others.

Briefly, we were united as a nation. For some time, partisan politics very nearly disappeared.

Among Democrats, President George W. Bush’s approval rating was just 27 percent in a Gallup survey taken Sept. 7-10, 2001; in less than a week, it soared to 78 percent. It was even higher among independents and Republicans.

But the unity would not last. If the decision to attack the Taliban and al-Qaeda in Afghanistan was broadly popular, the use of 9/11 to justify the invasion of Iraq was not. Americans rallied around the flag when the war in Iraq started, but they had grave doubts going in, and those mushroomed as the war dragged on.

The way Bush administration officials made the case for intervening in Iraq sowed seeds of division that blossomed into today’s rancid politics.

They painted utterly unrealistic portraits of what the war would achieve (“we will, in fact, be greeted as liberators,” Vice President Dick Cheney famously said), and savaged critics in partisan terms. When Bush announced the invasion on March 19, 2003, a sidebar report in the next day’s Post was headlined: “GOP to Hammer Democratic War Critics.”

The years that followed were jarring in other ways. De-industrialization savaged many once-vibrant communities, especially in the Midwest. Economic inequality grew. The financial collapse of 2008 greatly aggravated the damage. The economy recovered, but slowly.

The pre-9/11 sense of American invincibility and that too-brief interlude after the onslaught when it felt like we were all in this together gave way to bitterness, division and new doubts about the country’s capacities.

This is why we should not be surprised by a Post-ABC News poll this week that found 46 percent of Americans say that the events of Sept. 11 changed the country for the worse while only 33 percent said they changed it for the better.

The contrast with responses to the same question in September 2002 could hardly be starker. Back then, 55 percent said the country had changed for the better, only 27 percent for the worse. We had not gone to Iraq yet, and we were still basking in the selflessness of our 9/11 heroes.

There is, I think, wisdom in the country’s intuitions, then and now. As we reach a milestone anniversary of the attacks, we should never forget those whose lives were lost. And if there is one aspect of the spirit of 9/11 that should remain with us, it is the devotion to selfless service that inspired our country two decades ago and remains a model for what patriotism should look like.

But in many other ways, we need to move beyond 9/11 — beyond the hubris that made us think we could remake the world by force, beyond the ever-present temptation to use a catastrophe to justify projects already in mind before disaster struck.

What we did right after 9/11 was inspired not by grandiose plans but by a painstaking response to more ordinary failures: the failure to understand and act upon available intelligence, the lack of cooperation among agencies charged with keeping us safe, the inability to grasp how much damage could be inflicted by enemies far less powerful than us. Our systems are better for acknowledging these shortcomings, and our imaginations are more alive to the threats.

What we don’t need and shouldn’t want are bombastic declarations of American purpose on Sept. 11, 2021. Far better would be sober remembrances of the heroes and the fallen; realistic assessments of what it will take to protect our people; and a pledge not to remain mired in the feelings, impulses and mistakes that followed a tragic moment. All this, and prayers that we might never again confront a misfortune of this sort.

E.J. Dionne writes about politics in a twice-weekly column for The Washington Post. He is a professor at Georgetown University, a senior fellow at the Brookings Institution, and a frequent commentator for NPR and MSNBC. His latest book is “Code Red: How Progressives and Moderates Can Unite to Save Our Country.”

Commemorating the 20th Anniversary of September 11


SEPTEMBER 7, 2021

In 2005, in partnership with the 9/11 Memorial & Museum, StoryCorps launched the September 11th Initiative. The goal of the project is to record at least one story commemorating each life lost during the attacks of September 11, 2001, and the February 26, 1993, World Trade Center bombing.

This year, in recognition of the 20th anniversary of the September 11 attacks, StoryCorps is releasing two new animated shorts highlighting the voices of those impacted by this tragedy, “September 12th” and “Father Mychal’s Blessing.” These new animations are part of a rich body of stories from the September 11th Initiative, which includes conversations with family members, colleagues, and friends who wish to commemorate the events of September 11.

These StoryCorps interviews are archived in the StoryCorps Archive in the American Folklife Center at the Library of Congress, and are also part of a special collection at the National 9/11 Memorial & Museum.


Father Mychal’s Blessing

On 9/11, Father Mychal Judge, beloved chaplain to the NYC Fire Department, was killed during the attack on the World Trade Center while offering spiritual support, becoming the first certified fatality of the 9/11 attacks. His friend, Father Michael Duffy, read the sermon at his funeral. He remembers Father Mychal’s endearing mannerisms, constant positivity, and profound impact on everyone he knew.
Read the full transcript here.

September 12th

On 9/11, Vaughn Allex checked in two passengers arriving late for their flight. He learned later that they were two of the hijackers of the plane that crashed into the Pentagon. He recalls the toll it took on him.
Read the full transcript here.


We will also be releasing a two-part podcast episode that shares first-hand reflections on 9/11. The first part, a collaboration with Consider This, looks at the lasting toll of 9/11 on U.S. civilians, U.S. veterans, and Afghan citizens, and will be published on Friday, September 10. Part two will go live on 9/11. Subscribe to the StoryCorps podcast wherever you get your podcasts.


“I opened up the back door of that church to see these hundreds of eyes all staring back at me, knowing where I had been.”

Joe Dittmar

Joe Dittmar recounts making his way back home on September 11, 2001 after surviving the attacks on the World Trade Center.
Read the full transcript here.

“He gave me the joys of motherhood, and the pains of motherhood.”

Salman Hamdani

Talat Hamdani remembers her son, an EMT and NYPD cadet who died at the World Trade Center on September 11, 2001 as a first responder and was wrongfully accused of having terrorist links.
Read the full transcript here.

She Was the One

When Richie Pecorella met Karen Juday, she captured his heart and changed his life. They were engaged when she was killed at the World Trade Center on September 11, 2001.
Read the full transcript here.

John and Joe

The late John Vigiano Sr., a retired FDNY captain, honors his sons — John Jr., also a firefighter, and Joe, a police detective — who were killed while saving others on September 11, 2001.
Read the full transcript here.

Sean Rooney

Beverly Eckert shares her final conversation with her husband, Sean Rooney, before he died in the south tower of the World Trade Center on September 11, 2001.
Read the full transcript here.

“We were the luckiest of the unlucky.”

Mark Petrocelli

Retired NYC Fire Chief Albert Petrocelli died from COVID-19 nearly two decades after losing his youngest son, Mark, on September 11, 2001. Before he passed, Chief Petrocelli and his wife, Ginger, sat down to remember the last time they saw their son.
Read the full transcript here.

“People saw only a turban and a beard.”

Balbir Singh Sodhi

Rana and Harjit Sodhi remember their brother, Balbir Singh Sodhi, a Sikh man who was killed in the first hate crime following the September 11, 2001 attacks.
Read the full transcript here.

Always a Family

Monique Ferrer remembers the last time she spoke with her ex-husband, Michael Trinidad, on September 11, 2001, when he called her from the 103rd floor of the World Trade Center’s north tower to say goodbye.
Read the full transcript here.

From the Archive: More Stories of September 11

To hear more stories related to September 11, visit our Archive and search for the keyword “9/11”.

Dawn Ennis and Amy Weinstein

Interview partners Dawn Ennis and Amy Weinstein talk about Dawn’s experience as a producer on CBS This Morning on the morning of September 11, 2001. Dawn describes the exact moment when newsrooms found out that a plane had hit the World Trade Center and she shares her feelings regarding the reactions that New Yorkers had after the attack. Read the full transcript here.


Sharon Watts

Sharon Watts shares the story of her relationship with her ex-fiancé, Captain Patrick Brown of the FDNY, who died in the 9/11 attacks. She affectionately recollects stories and reveals that soon after Patrick’s death, she compiled stories and journals about his life into a book. Read the full transcript here.


Maria Dominguez and Phillip Cassanova

Rescue medical firefighter Maria “Terry” Dominguez talks with her nephew Phillip Cassanova about her deployment during the 9/11 attacks and shares her feelings about the aftermath of the tragedy while reflecting on the importance of loved ones. Phillip describes being 10 years old when the attack occurred and finding out in his 5th grade classroom. Read the full transcript here.


Michael Doyle

Michael Doyle, along with StoryCorps facilitator Virginia Lora, recounts finding out that the attacks had occurred while he was riding the Q train over lower Manhattan. Read the full transcript here.


Diane Davis and Leo McKenna

Spouses Diane Davis and Leo McKenna discuss their memories of 9/11, when 7,000 plane passengers were forced to land in the town of Gander, Newfoundland, Canada following the attacks in New York City. Diane, a third grade teacher at the time, remembers preparing the schools to house the passengers. Leo recalls the commotion that occurred due to the sudden landing of the passengers. Read the full transcript here.


Seth and Lois Gilman

Seth Gilman, who was a rescue worker during 9/11, speaks with his mother Lois Gilman about assisting the New York City police and witnessing the loss of many lives on that day. He describes his journey to becoming a teacher, and the unity that he saw during a difficult moment in history. Read the full transcript here.


Nadine Newlight

Nadine (Nai’a) Newlight tells StoryCorps facilitator Eloise Melzer about how close she came to being at the World Trade Center on 9/11. She describes her love for the World Trade Center and her experience as a tour guide there. Read the full transcript here.


Brian Muldowney and B. Kelly Hallman

Colleagues and close friends Brian Muldowney and B. Kelly Hallman discuss the loss of Muldowney’s brother, Richard Muldowney Jr., a fellow firefighter who passed away saving people on 9/11. Brian describes going down to the World Trade Center with his brother’s firehouse to help and discusses how his brother’s legacy affects his work. Read the full transcript here.


Michael Fabiano

Michael Fabiano, a Deputy Controller for the NY/NJ Port Authority, speaks with Sarah Geis about his experience on the 69th floor of Tower 1 when the first plane hit. He describes his escape from the building and his efforts to help bring to safety a colleague, John Ambrosio, who used a wheelchair.
Read the full transcript here.


Kris Gould and Scott Accord

On the morning of 9/11, as she watched the planes crash, Kris Gould tried to get in contact with a friend who worked on the 99th floor of Tower 1. She and her colleague Scott Accord talk about the mood that settled over the city the day after the attacks. Read the full transcript here.


Share your story. StoryCorps Connect makes it possible to interview a loved one remotely and then upload it to the StoryCorps Archive at the Library of Congress. Learn more at StoryCorpsConnect.org.

More Americans say 9/11 changed U.S. for worse than better, Post-ABC poll finds

20 years after 9/11 attacks, just half call US more secure: POLL - ABC News

By Scott Clement, Polling Director, The Washington Post

Americans increasingly say the events of Sept. 11, 2001, had a more negative than positive impact on the country, and predictions for the pandemic’s long-term impact are even more downbeat, according to a Washington Post-ABC News poll.

Ahead of the 20th anniversary of the attacks on the World Trade Center and Pentagon on Saturday, more than 8 in 10 Americans say those events changed the country in a lasting way. Nearly half (46 percent) say the events of 9/11 changed the country for the worse, while 33 percent say they changed the country for the better.

That represents a shift from 10 years ago when Americans were roughly divided on this question, and it marks an even larger swing from the first anniversary of the attacks in 2002. Back then, 55 percent said the country had changed for the better.

Americans’ perceptions of safety from terrorist attacks are also at a low ebb, with 49 percent saying the country is safer from terrorism today than before 9/11 — one percentage point from the record low of 48 percent reached in 2010, and down from 64 percent in September 2011, four months after Osama bin Laden was killed.

Partisans differ on whether the U.S. is safer from terrorism today than before 9/11, with 57 percent of Democrats saying it is, while 54 percent of Republicans say it’s less safe. Independents are closer to Democrats on this question, saying by 52 percent to 38 percent that the U.S. is safer rather than less safe.

The Post-ABC poll also finds about 8 in 10 Americans think the coronavirus pandemic will change the country in a lasting way, with more than twice as many expecting it to change for the worse rather than better, 50 percent vs. 21 percent.

Pessimism about the pandemic’s impact is widespread, though it peaks on the ideological right. Conservatives are among the most concerned groups, with 62 percent saying they expect the pandemic to change the country for the worse, compared with 47 percent of moderates and 44 percent of liberals. Republicans also are more likely than Democrats to predict the pandemic will make the country worse (56 percent vs. 42 percent).

Opposition to mask requirements and other pandemic restrictions likely explains part of the negative outlook among conservatives and Republicans. Among the roughly 6 in 10 Republicans and Republican-leaning independents who oppose school mask mandates, 69 percent believe the pandemic will change the country for the worse, compared with 48 percent of Republicans who support mask requirements.

Read full Post-ABC poll results and how the survey was conducted

The ideological split is reversed in views of the 9/11 attacks, with nearly 6 in 10 liberals saying the events changed the country for the worse (59 percent), compared with 44 percent of moderates and 45 percent of conservatives. Liberals have grown much more negative on this question since 2011, when 42 percent said the attacks changed the country for the worse.

The latest Post-ABC poll overlapped the U.S. military withdrawal from Afghanistan along with the evacuation of more than 120,000 Americans and allies in just over two weeks. In results released Friday, 36 percent of Americans said the Afghan war was worth fighting, while 54 percent say it was not. Those who say the war was not worth fighting are also more likely to say Sept. 11, 2001, changed the U.S. for the worse (53 percent), compared with those who say it was worth fighting (37 percent).

The Post-ABC poll was conducted by telephone Aug. 29-Sept. 1 among a random national sample of 1,006 adults, with 75 percent reached on cellphone and 25 percent on landline. Overall results have a margin of error of plus or minus 3.5 percentage points.
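As a rough sanity check on that figure: for a simple random sample of n respondents, the 95 percent margin of error on a reported proportion p is z·sqrt(p(1−p)/n), with z ≈ 1.96. A minimal sketch under that textbook formula (the function name is illustrative, not from the poll’s methodology; published polls typically report a slightly larger margin than this to account for weighting, a so-called design effect):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample.

    p = 0.5 is the worst case, since it maximizes p * (1 - p).
    """
    return z * math.sqrt(p * (1 - p) / n)

# n = 1,006 adults, as in the Post-ABC poll above
moe = margin_of_error(1006)
print(f"+/- {moe * 100:.1f} percentage points")  # about 3.1 before any design effect
```

The gap between this textbook 3.1 points and the reported 3.5 points is consistent with the survey’s weighting adjustments.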


Scott Clement is the polling director for The Washington Post, conducting national and local polls about politics, elections and social issues. He began his career with the ABC News Polling Unit and came to The Post in 2011 after conducting surveys with the Pew Research Center’s Religion and Public Life Project.

READ MORE

https://www.usatoday.com/story/news/nation/2021/09/02/9-11-terrorist-attacks-american-lives-changed-suffolk-poll/5641993001/

Honor. Serve. Unite.

Youth Service America | Summer of Service and 9/11 Day of Service  (September 11, 2021)

The September 11 National Day of Service and Remembrance (9/11 Day) is a chance to help others in tribute to those killed and injured on September 11, 2001, first responders, and the countless others who serve to defend the nation’s freedom at home and around the globe.

September 11, 2021, is the 20th anniversary of that tragic day. Join AmeriCorps on 9/11 Day: step forward to serve in a remarkable spirit of unity, honor, and compassion.

Remember, even a small act of service is a giant act of patriotism. As President Joe Biden put it, “Service is a fitting way to start to heal, unite, and rebuild this country we love.”

Find a volunteer opportunity

Use the AmeriCorps Volunteer Search, powered by VolunteerMatch, to find an opportunity near you through one of these organizations: AmeriCorps, Idealist, California Volunteers, MENTOR, Volunteer.gov (National Park Service), and VolunteerMatch. For volunteer opportunities for September 11, please enter #911 in the search box.

SEARCH NOW

Register a volunteer opportunity

Publicize your volunteer opportunities for the September 11 National Day of Service and Remembrance by adding them to AmeriCorps Volunteer Search.

  1. To be discoverable in search results for 9/11 Day, you must use #911 in the title of the project.
  2. Create a free project listing with one of these organizations: Idealist, JustServe, or VolunteerMatch
  3. That’s it, anyone searching for 9/11 Day volunteer opportunities on our site will see your post. 

If your organization hosts volunteer opportunities and would like to promote them, please contact our Partnerships team.

Commit to serving

Develop a volunteer opportunity that addresses a critical need in your community. Get started

Find in-person or virtual events to join. Find an opportunity

Share your service story and photos, and use #911Day on social media. Access our social press kit

Post the 9/11 Day of Service logo in your channels. Download logos

9/11 Day grantees

For 2021, all grants have been awarded. Learn about funding opportunities with AmeriCorps.
FUNDING OPPORTUNITIES

Resources to help

Check out resources we’ve created and compiled to help you promote your call to service.
FIND RESOURCES

9/11 was a test. The books of the last two decades show how America failed.

Essay by Carlos Lozada. Illustrations by Patrik Svensson. Originally published Sept. 3, 2021.

Deep within the catalogue of regrets that is the 9/11 Commission report — long after readers learn of the origins and objectives of al-Qaeda, past the warnings ignored by consecutive administrations, through the litany of institutional failures that allowed terrorists to hijack four commercial airliners — the authors pause to make a rousing case for the power of the nation’s character.

“The U.S. government must define what the message is, what it stands for,” the report asserts. “We should offer an example of moral leadership in the world, committed to treat people humanely, abide by the rule of law, and be generous and caring to our neighbors. . . . We need to defend our ideals abroad vigorously. America does stand up for its values.”

This affirmation of American idealism is one of the document’s more opinionated moments. Looking back, it’s also among the most ignored.

Rather than exemplify the nation’s highest values, the official response to 9/11 unleashed some of its worst qualities: deception, brutality, arrogance, ignorance, delusion, overreach and carelessness. This conclusion is laid bare in the sprawling literature to emerge from 9/11 over the past two decades — the works of investigation, memoir and narrative by journalists and former officials that have charted the path to that day, revealed the heroism and confusion of the early response, chronicled the battles in and about Afghanistan and Iraq, and uncovered the excesses of the war on terror. Reading or rereading a collection of such books today is like watching an old movie that feels more anguishing and frustrating than you remember. The anguish comes from knowing how the tale will unfold; the frustration from realizing that this was hardly the only possible outcome.

Whatever individual stories the 9/11 books tell, too many describe the repudiation of U.S. values, not by extremist outsiders but by our own hand. The betrayal of America’s professed principles was the friendly fire of the war on terror. In these works, indifference to the growing terrorist threat gives way to bloodlust and vengeance after the attacks. Official dissembling justifies wars, then prolongs them. In the name of counterterrorism, security is politicized, savagery legalized and patriotism weaponized.

It was an emergency, yes, that’s understood. But that state of exception became our new American exceptionalism.

It happened fast. By 2004, when the 9/11 Commission urged America to “engage the struggle of ideas,” it was already too late; the Justice Department’s initial torture memos were already signed, the Abu Ghraib images had already eviscerated U.S. claims to moral authority. And it has lasted long. The latest works on the legacy of 9/11 show how war-on-terror tactics were turned on religious groups, immigrants and protesters in the United States. The war on terror came home, and it walked in like it owned the place.

Ghost Wars: The Secret History of the CIA, Afghanistan, and Bin Laden, from the Soviet Invasion to September 10, 2001, by Steve Coll

“It is for now far easier for a researcher to explain how and why September 11 happened than it is to explain the aftermath,” Steve Coll writes in “Ghost Wars,” his 2004 account of the CIA’s pre-9/11 involvement in Afghanistan. Throughout that aftermath, Washington fantasized about remaking the world in its image, only to reveal an ugly image of itself to the world.

The literature of 9/11 also considers Osama bin Laden’s varied aspirations for the attacks and his shifting visions of that aftermath. He originally imagined America as weak and easily panicked, retreating from the world — in particular from the Middle East — as soon as its troops began dying. But bin Laden also came to grasp, perhaps self-servingly, the benefits of luring Washington into imperial overreach, of “bleeding America to the point of bankruptcy,” as he put it in 2004, through endless military expansionism, thus beating back its global sway and undermining its internal unity. “We anticipate a black future for America,” bin Laden told ABC News more than three years before the 9/11 attacks. “Instead of remaining United States, it shall end up separated states and shall have to carry the bodies of its sons back to America.”

Bin Laden did not win the war of ideas. But neither did we. To an unnerving degree, the United States moved toward the enemy’s fantasies of what it might become — a nation divided in its sense of itself, exposed in its moral and political compromises, conflicted over wars it did not want but would not end. When President George W. Bush addressed the nation from the Oval Office on the evening of Sept. 11, 2001, he asserted that America was attacked because it is “the brightest beacon for freedom and opportunity in the world, and no one will keep that light from shining.” Bush was correct; al-Qaeda could not dim the promise of America. Only we could do that to ourselves.

I.


“The most frightening aspect of this new threat . . . was the fact that almost no one took it seriously. It was too bizarre, too primitive and exotic.” That is how Lawrence Wright depicts the early impressions of bin Laden and his terrorist network among U.S. officials in “The Looming Tower: Al-Qaeda and the Road to 9/11.” For a country still basking in its post-Cold War glow, it all seemed so far away, even as al-Qaeda’s strikes — on the World Trade Center in 1993, on U.S. Embassies in 1998, on the USS Cole in 2000 — grew bolder. This was American complacency, mixed with denial.

The books traveling that road to 9/11 have an inexorable, almost suffocating feel to them, as though every turn invariably leads to the first crush of steel and glass. Their starting points vary. Wright dwells on the influence of Egyptian thinker Sayyid Qutb, whose mid-20th-century sojourn in the United States animated his vision of a clash between Islam and modernity, and whose work would inspire future jihadists. In “Ghost Wars,” Coll laments America’s abandonment of Afghanistan once it ceased serving as a proxy battlefield against Moscow.


In “The Rise and Fall of Osama bin Laden,” Peter Bergen stresses the moment bin Laden arrived in Afghanistan from Sudan in 1996, when Khalid Sheikh Mohammed first pitched him on the planes plot. And the 9/11 Commission lingers on bin Laden’s declarations of war against the United States, particularly his 1998 fatwa calling it “the individual duty for every Muslim” to murder Americans “in any country in which it is possible.”

Yet these early works also make clear that the road to 9/11 featured plenty of billboards warning of the likely destination. A Presidential Daily Brief item on Aug. 6, 2001, titled “Bin Ladin Determined to Strike in US” became infamous in 9/11 lore, yet the commission report notes that it was the 36th PDB relating to bin Laden or al-Qaeda that year alone. (“All right. You’ve covered your ass now,” Bush reportedly sneered at the briefer.) Both the FBI and the CIA produced classified warnings on terrorist threats in the mid-1990s, Coll writes, including a particularly precise National Intelligence Estimate. “Several targets are especially at risk: national symbols such as the White House and the Capitol, and symbols of U.S. capitalism such as Wall Street,” it stated. “We assess that civil aviation will figure prominently among possible terrorist targets in the United States.” Some of the admonitions scattered throughout the 9/11 literature are too over-the-top even for a movie script: There’s the exasperated State Department official complaining about Defense Department inaction (“Does al Qaeda have to attack the Pentagon to get their attention?”), and the earnest FBI supervisor in Minneapolis warning a skeptical agent in Washington about suspected terrorism activity, insisting that he was “trying to keep someone from taking a plane and crashing it into the World Trade Center.”

Against All Enemies, by Richard A. Clarke

In these books, everyone is warning everyone else. Bergen emphasizes that a young intelligence analyst in the State Department, Gina Bennett, wrote the first classified memo warning about bin Laden in 1993. Pockets within the FBI and the CIA obsess over bin Laden while regarding one another as rivals. On his way out, President Bill Clinton warns Bush. Outgoing national security adviser Sandy Berger warns his successor, Condoleezza Rice. And White House counterterrorism coordinator Richard Clarke, as he reminds incessantly in his 2004 memoir, “Against All Enemies,” warns anyone who will listen and many who will not.

With the system “blinking red,” as CIA Director George Tenet later told the 9/11 Commission, why were all these warnings not enough? Wright lingers on bureaucratic failings, emphasizing that intelligence collection on al-Qaeda was hampered by the “institutional warfare” between the CIA and the FBI, two agencies that by all accounts were not on speaking terms. Coll writes that Clinton regarded bin Laden as “an isolated fanatic, flailing dangerously but quixotically against the forces of global progress,” whereas the Bush team was fixated on great-power politics, missile defense and China.

Clarke’s conclusion is simple, and it highlights America’s we-know-better swagger, a national trait that often masquerades as courage or wisdom. “America, alas, seems only to respond well to disasters, to be undistracted by warnings,” he writes. “Our country seems unable to do all that must be done until there has been some awful calamity.”

The problem with responding only to calamity is that underestimation is usually replaced by overreaction. And we tell ourselves it is the right thing, maybe the only thing, to do.


VI.


In the 11th chapter of the 9/11 Commission report, just before all the recommendations for reforms in domestic and foreign policy, the authors get philosophical, pondering how hindsight had affected their views of Sept. 11, 2001. “As time passes, more documents become available, and the bare facts of what happened become still clearer,” the report states. “Yet the picture of how those things happened becomes harder to reimagine, as that past world, with its preoccupations and uncertainty, recedes.” Before making definitive judgments, then, they ask themselves “whether the insights that seem apparent now would really have been meaningful at the time.”


It’s a commendable attitude, one that helps readers understand what the attacks felt like in real time and why authorities responded as they did. But that approach also keeps the day trapped in the past, safely distant. Two of the latest additions to the canon, “Reign of Terror” by Spencer Ackerman and “Subtle Tools” by Karen Greenberg, draw straight, stark lines between the earliest days of the war on terror and its mutations in our current time, between conflicts abroad and divisions at home. These works show how 9/11 remains with us, and how we are still living in the ruins.



When Trump declared that “we don’t have victories anymore” in his 2015 speech announcing his presidential candidacy, he was both belittling the legacy of 9/11 and harnessing it to his ends. “His great insight was that the jingoistic politics of the War on Terror did not have to be tied to the War on Terror itself,” Ackerman writes. “That enabled him to tell a tale of lost greatness.” And if greatness is lost, someone must have taken it. The backlash against Muslims, against immigrants crossing the southern border and against protesters rallying for racial justice was strengthened by the open-ended nature of the global war on terror. In Ackerman’s vivid telling — his prose can be hyperbolic, even if his arguments are not — the war is not just far away in Iraq or Afghanistan, in Yemen or Syria, but it’s happening here, with mass surveillance, militarized law enforcement and the rebranding of immigration as a threat to the nation’s security rather than a cornerstone of its identity. “Trump had learned the foremost lesson of 9/11,” Ackerman writes, “that the terrorists were whomever you said they were.”

Both Ackerman and Greenberg point to the Authorization for Use of Military Force, drafted by administration lawyers and approved by Congress just days after the attacks, as the moment when America’s response began to go awry. The brief joint resolution allowed the president to use “all necessary and appropriate force” against any nation, organization or person who committed the attacks, and to prevent any future ones. It was the “Ur document in the war on terror and its legacy,” Greenberg writes. “Riddled with imprecision, its terminology was geared to codify expansive powers.” Where the battlefield, the enemy and the definition of victory all remain vague, war becomes endlessly expansive, “with neither temporal nor geographical boundaries.”

This was the moment the war on terror was “conceptually doomed,” Ackerman concludes. This is how you get a forever war.

There were moments when an off-ramp was visible. The killing of bin Laden in 2011 was one such instance, Ackerman argues, but “Obama squandered the best chance anyone could ever have to end the 9/11 era.” The author assails Obama for making the war on terror more “sustainable” through a veneer of legality — banning torture yet failing to close the detention camp at Guantánamo Bay and relying on drone strikes that “perversely incentivized the military and the CIA to kill instead of capture.” There would always be more targets, more battlefields, regardless of president or party. Failures became the reason to double down, never wind down.

The longer the war went on, the more that what Ackerman calls its “grotesque subtext” of nativism and racism would move to the foreground of American politics. Absent the war on terror, it is harder to imagine a presidential candidate decrying a sitting commander in chief as foreign, Muslim, illegitimate — and using that lie as a successful political platform. Absent the war on terror, it is harder to imagine a travel ban against people from Muslim-majority countries. Absent the war on terror, it is harder to imagine American protesters labeled terrorists, or a secretary of defense describing the nation’s urban streets as a “battle space” to be dominated. Trump was a disruptive force in American life, but there was much continuity there, too. “A vastly different America has taken root” in the two decades since 9/11, Greenberg writes. “In the name of retaliation, ‘justice,’ and prevention, fundamental values have been cast aside.”


In his latest book on bin Laden, Bergen argues that 9/11 was a major tactical success but a long-term strategic failure for the terrorist leader. Yes, he struck a vicious blow against “the head of the snake,” as he called the United States, but “rather than ending American influence in the Muslim world, the 9/11 attacks greatly amplified it,” with two lengthy, large-scale invasions and new bases established throughout the region.

Yet the legacy of the 9/11 era is found not just in Afghanistan or Iraq, but also in an America that drew out and heightened some of its ugliest impulses — a nation that is deeply divided (like those “separated states” bin Laden imagined); that bypasses inconvenient facts and embraces conspiracy theories; that demonizes outsiders; and that, after failing to spread freedom and democracy around the world, seems less inclined to uphold them here. More Americans today are concerned about domestic extremism than foreign terrorism, and on Jan. 6, 2021, our own citizens assaulted the Capitol building that al-Qaeda hoped to strike on Sept. 11, 2001. Seventeen years after the 9/11 Commission called on the United States to offer moral leadership to the world and to be generous and caring to our neighbors, our moral leadership is in question, and we can barely be generous and caring to ourselves.


In “The Forever War,” Dexter Filkins describes a nation in which “something had broken fundamentally after so many years of war . . . there had been some kind of primal dislocation between cause and effect, a numbness wholly understandable, necessary even, given the pain.” He was writing of Afghanistan, but his words could double as an interpretation of the United States over the past two decades. Still reeling from an attack that dropped out of a blue sky, America is suffering from a sort of post-traumatic stress democracy. It remains in recovery, still a good country, even if a broken good country.

Other Articles

https://www.alreporter.com/2021/09/07/opinion-9-11-and-a-now-divided-nation/

Why Poetry Is So Crucial Right Now

By Tish Harrison Warren

Opinion writer, The New York Times, Aug. 29, 2021

This summer, on a lark, I took a course on poetry geared toward Christian leaders. Twelve of us met over Zoom to read poems and discuss the intersection of our faith, vocations and poetry.

We compared George Herbert’s “Prayer” to Christian Wiman’s “Prayer.” We discussed Langston Hughes’s “Island,” Countee Cullen’s “Yet Do I Marvel” and Scott Cairns’s “Musée” to examine suffering and the problem of evil. We read about Philip Larkin’s fear of death and what he sees as the failures of religious belief in his poem “Aubade.” It was my favorite part of the summer.

In our first class, we took turns sharing what drew us to spend time with poetry. I clumsily tried to explain my longing for verse: I hunger for a transcendent reality — the good, the true, the beautiful, those things which somehow lie beyond mere argument. Yet often, as a writer, a pastor and simply a person online, I find that my life is dominated by debate, controversy and near strangers in shouting matches about politics or church doctrine. This past year in particular was marked by vitriol and divisiveness. I am exhausted by the rancor.

In this weary and vulnerable place, poetry whispers of truths that cannot be confined to mere rationality or experience. In a seemingly wrecked world, I’m drawn to Rainer Maria Rilke’s “Autumn” and recall that “there is One who holds this falling/Infinitely softly in His hands.” When the scriptures feel stale, James Weldon Johnson preaches through “The Prodigal Son” and I hear the old parable anew. On tired Sundays, I collapse into Wendell Berry’s Sabbath poems and find rest.

I’m not alone in my interest in this ancient art form. Poetry seems to be making a comeback. According to a 2018 survey by the National Endowment for the Arts, the number of adults who read poetry nearly doubled in five years, marking the highest number on record for the last 15 years. The poet Amanda Gorman stole the show at this year’s presidential inauguration, and her collection “The Hill We Climb” topped Amazon’s best-seller list.

There is not a simple or singular reason for this resurgence. But I think a particular gift of poetry for our moment is that good poems reclaim the power and grace of words.

Words seem ubiquitous now. We carry a world of words with us every moment in our smartphones. We interact with our family and friends through the written word in emails, texts and Facebook posts. But with our newfound ability to broadcast any words we want, at any moment, we can cheapen them.

“Like any other life-sustaining resource,” Marilyn Chandler McEntyre writes in her book “Caring for Words in a Culture of Lies,” “language can be depleted, polluted, contaminated, eroded and filled with artificial stimulants.” She argues that language needs to be rescued and restored, and points us to the practice of reading and writing poetry as one way of doing so. Poems, she says, “train and exercise the imagination” to “wage peace” because “the love of beauty is deeply related to the love of peace.”

Indeed, in our age of social media, words are often used as weapons. Poetry instead treats words with care. They are slowly fashioned into lanterns — things that can illuminate and guide. Debate certainly matters. Arguments matter. But when the urgent controversies of the day seem like all there is to say about life and death or love or God, poetry reminds me of those mysterious truths that can’t be reduced solely to linear thought.

Poetry itself can engage in smart debate, of course. Yet even didactic poetry — poetry that makes an argument — does so in a more creative, meticulous and compelling way than we usually see in our heated public discourse.

Another reason that I think we are drawn to poetry: Poems slow us down. My summer poetry class teacher, Abram Van Engen, an English professor at Washington University in St. Louis, reminded me that poetry is the “art of paying attention.” In an age when our attention is commodified, when corporations make money from capturing our gaze and holding it for as long as possible, many of us feel overwhelmed by the notifications, busyness and loudness of our lives. Poetry calls us back to notice and attend to the embodied world around us and to our internal lives.

In this way, poetry is like prayer, a comparison many have made. Both poetry and prayer remind us that there is more to say about reality than can be said in words though, in both, we use words to try to glimpse what is beyond words. And they both make space to name our deepest longings, lamentations, and loves. Perhaps this is why the poetry of the Psalms became the first prayer book of the church.

I am trying to take up more poetry reading in my daily life. Reading new poems can be intimidating, but I figure that the only way to get poetry really wrong is to avoid it altogether. It helps that poetry is often short and quick to read so I fit it into the corners of my day — a few minutes in bed at night or in the lull of a Saturday afternoon.

During the past school year, with my kids home because of Covid precautions, we would pile books of poetry on our table once a week (Shel Silverstein, Shakespeare, Nikki Grimes, Emily Dickinson), eat cookies, and read poetry aloud. I now try to always keep some books of verse around.

In one of my very favorite poems, “Pied Beauty,” Gerard Manley Hopkins writes of a beauty that is “past change.” In this world where our political, technological and societal landscape shifts at breakneck speed, many of us still quietly yearn for a beauty beyond change. Poetry stands then as a kind of collective cry beckoning us beyond that which even our best words can say.


Tish Harrison Warren (@Tish_H_Warren) is a priest in the Anglican Church in North America and author of “Prayer in the Night: For Those Who Work or Watch or Weep.”

Weary of turmoil and division, most teens still voice faith in future, Post-Ipsos poll finds


By Sydney Trent and Emily Guskin, The Washington Post, Aug. 25, 2021

Sophia Grigsby watched with horrified amazement as insurrectionists stormed the Capitol on Jan. 6, defiling the halls of power in a violent attempt to prevent Joe Biden from becoming president.

The 16-year-old from rural Minnesota wondered, fleetingly, if she had been naive in believing that the protests last summer following the murder of George Floyd had truly marked a turning point. Yet even as the televised spectacle confirmed her belief in the rising dangers of white supremacy — some of the rioters were carrying Confederate flags — Grigsby’s optimism won out.

“Even with the murder of George Floyd, I’m finding people have become so much more aware,” said Grigsby, who starts her junior year of high school in St. Peter, Minn., this month. “While our country is really divided, I think that part of that division is because of that newfound awareness.”

Despite some difficulties as a mixed-race student, including once filing a legal complaint against her school district after it failed to stop classmates from hurling racial slurs at her, Grigsby is also optimistic about her own life. She sees herself graduating from college, meeting her husband in medical school and raising two children — “a boy and a girl, twins,” she hopes — all the while most likely becoming rich.

Grigsby’s largely upbeat attitude about the future, combined with a world-weary realism that seems mature beyond her years, is echoed in the findings of a national Washington Post-Ipsos poll of teens ages 14 to 18.


While still hopeful about what lies ahead, many teens do not view the current moment so favorably. Fifty-one percent say that now is a bad time to be growing up, compared with 31 percent who answered that way 16 years ago, in a poll of teens conducted by The Post, the Kaiser Family Foundation and Harvard University. Their parents are even more negative, with more than 6 in 10 saying it’s a bad time for teenagers to be growing up.

These young Americans, who are coming of age amid a once-in-a-lifetime pandemic, political and social unrest, growing economic inequality and rising crime, are keenly aware of the country’s problems. Majorities view political divisions, racial discrimination, the cost of health care and gun violence as “major threats” to their generation, according to the new Post-Ipsos poll. Nearly half also rank climate change as a major threat.

Some are already trying to make a difference. Heily DeJesus, who lives in Lebanon, Pa., said she dashed from her brother’s high school graduation to a Black Lives Matter protest, where they all took a knee for a selfie as her brother raised his fist in the air.

“It felt great to know that we’re a part of making a change for the world,” she remembered. “Even if it’s a small town, we’re still making a change.”

The survey of 1,349 teens was conducted online in May and June primarily through Ipsos’s randomly recruited panel of U.S. households. Overall results have a margin of error of plus or minus three percentage points, and the relatively large sample allows comparison of White, Black, Hispanic and Asian teens.

These young people are part of what is likely the most diverse cohort in the nation’s history. New Census Bureau data shows that the country’s under-18 population is majority-minority for the first time, with White children making up 47.3 percent of that age group compared with 53.5 percent in 2010. Their childhoods have been marked by racial justice protests and a growing societal acceptance of LGBTQ people. Most also perceive significant discrimination against a wide range of groups in American society. Black and transgender people topped the list, with about 6 in 10 teens saying Black people are treated unfairly very or somewhat often and an almost equal share saying the same thing about transgender people.
