Symptoms of GroupThink
Symptoms of GroupThink usually cause disruption to the problem-solving process.

Janis lists eight specific symptoms of a group undergoing GroupThink:

1. Illusion of Invulnerability
Experts in a group will feel that nothing can go wrong with a decision they have made, because they "are the best" and therefore the decision cannot be wrong.

2. Belief in Inherent Morality of the Group
Experts in a group will believe that the decisions they are making will be morally correct.

3. Collective Rationalization
Members discount warnings that their thinking may be irrational.

4. Out-group Stereotypes
The members share negative, dismissive stereotypes of rivals and outsiders, viewing anyone outside the group as too weak or too foolish to pose a real challenge.

5. Self-Censorship
When a specific expert is asked what to do about a problem, the expert will not give an answer directly contradicting the group's wishes.

6. Illusion of Unanimity
Even if some individuals in the group privately disagree with the decision made, they will not speak out. Their silence is then perceived as consent.

7. Direct Pressure on Dissenters
If an individual does speak out against the will of the majority of the group, all other members will attempt to quell the "dissident".

8. Self-Appointed Mindguards
Some members appoint themselves as "mindguards," shielding the leader and the rest of the group from any information or argument that contradicts the group's consensus.


Challenger Explosion

The Space Shuttle Challenger exploding in mid-air during its launch in 1986.

1. Illusion of Invulnerability. Despite the launch pad fire that killed three astronauts in 1967 and the close call of Apollo 13, the American space program had never experienced an in-flight fatality. When engineers raised the possibility of catastrophic O-ring blow-by, NASA manager George Hardy nonchalantly pointed out that this risk was "true of every other flight we have had." Janis summarizes this attitude as "everything is going to work out all right because we are a special group."
2. Belief in Inherent Morality of the Group. Under the sway of groupthink, members automatically assume the rightness of their cause. At the hearing, engineer Brian Russell noted that NASA managers had shifted the moral rules under which they operated: "I had the distinct feeling that we were in the position of having to prove that it was unsafe instead of the other way around."
3. Collective Rationalization. Despite the written policy that the O-ring seal was a critical failure point without backup, NASA manager George Hardy testified that "we were counting on the secondary O-ring to be the sealing O-ring under the worst case conditions." Apparently this was a shared misconception. NASA manager Lawrence Mulloy confirmed that "no one in the meeting questioned the fact that the secondary seal was capable and in position to seal during the early part of the ignition transient." This collective rationalization supported a mindset of "hear no evil, see no evil, speak no evil."
4. Out-group Stereotypes. Although there is no direct evidence that NASA officials looked down on Thiokol engineers, Mulloy was caustic about their recommendation to postpone the launch until the temperature rose to 53 degrees. He reportedly asked whether they expected NASA to wait until April to launch the shuttle.
5. Self-Censorship. We now know that Thiokol engineer George McDonald wanted to postpone the flight. But instead of clearly stating "I recommend we don't launch below 53 degrees," he offered an equivocal opinion. He suggested that "lower temperatures are in the direction of badness for both O-rings...." What did he think they should do? From his tempered words, it's hard to tell.
6. Illusion of Unanimity. NASA managers perpetuated the fiction that everyone was fully in accord on the launch recommendation. They admitted to the presidential commission that they didn't report Thiokol's on-again/off-again hesitancy to their superiors. As often happens in such cases, the flight readiness review team interpreted silence as agreement.
7. Direct Pressure on Dissenters. Thiokol engineers felt pressure from two directions to reverse their "no-go" recommendation. NASA managers had already postponed the launch three times and were fearful the American public would regard the agency as inept. Undoubtedly that strain triggered Hardy's retort that he was "appalled" at Thiokol's recommendation. Similarly, the company's management was fearful of losing future NASA contracts. When they went off-line for their caucus, Thiokol's senior vice president urged Robert Lund, vice president of engineering, to "take off his engineering hat and put on his management hat."
8. Self-Appointed Mindguards. NASA managers insulated Jesse Moore from the debate over the integrity of the rocket booster seals. Even though Roger Boisjoly was Thiokol's expert on O-rings, he later bemoaned that he "was not even asked to participate in giving input to the final decision charts."


Operation Market Garden, 1944

A daring plan was formed by top Allied commanders during 1944 to end the European war. The plan itself was brilliant, conceived by the commander of the British forces, Field Marshal Montgomery. Paratroopers were to land in the fields of Holland and seize critical points to secure a passage for the main army to come racing through. In the confusion before the operation, however, an important reconnaissance officer was ignored by the top brass. The officer had very recent photographs of elite tank units in the area of the paratrooper drop zones. The top brass, eager to move ahead, dismissed the photos, and the officer, eager to seem cooperative, put them away in storage. Unfortunately, there WERE elite tank units in the drop zone areas. The operation commenced, the paratroopers were dropped over Holland, and almost immediately the enemy tank units counterattacked, slaughtering the paratroopers. The operation failed in less than ten days.

Had everything been carried out as planned, the operation might have ended the war by Christmas 1944.

Unfortunately, this daring plan, named Operation Market Garden, did not have the expected outcome. Critical information was suppressed by the top generals, causing the operation to fail at the cost of many lives.

Groupthink can be easily spotted here: the high cohesion among the generals and officers, the eagerness to cooperate and win the war, and the silencing of anyone who spoke out against the majority's opinion all contributed to the disaster.

Cults

Another good example of groupthink is cults. Cults draw people in and twist their beliefs to fit the group's own, making them a textbook case of groupthink.

1.    Illusion of Invulnerability. Most people join cults to belong and to feel safe within the group. The cult comes to seem an unstoppable force, and the members believe that no matter what they do, the idea they hold so dear will keep them going. One example is Heaven's Gate, where most of the members killed themselves in a mass suicide in pursuit of their beliefs.
2.    Belief in Inherent Morality. The members of a cult believe that whatever they do is right and just under their belief system. Whatever the cult leaders decide is absolutely correct in every respect. Convinced that their beliefs are the true ones, members regard everything they do as moral and just.
3.    Collective Rationalization. The entire cult follows exactly what its leader says, reinforcing the belief that the leaders are always right. Anything that would normally be perceived as wrong is swept under the rug.
4.    Out-Group Stereotypes. Cult members refuse to listen to outsiders, however valid their opinions. The word of the leaders is law, and anyone who says otherwise is shut out.
5.    Self-Censorship. Even members who hold an opinion or suggestion contrary to the leader's will not speak out, for fear of being cast out, forced to leave, or worse, killed. This fear keeps them silent and furthers the groupthink.
6.    Illusion of Unanimity. There are certainly always members who do not agree with the leaders' decisions 100% of the time. For the reasons listed above, however, they refuse to speak out, creating the impression that the entire group always agrees with the leaders and making everything worse.
7.    Direct Pressure on Dissenters. When one of these members eventually does speak out, they are immediately pressured to reconsider. Their ideas and suggestions are not even considered, since the majority of the group sides with the leader.
8.    Self-Appointed Mindguards. Even when somebody does speak out, their ideas never reach the leader. As soon as they start to argue, they are silenced, and the counter-opinion never rises high enough to be considered. Perhaps if it did reach the leader, he would reconsider.
