
Our Deep Desire To Belong.

Are you feeling brave? I hope so, because I'm about to ask you to do something that will be excruciatingly difficult. I want you to look at a photo, and no matter how much it hurts to look at this particular photo, I want you to continue to do so. Are you ready? Okay, take a deep breath, go find your middle school yearbook, and meet me back here.

Chances are, regardless of the page you turn to, you'll find plenty to cringe about. Memories of middle school are typically not what most of us would try to recall if we were hoping to boost our self-esteem. In fact, they're typically not what most of us would care to recall under any circumstances. For many, middle school represents the pinnacle of adolescent insecurity. Middle schoolers toil through this segment of their lives in perpetual pubescent angst and anguish: no longer afforded the leniency of childhood, not yet permitted the freedom of adulthood, stuck in an awkward purgatory on the precipice of maturity.

To look back at our middle school selves is to inevitably wonder "what was I thinking?" Why was I wearing those clothes? Why would I style my hair like that? Unfortunately, the answer is as painfully simple as the pictures are painfully embarrassing: you did so because everyone else was doing it.

Here's the good news: most of us grow out of our middle school insecurity and blossom into beautiful, moderately well-adjusted butterflies. However, there is some bad news: we never really grow out of our compulsion to conform.

The Asch Experiments

Take, for instance, the following experiment. You are ushered into a room where only one other person waits for you: the principal researcher conducting the study. The prompt is simple: the researcher holds up a card, directs you to examine the line on the far left, and asks you to decide which of the three lines on the right is closest to it in length. He gives you a minute to consider your options, but truthfully it doesn't take you more than two or three seconds. The answer is obvious to most people: Line A is clearly the closest in length to the line in question (referred to henceforth, as many researchers have, as Line X).

The researcher repeats the same process with eleven more sets of lines, twelve in all, each as simple as the first. You breeze through them with very little difficulty, and are confident you've finished with 100% accuracy.

Now comes part two.

The researcher leads you to an adjoining room where several other participants are seated, patiently waiting on one side of a long table. Each of them has previously completed the series of judgments you've just finished and is awaiting instructions for the next portion of the study. You take the last available seat at the end of the table and the researcher begins to speak.

"The next and final part of this study will be very familiar to each of you. In part one, we tested your visual perception by asking you to make a series of judgments. In part two, the task will not change. I will first randomize the cards used in part one. Then, I will present the first participant with a randomly selected card, he or she will state aloud which line — A, B, or C — is closest in length to Line X, and then I will move to the next participant and repeat the process with the same card. I will make my way down the line until each participant has had an opportunity to state their judgment for that card, and then we will repeat this process with the remaining eleven cards. Understood?"

The group nods in unison. The researcher approaches the first participant, the one farthest from you. He holds up a card and you notice that it's the first one you were shown during the first portion of the experiment. "Easy, it's A," you think, as your head rolls back with boredom. "This is a waste of time."

Until you hear a voice break the silence. "C!" the first participant exclaims enthusiastically. "What an idiot," you think.

The researcher slowly side-steps to position himself in front of the next participant. "It's C," she states confidently. "Man, are these people blind?!"

The researcher then moves himself in front of the third participant. "Yup, C" he agrees. "Umm, what's happening...?"

The researcher methodically makes his way down the line as participant after participant suggests that it is Line C, and not Line A, that is closest in length to Line X. The group has reached a consensus, and now it is your turn to render your judgment. As the researcher silently approaches you and holds up the card, you can feel the eyes of your fellow participants fix on you, staring, waiting. You know what you see, but doubts have started to creep into your mind after every single participant disagreed with your assessment. So, how do you answer?

Most of us would like to believe we'd disregard the social pressure and give the answer we know to be true. Statistically speaking, however, this is clearly not the case. When making the twelve line judgments privately (i.e., one-on-one with the experimenter), people almost always score 100%. Yet when the same twelve judgments are made in front of a group that unanimously agrees the correct answer is not the one you were previously certain of, approximately one-third of all responses succumb to the pressure of the group and conform to the wrong answer (Asch, 1955). Not only that, but 50% of participants, a full half, conform to the erroneous majority at least six times out of twelve.

That bears repeating: one-third of all responses conform to the group pressure, and half of the participants give the wrong answer, on a task where they knew the correct answer, at least half of the times they are given the opportunity to respond.

Why? Why are people willing to provide an answer they know is wrong? According to Elliot Aronson, author of the phenomenal book The Social Animal, the answer lies in our competing desires: our desire to be correct and our desire to be liked by the group. Although being right is appealing, it is often outweighed by our desire to be included (or, perhaps just as importantly, to not be excluded). In fact, experiments similar to the one described above, which was originally designed by the psychological titan Solomon Asch in the 1950s, have recently been re-conducted, incorporating modern technology that allows us to monitor neural activity as a participant decides to conform to or deviate from the group. It turns out that deviation produces "error signals" in the brain while simultaneously stimulating activity in the regions responsible for pain processing (Wu, Luo, & Feng, 2016). In short, we may very well be wired to conform, and going against the group quite literally hurts.

When The Stakes Are Higher: Milgram

For those hoping to run an organization where individuals confidently speak up for what they believe is right, whether regarding policy, strategy, or the like, the findings above are cause for concern. Asch's participants were willing to knowingly ignore their better judgment and make an incorrect statement simply to win the approval of a group of strangers, people they had never even met, yet the compulsion to belong was so strong that they cared how they'd be perceived if their verdict differed from that of their anonymous peers. If people are willing to act this way while making trivial judgments in front of strangers, what about when they're making critical assessments of actual issues in front of colleagues: people whom they work with every day, whose opinions of them they truly care about, and by whom they really want to remain liked?

"But the results from the Asch experiments mean very little," some protest, "because the judgments were meaningless. It was just a bunch of stupid lines. If the consequences were greater, people would be less likely to buckle under social pressure."

Enter Stanley Milgram. Unimpressed by the stakes of the experiments conducted by Asch (though intrigued by the findings), Milgram set out to test a phenomenon similar to conformity: obedience. Asch had tested whether people would crack under the implicit social pressure of a group; Milgram sought to test whether they would do the same under the explicit pressure of an authority figure.

If you've taken an introductory psychology class, you're familiar with the Milgram experiments, so I'll do my best to describe them for those who are unfamiliar without dwelling on meticulous details for those who are. Essentially, Dr. Milgram would have volunteers come in for what he referred to as a study on learning. When they arrived, they were introduced to another "participant" (actually a confederate working with Dr. Milgram). The two were then told that they would be randomly assigned to one of two roles: teacher or student. (Of course, the ostensibly random assignment was rigged so the real participant would always occupy the role of the teacher.) After the roles had been assigned, both teacher and student were led to a back room where they were shown an apparatus that delivered electric shocks (similar to an electric chair). The teacher was given an opportunity to experience a sample shock to see that the apparatus was indeed functional.

The experiment was simple: the student would be strapped into the apparatus, given a learning task to memorize, and left in the back room. From the front room, the teacher would sit with the experimenter and "test" the student by reading questions over an intercom (the teacher and student could not see one another), and each time the student answered incorrectly, the teacher was to remotely administer a shock. After each incorrect response, the shocks would increase in intensity, from just a few volts to over four hundred (accompanied by ominous labels such as "XXX" written above the levers).

The experiment begins benignly enough, with the student producing several correct responses with a few incorrect ones peppered in. Following each incorrect response, as instructed, the participant (assigned to the role of teacher) would pull a lever to deliver the shock. At low voltages, the student would utter a non-threatening yet audible "ouch!" or say something like "that kind of hurt." However, as the voltages increased, so did the protests. And if the participant tried to conscientiously object or refused to continue delivering shocks, the experimenter, looming over them in a lab coat, would insist they press on despite the agonizing pleas coming from the other room.

And then, silence. After minutes of desperate begging, at around 300 volts, the cries would stop. The experimenter would instruct the teacher that silence must be classified as an incorrect answer, and thus the shocks must continue. But who would continue to administer what they believed to be 300- and 400-volt shocks to an unresponsive person? What type of animal could so callously ignore a human being screaming and begging for mercy just because a person in a lab coat was verbally prodding them to continue? Certainly not you, right?

Well, herein lies the problem: nobody ever believes they'd be the monster who continues to deliver shocks to a helpless stranger. Everybody thinks themselves the hero who would stand up to the big, bad authority figure and refuse to continue. In fact, even the psychiatrists Milgram polled prior to conducting the experiment believed that less than 1% of participants would actually continue the unconscionable act of administering shocks until the highest voltage was reached (long after the student had ceased begging and gone silent).

In reality, however, 62% of participants did just that.

The Organizational Problem

Social pressure is a formidable opponent. Whether it's pressure to conform to a group or pressure to obey an authority figure, in spite of our insistence that we'd be invulnerable to such forces, studies clearly show that we consistently and egregiously overestimate our ability to resist.

This presents a massive problem for organizations: how do they function effectively as teams if people have this compulsion to conform and obey? How can they avoid the inevitable progression of conformity and obedience to groupthink, a phenomenon wherein novel ideas are suppressed because of a fear of challenging the status quo? Asch and Milgram set the psychological world ablaze with their provocative findings, and the result was a scorched picture of human autonomy and decency. However, once the ash settled (or should I say once the Asch settled, right guys?! Guys?...hello?), scientists began to unearth factors that reduce conformity and obedience and, ultimately, curtail groupthink.

Reducing Conformity: What The Research Shows

1. Break unanimity. It's scary to go against a group when you're in the minority, but it's terrifying when you're the lone voice of opposition. The stakes are just too high. If you're wrong, you're not only incorrect, but you're also viewed as foolish for being unable to figure out what was so obvious to everyone else. But in Asch-like studies, we find that if just one person deviates from the group norm, even if their answer isn't correct, conformity from participants drops sharply. This means that if everyone is saying the answer is Line C, and one person is brave enough to say they believe the correct answer is Line B (even though that is also incorrect), the simple act of that person deviating, regardless of correctness, gives other participants the courage to deviate as well and state what they actually believe.

Most managers can easily identify a team member they would remove from their team if they had the opportunity, and usually it's the person who disagrees with them most often (let's call him Paul). Paul is seen as a nuisance, an obstacle to getting things done efficiently. But truthfully, Paul is probably the best safeguard you have against groupthink: against moving forward hastily because everyone else is too scared to identify faults in a project or proposal. Paul's dissenting voice, correct or incorrect, paves the way for others to break their tendency toward silent agreement and raise genuine concerns. And quite frankly, even if that pesky Paul really is just a bona fide contrarian who insists on going against the grain regardless of what is being discussed, he still has value, because his opposition vicariously lends strength to those who actually (and constructively) disagree but who otherwise might not be heard because they're frightened to challenge the silent support offered by the remainder of the group. Your team needs Paul.

This finding should cause many organizations to take a long, hard look in the mirror. The hiring policies employed by some firms are predicated less on qualifications and more on affinity and similarity. These types of vacuous hiring procedures, where individuals who might buck the status quo are routinely turned away, are typically justified by hiding behind the veneer of "preserving team cohesion." Predictably, this rationale is accompanied by some variation of "not being a good fit for our culture" or a "disruption to the group dynamic." But some group dynamics need to be disrupted!

Every leader likes to be reassured of their intelligence, and so hiring people who look and think as they do is tempting. However, not only does this perpetuation of homogeneity inherently limit the construction of diverse teams (the importance of which will be discussed shortly), but leaders who surround themselves with sycophants create an echo chamber where whatever ideas they put in are regurgitated back to them with minimal resistance. This is a lovely short-term plan (because who doesn't enjoy being told how brilliant they are?), but the team becomes a group in name only. It no longer enjoys the benefits of putting multiple heads together to solve a complex problem, because only the ideas coming from one head are being heard and mindlessly validated.

2. Build genuine belonging. The second way to reduce phenomena such as conformity and groupthink has to do with belongingness. For those who have read my previous pieces, we can chalk this up as yet another reason why organizations should begin to take social inclusion more seriously. Though we are all (whether we care to admit it or not) susceptible to the forces of social pressure, individuals who question their value — either individually or in relation to their membership within a group — are especially vulnerable. Team members with low self-esteem as well as those who feel less accepted by the group are far more likely to yield to conformity (Dittes & Kelley, 1956). Conversely, those who are secure with their group membership are less likely to mindlessly conform. In fact, long-term group members who have remained in good standing accrue what are referred to as "idiosyncrasy credits" (Hollander, 2006). Think of these credits as tickets that can be cashed in to permit someone the freedom to deviate from the group with minimal repercussions.

Those who feel less accepted by the group are far more likely to conform. Inclusion is not only an ethical priority — it is the mechanism by which independent thought becomes possible.

The Wisdom Of Crowds — Done Right

Although I've painted a treacherous picture of the pitfalls of social pressure, working as a group is not always a bad thing. In fact, thinking and problem-solving as a team can often yield dramatically better results than doing so alone, but it must be done in the correct way. As James Surowiecki explains at length in his book The Wisdom of Crowds, there is often great benefit to be reaped from the input of an entire unit.

A classic example of the benefits of group input transpired at a fair in 1906. A competition was being held to see which attendee could come closest to guessing the weight of a dead ox (fun fact: until as late as 1994, guessing the weight of dead animals was still the most fun thing to do at fairs*). The weight of the ox was 1,198 pounds. About 800 people placed a guess, and while some came very close, the true discovery, gleaned by the well-known statistician Francis Galton, was that if you were to take the median of all the guesses made (1,207 pounds), you would come exceptionally close to the true weight.

*I'm guessing.

Let's say I put a jar of marbles in the center of the table and asked two teams to put forward an estimate as to how many marbles are in the jar. Team A takes a traditional approach: the leader ventures a guess of his or her own, and then facilitates conversation to see if people agree or disagree. Team B, on the other hand, has everyone silently and independently write down their own guess, and then takes the median (for those who need a quick statistics refresher, think middle) guess and submits it without discussion. Who would do better?

The smart money is on Team B, especially if the team is both large and diverse. The problem with Team A, as perceptive readers might have picked up on by now, is that once the leader (or even a well-liked and therefore influential team member) puts forward a guess, the tendency will be for others to conform to that guess. In fact, even if someone is strong enough to resist complete conformity, a confident initial guess can still skew subsequent guesses due to a phenomenon known as anchoring. Essentially, if you were about to guess 200, and someone places a guess of 1,000, you'll feel compelled to adjust your guess accordingly (perhaps to 300 or 400), thus amending your original guess by anchoring it to the higher number.

The success of Team B is based on two factors: anonymity and diversity of opinion. Anonymity is important because by not sharing their guesses with each other, the team members inoculate themselves from the social pressure to conform to each other's guesses. The second component, diversity, is important because it allows people to counteract each other's biases. For instance, one member of the team might be an over-estimator while another might be an under-estimator. Separately, they both make poor guesses, but when both guesses are included in a sample, they tend to balance each other out. As Scott E. Page, author of The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies, notes, the greater the diversity in group decision-making paradigms, the more accurate the answers generally become.
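Team B's procedure is simple enough to simulate. The sketch below uses purely illustrative numbers (a hypothetical jar of 1,000 marbles, not the ox data) to show why the median of independent, anonymous guesses tends to beat the typical individual: each simulated guesser carries a persistent over- or under-estimation bias, and the symmetric spread of those biases cancels out in the median.

```python
import random
import statistics

def simulate_crowd(true_count: int, n_guessers: int, seed: int = 0):
    """Simulate Team B: everyone guesses independently, and the team
    submits the median. Each guesser has a personal bias (some habitually
    over-estimate, some under-estimate) plus trial-to-trial noise.
    Returns (median_guess, mean_individual_error)."""
    rng = random.Random(seed)
    guesses = []
    for _ in range(n_guessers):
        bias = rng.uniform(-0.3, 0.3)   # persistent over/under-estimation, up to 30%
        noise = rng.uniform(-0.1, 0.1)  # random error on this particular guess
        guesses.append(true_count * (1 + bias + noise))
    median_guess = statistics.median(guesses)
    mean_individual_error = statistics.mean(abs(g - true_count) for g in guesses)
    return median_guess, mean_individual_error

# A crowd of 101 biased guessers estimating a jar of 1,000 marbles.
median_guess, mean_err = simulate_crowd(true_count=1000, n_guessers=101)
crowd_error = abs(median_guess - 1000)
print(f"median guess: {median_guess:.0f} (off by {crowd_error:.0f})")
print(f"average individual error: {mean_err:.0f}")
```

With a reasonably large crowd, the median's error comes out well below the average individual's error; shrink the crowd or make all the biases point the same way (removing the diversity), and the advantage largely disappears.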

The Pre-Mortem Technique

A final way to allow groups to function well while avoiding the natural inclination to conform is to engage in an activity known as a pre-mortem (Klein, 2007). While a post-mortem refers to the examination of a dead body to uncover the cause of death, a pre-mortem is a thought experiment designed to encourage employees to think about the possible issues with a project or initiative prior to finalizing it. It might sound something like this:

(To your team): "Okay, we've been working on [insert project name] for quite some time now and I think most of us agree that we're ready to move forward. However, before we do, I want us to do something. Everyone grab a piece of paper and something to write with. I want each of you to imagine that we are six months or a year into the future, and this project has failed miserably. Take a few minutes and write down, anonymously and independently, why it has been so spectacularly unsuccessful."

This technique not only reduces conformity; it requires people to purposefully argue against the group consensus. You thereby give individuals license to explicitly address threats and shortcomings, topics that often become taboo in organizations plagued by chronic groupthink.

The Bottom Line

Social pressure is a ubiquitous opponent for any organization. It fosters conformity, encourages groupthink, and makes generating innovative ideas and creative, out-of-the-box solutions nearly impossible. However, there are systemic ways to mitigate the pernicious effects of social pressure. Organizations can encourage constructive, contrarian thinking (that is truly free of consequence), work to cultivate inclusive cultures where people feel valued and included (and therefore confident enough to speak up), and leverage diverse perspectives by problem-solving in strategic ways (such as the methods described above).

In many ways, your employees are just like a group of middle schoolers (except hopefully better dressed). We never really grow out of that middle school urge to conform, so it's up to organizational leaders to create environments that enable their people to say what they actually believe to be correct rather than what they believe everyone else believes to be correct.