Why Your Negotiations Are Doomed (And How to Rescue Them)

Negotiators, even professional ones, make a surprising number of poor decisions that doom negotiations which should have succeeded. Many of these mistakes stem from overestimating how well they can read the feelings and thoughts of the other parties in a negotiation, as well as the extent to which the other parties can read theirs.

For instance, research shows that negotiators who sought to conceal their preferences did a better job of it than they thought they did. Conversely, those who tried to convey information about their preferences to their counterparts overestimated their ability to communicate it. Other scholarship shows that negotiators with less power are more prone to such mistakes than those with more power.

Scholars call this erroneous mental pattern the illusion of transparency: we overestimate both how well others understand us and how well we understand them. This mental blind spot is one of many dangerous judgment errors – what scholars in cognitive neuroscience and behavioral economics call cognitive biases – that we make because of how our brains are wired. We make these mistakes not only at work but also in other areas of life, for example in our shopping choices, as revealed by a series of studies done by a shopping comparison website.

Fortunately, recent research in these fields shows how you can use pragmatic strategies to address these dangerous judgment errors effectively.

I observed a clear instance of an illusion of transparency when an electric company brought me in as a consultant to mediate in failing contract negotiations between the management and the union. Both sides believed the other party to be unwilling to negotiate in good faith, asking too much and giving too little. The union demanded substantial wage hikes, strong job protections, and better retirement benefits, and management pushed back firmly on each request.

Quickly, I noticed that the illusion of transparency gravely inhibited progress. My private conversations with representatives from both sides showed that each side felt it had communicated its positions effectively, both the areas where it wanted to stand firm and those where it was willing to compromise. Yet these same conversations revealed many areas of agreement and flexibility that neither side recognized.

Why didn’t both sides explicitly outline their positions thoroughly and clearly, so that the other side understood precisely where they stood? Because they were afraid that the other party would take advantage of them if they explicitly stated their actual positions, including the minimum they’d be willing to accept. 

So both sides tried to convey what was most important to them by arguing more strongly for specific points and less strongly for others. They believed that the other side would “get the hint.” Unfortunately, neither side “got the hint” of the real priorities of the other side.

I asked each side to use the decision-making strategy of weighing their priorities. After deploying this strategy, the union negotiators assigned top priority to increased job protection, second priority to better retirement benefits, and third priority to a substantial wage increase. The management negotiators used the same strategy and assigned top priority to avoiding a wage increase, second priority to reducing retirement benefits, and last priority to weakening job protection.

By clarifying these priorities, the parties were able to find room to negotiate. The final contract included much-strengthened job protections, a moderate boost to retirement benefits, and a small wage increase just below inflation.

The management appreciated the outcome since it didn’t have to spend as much money on labor; the union membership liked the peace of mind that came with job protection, even if they didn’t get the wage hike they would have wanted.

The key takeaway is that in any negotiation situation, you’re very likely to be overestimating the extent to which you explained your position to the other party. You’re also probably too confident about how well you understand the other party’s perspective. The other party is most likely making the same mistakes regarding you.

An easy way to address these problems is to use the decision-making strategy of weighing your priorities and having the other party do the same. Then, trade off your lowest preferences against their highest ones and vice versa. You can reach a win-win agreement in which both parties realize the most significant gains and experience the smallest losses. Such strategic approaches to addressing cognitive biases will help you in all areas of your professional life.

Should Real Leaders Trust Their Gut?

Let’s say you’re interviewing a new applicant for a job, and you feel something is off. You can’t quite put your finger on it, but you’re a bit uncomfortable with this person. She says all the right things, her resume is great, she’d be a perfect hire for this job — except your gut tells you otherwise. Should you go with your gut?

In such situations, your default reaction should be to be suspicious of your gut. Research shows that job candidate interviews are poor indicators of future job performance.

Unfortunately, most leaders tend to trust their gut over their head, giving jobs to people they like and perceive as part of their in-group rather than to the most qualified applicants.

In other situations, however, it does make sense to rely on gut instinct to make a decision. Yet research on decision-making shows that most people don’t know when to rely on their gut and when not to do so. 

The reactions of our gut are rooted in the more primitive, emotional, and intuitive parts of our brains that ensured survival in our ancestral environment. Tribal loyalty and immediate recognition of friends or foes were especially useful for thriving in that environment.

In modern society, however, our survival is much less at risk. Our gut is more likely to compel us to focus on the wrong information to make decisions in the workplace and other areas.

For example, is the job candidate mentioned above similar to you in race, gender, or socioeconomic background? Even seemingly minor things like clothing choices, speaking style, and gestures can make a big difference in how you evaluate another person.

Our brains tend to fall for the dangerous judgment error known as the “halo effect,” which causes characteristics we like and identify with to cast a positive “halo” over the rest of the person. Its counterpart is the “horns effect,” in which one or two negative traits color how we view the whole person. The halo and horns effects are two of many dangerous judgment errors: mental blind spots, resulting from how our brains are wired, that scholars in cognitive neuroscience and behavioral economics call cognitive biases. We make these mistakes not only at work but also in other areas of life, for example in our shopping choices, as revealed by a series of studies done by a shopping comparison website.

Fortunately, recent research in these fields shows how you can use pragmatic strategies to address these dangerous judgment errors, whether in your professional life, your relationships, or other life areas.

You need to evaluate where cognitive biases are hurting you and others in your team and organization. Then, you can use structured decision-making methods to make “good enough” daily decisions quickly, more thorough ones for moderately important choices, and in-depth ones for major choices.

Such techniques will also help you implement your decisions well and formulate truly effective long-term strategic plans. In addition, you can develop mental habits and skills to notice cognitive biases and prevent yourself from slipping into them.

For example, you need to remember that just because a person is similar to you does not mean she will be the best employee. The research is clear that our intuitions often don’t serve us well in making the best hiring decisions. Such reliance on intuition is especially harmful to workplace diversity and paves the path to bias in hiring, including in terms of race, disability, gender, and sex.

Despite the numerous studies showing that structured interventions are needed to overcome bias in hiring, unfortunately, business leaders and HR personnel tend to over-rely on unstructured interviews and other intuitive decision-making practices. Due to our overconfidence bias, a tendency to evaluate our decision-making abilities as better than they are, leaders often go with their guts on hires and other business decisions rather than use analytical decision-making tools that have demonstrably better outcomes.

A proper fix is to note how the applicant is different from you, and give them “positive points” for it. Alternatively, create structured interviews with a set of standardized questions asked in the same order to every applicant.

Let’s take a different situation. Say you’ve known a business colleague for many years, collaborated with her on a wide variety of projects and have an established relationship. 

Imagine yourself having a conversation with her about a potential collaboration. For some reason, you feel less comfortable than usual. Most likely, your intuitions are picking up subtle cues about something being off.

Maybe it’s nothing. Perhaps that person is having a bad day or didn’t get enough sleep the night before.

However, that person may also be trying to pull the wool over your eyes. When people lie, they behave in ways that are similar to other indicators of discomfort, anxiety, and rejection, and it’s tough to tell what’s causing these signals.

Overall, this is an excellent time to take your gut reaction into account and be more suspicious than usual.

The gut is vital in our decision-making to help us notice when something might be amiss in well-established relationships. Yet, in most situations, when we face significant decisions about workplace relationships, we need to trust our heads more than our gut to make the best decisions.

Are You Still Falling for the ‘Failing to Plan is Planning to Fail’ Myth?

You’ve probably heard the old advice for entrepreneurs that “failing to plan is planning to fail.” That phrase is a misleading myth at best, and actively dangerous at worst. Making plans is essential, but our gut reaction is to plan for the best-case outcomes, ignoring the high likelihood that things will go wrong. 

A much better phrase is “failing to plan for problems is planning to fail.” To address the very high likelihood that problems will crop up, you need to plan for contingencies. 

When was the last time you saw a major planned project suffer from a cost overrun? It’s not as common as you may think for a project with a clear plan to come in at or under budget. 

For instance, a 2002 study of significant construction projects found that 86% went over budget. In turn, a 2014 study of IT projects found that only 16.2% met their originally planned resource expenditures. Among the 83.8% of projects that did not, the average cost overrun was 189%.

Such cost overruns can seriously damage your bottom line. Imagine if a serious IT project such as implementing a new database at your organization goes even 50% over budget, which is much less than the average cost overrun. You might be facing many thousands or even millions of dollars in unplanned expenses, causing you to draw on funds assigned for other purposes and harming all of your plans going forward. 

What explains cost overruns? They largely stem from the planning fallacy — our intuitive belief that everything will go according to plan.

The planning fallacy is one of many dangerous judgment errors: mental blind spots, resulting from how our brains are wired, that scholars in cognitive neuroscience and behavioral economics call cognitive biases. We make these mistakes in work and in other areas of our lives, for example in our shopping choices, as revealed by a series of studies done by a shopping comparison website.

Fortunately, recent research in these fields shows how you can use pragmatic strategies to address these dangerous judgment errors, whether in your professional life, your relationships, or other areas of your life. 

You need to evaluate where cognitive biases hurt you and others in your team or organization. Then, you can use structured decision-making methods to make “good enough” daily decisions quickly, more thorough ones for moderately important choices, and in-depth ones for major decisions.

Such techniques will also help you implement your decisions well and formulate truly effective long-term strategic plans. Also, you can develop mental habits and skills to notice cognitive biases and prevent yourself from slipping into them.

Solving the Planning Fallacy

For the planning fallacy specifically, my coaching and consulting clients have found three research-based techniques effective.

First, break down each project into parts. An IT firm struggled with a pattern of taking on projects that ended up losing money for the company. We evaluated the specific parts of the projects that had cost overruns and found that the most significant unanticipated money drain came from permitting the client to make too many changes at the final stages of the project. As a result, the IT firm changed its process to minimize any changes at the tail end of the project.

Second, use your experience with similar projects to inform your estimates for future projects. A heavy equipment manufacturer had a systemic struggle with underestimating project costs. In one example, a project that was estimated to cost $2 million ended up costing $3 million. We suggested making it a requirement for project managers to use past project costs to inform future projections. Doing so resulted in much more accurate cost estimates.

Third, for projects with which you have little experience, use an external perspective from a trusted and objective source. A financial services firm whose CEO I coached wanted to move its headquarters after outgrowing its current building. I connected the CEO with a couple of other CEO clients who had recently moved and expressed a willingness to share their experience. This helped the financial services CEO anticipate contingencies he didn’t previously consider, such as additional marketing expenses, printing new collateral with the updated address, and lost productivity from changing schedules and new commute routes for employees.

If you take away one message from this article, let it be that “failing to plan for problems is planning to fail.” Use this phrase as your guide to prevent cost overruns and avoid falling prey to the dangerous judgment error of the planning fallacy.

How Best to Deal With Colleagues in Denial

When was the last time a colleague said something so ridiculous that it made your jaw drop? A four-year study by LeadershipIQ.com found that 23 percent of CEOs were fired for denying reality — refusing to recognize adverse facts about an organization’s performance. 

We typically respond to people who deny reality by confronting them with facts and counterarguments. But research suggests this is precisely the wrong thing to do.

Research around confirmation bias shows that we tend to look for and interpret information in ways that conform to our beliefs. We have an emotional investment in continuing to believe what we want to believe. Furthermore, studies on a phenomenon called the backfire effect show that when we are presented with facts that cause us to feel bad about our self-worth or worldview, we can develop an even stronger attachment to incorrect beliefs.

These mental blind spots are two of more than 100 dangerous judgment errors that result from how our brains are wired, which scholars of cognitive neuroscience and behavioral economics call cognitive biases. We make these errors in work and personal life alike, for example in the shopping choices we make, as revealed by a series of studies done by a shopping comparison website.

Fortunately, recent research shows us how to use pragmatic strategies to address these dangerous judgment errors in your professional life, your relationships, and other life areas. It can be helpful to evaluate where cognitive biases are hurting you and others on your team. Then, use structured decision-making methods to make “good enough” daily decisions quickly, more thorough methods for moderately important choices, and in-depth ones for major decisions.

Such techniques will also help you implement your decisions well and formulate truly effective long-term strategic plans. Also, you can develop mental habits and skills to notice cognitive biases and prevent yourself from slipping into them.

So, how do you deal with colleagues suffering from the ostrich head-in-the-sand syndrome?

Rather than arguing about it, it’s much more effective to use a research-based strategy. I developed one called EGRIP (Emotions, Goals, Rapport, Information and Positive Reinforcement), which provides clear guidelines on how to deal with people who deny the facts.

For instance, consider the case of Mike, a new product development team lead in a rapidly growing tech start-up. He set an ambitious goal for a product launch, and as more and more bugs appeared, he refused to move the launch date. People tried to talk to him, but he hunkered down and kept insisting that the product would launch on time and work well. I was coaching the company’s founder at the time, and he asked me to approach Mike to try to resolve the issue.

E – Connect with their emotions

If someone denies clear facts, you can safely assume that emotions are leading them away from reality. While gut reactions can be helpful, they can also lead us astray. What works better is to focus on understanding those emotions and determine what emotional blocks might be causing the person to stick their head in the sand.

What I discovered in my conversations with Mike was that he tied his self-worth and sense of success to “sticking to his guns,” associating strong leadership with consistency and fearing that he would appear weak in his new role as team lead. He believed team members were trying to undermine him by getting him to shift the schedule, which would force him to admit that he’d failed to deliver. This false association of leadership with consistency, and the fear of appearing weak, is a frequent problem for new leaders.

G – Establish shared goals

It’s best to establish shared goals, which are crucial for effective knowledge sharing. I spoke with Mike about how we both shared the goal of having him succeed as a leader within the company. Likewise, we both wanted the new product to become profitable.

R – Build rapport

Next, build up a rapport by establishing trust. Use empathetic listening to echo their emotions and show you understand how they feel. I spoke to Mike about how it was hard to be worried about the loyalty of one’s team members. We also discussed what makes someone a strong leader.

I – Provide information

At this point, start offering new information that is a little more challenging but doesn’t yet touch the actual pain point.

I told Mike how research suggests that one of the most important signs of being a strong leader is the ability to change your mind based on new evidence. I gave examples, such as Alan Mulally saving Ford Motor Company through repeated changes of mind. If I had begun with this information, Mike may have perceived it as threatening. However, because I slipped it in naturally, as part of a broader conversation grounded in shared goals and rapport, he accepted the information calmly.

P – Provide positive reinforcement

After a person has changed their perspective, provide them with positive reinforcement. This is a research-based tactic that shifts someone’s emotions. The more positive emotions a person attaches to accepting adverse facts, the less likely you’ll need to have the same conversation with them again.

With Mike, I discussed how he could best exhibit these characteristics to show those trying to undermine him that he was indeed a strong leader. I directed the conversation toward how he could show strength by delaying the launch of the new product. Eventually, he agreed, and I praised his ability to show strength and leadership by shifting his perspective based on new evidence.

Good luck, and remember that you can use EGRIP in professional settings and in almost any other situation that requires you to steer others away from a false belief that causes them to deny reality.

Why Promotions Fail: How to Overcome Blind Spots and the Curse of Knowledge

It’s all too common for people in organizations to be promoted up the hierarchy to their “level of incompetence.” In management, the concept is known as the Peter Principle.

People are promoted because they did well in their previous job, not because they have the potential or the skills to meet the requirements of their new role. In fact, there’s often no training for the new skills they need to learn to succeed in the new position. 

This combination of poor promotion practices and lack of training stems mainly from a dangerous judgment error known as the curse of knowledge: once we learn something, we can’t relate to someone who hasn’t learned it. For instance, we learn how to manage others, then completely forget that not everyone knows how to do it. Or we learn the jargon of our profession and use it with those who don’t know the terms, baffled as to why they don’t understand us. We can’t teach others how to do our roles because we can’t communicate the skills and knowledge the position requires.

The error stems from how our brains are wired. It’s one of a whole range of cognitive biases that cause us to make mistakes in all areas of our work and lives. Fortunately, there are practical strategies we can use to overcome these dangerous judgment errors.

Here’s a case in point:

A Northeast state’s Department of Transportation was having a severe Peter Principle challenge: staff were being promoted into supervisory roles based on seniority and prior performance, not the proven ability to supervise. Nor were they given any advance training: the just-promoted supervisors were expected to pick up their newly required skills on the job. It was a clear instance of the curse of knowledge: department leaders had forgotten how hard it was to develop their leadership skills. 

Thankfully, a newly hired HR Director who had an outsider’s perspective was able to see the flaws in this approach — and pointed out the seriousness of the issue to department leadership. She convinced them to create a leadership development training program for newly-promoted supervisors. The HR Director brought in Disaster Avoidance Experts to consult on creating the leadership development program. The opt-in program, meant for new supervisors promoted from within the ranks, included new skills and relevant knowledge and was soon expanded with a mentoring program.

In the past, six-month performance reviews of new supervisors found that, on average, 63 percent met or exceeded expectations, and that became the department’s benchmark. When performance reviews were conducted for those who had voluntarily joined the program, the rate jumped to 83 percent. Among those who did not participate, it fell to 59 percent — a clear indicator that training was an effective way to overcome the problem. Based on the success of the program, the Department of Transportation adopted a commitment to training all of its newly promoted supervisors. While the roots of this flawed promotion system could not be addressed due to contracted promotion guidelines, the curse of knowledge could be alleviated.

Every organization should assess whether cognitive biases are harming its teams and its success. If so, there are effective techniques for making and implementing decisions that mitigate the damage and launch better long-term strategies. Using structured decision-making will put you in a better position to make daily decisions quickly, moderately important ones more thoroughly, and the most significant ones most accurately.