BE GOOD!

Tune In: How to Make Smarter Decisions in a Noisy World - Nuala Walsh

45 min | 15/07/2024

Description

🎧 In this episode we're excited to welcome Nuala Walsh, author of the new book Tune In: How to Make Smarter Decisions in a Noisy World. Named among the 100 most influential women in finance, with over three decades of experience in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen, she brings a wealth of knowledge to today's conversation. Currently serving as CEO at MindEquity, she advises organizations on behavior change, culture, and reputation.


Join us as we delve into her journey, explore the nuances of decision-making in a complex world, and uncover strategies to navigate the noise and make smarter choices ✨


During this conversation, we'll explore:

  • 📣 Four key factors that influence our ability to tune in

  • 🧠 Cognitive challenges that affect our judgment and decision-making abilities

  • 🔎 Practical case studies and behavioral science research for improving our judgment


To learn more about Walsh's work visit www.nualawalsh.com.


👉 Join the conversation and share your thoughts about this podcast on Twitter @BVANudgeConsult. Don't have social media? Our inbox is always open at contact@bvanudgeconsulting.com.


Hosted by Ausha. See ausha.co/privacy-policy for more information.

Transcription

  • Speaker #0

    Welcome to the Be Good Podcast,

  • Speaker #1

    where we explore the application of behavioral economics for good in order to nudge better business and better lives.

  • Speaker #2

    Hi and welcome to this episode of Be Good, brought to you by BVA Nudge Consulting, a global consultancy specializing in the application of behavioral science for successful behavior change. Every month we get to speak with a leader in the field of behavioral science, psychology and neuroscience in order to get to know more about them, their work and its application to emerging issues. My name is Eric Singler, founder and CEO of BVA Nudge Consulting, and with me is my colleague, Suzanne Kirkendall, CEO of BVA Nudge Consulting North America. Hi, Suzanne.

  • Speaker #0

    Hi, Eric. I'm very happy to be back for another episode and delighted to be introducing today's guest, Nuala Walsh. Nuala is an award-winning non-executive director, behavioral scientist, TEDx speaker, and author. Named among the 100 most influential women in finance, she spent three decades in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen as Chief Marketing Officer. Today, as CEO at MindEquity and a founding director of the Global Association of Behavioral Scientists, she advises on behavior change, culture, and reputation. Nuala holds multiple appointments industry-wide. She is a non-executive director at the British and Irish Lions, president of the Harvard Club of Ireland, chair of the Innocence Project London, council member at the Football Association, and former vice chair at UN Women UK. Her insights have been published in Forbes, Inc., Psychology Today, Harvard Business Review, the Financial Times, Fox Business, and the BBC. And very excitingly, Nuala has just published an amazing book called Tune In: How to Make Smarter Decisions in a Noisy World, which is going to be at the heart of our conversation today. Nuala, welcome to our Be Good podcast.

  • Speaker #1

    Thank you very much. It is a huge pleasure to be here.

  • Speaker #2

    So thanks a lot, Nuala, again for being with us today. Before talking about your amazing book, we would like to know a little more about you and your career! I think you received a master's degree in behavioral science and business studies alongside a degree in philosophy. Can you tell us how you came to be interested in behavioral science in general, and maybe specifically about your interest in the decision-making process?

  • Speaker #1

    Sure, Eric. You're absolutely right. I've actually always been interested in human behavior. I just took a slightly circuitous route to get there. Because when I finished my first degree in Trinity, I actually studied forensic psychology straight after that. So that was always at the heart of what I wanted to do and was interested to do. And then I just had a 30-year career in the meantime before I went back to study at the London School of Economics to do the master's in behavioural science. But my interest, I think, was always there. And it's since then that I've really applied the insights that I learned there to business. My thesis was actually on whistleblowing and the bystander effect. So that is not quite forensic psychology, but there is an element there, I think, of that criminology. And then, as you say, when I set up my own consultancy, where I do advise firms, as you mentioned earlier on, and I sit on boards, I was able to see the mistakes that people made, and they were preventable mistakes. So all of the theory, if you like, suddenly came into vogue, and it became very clear why people were making these decisions and, more importantly, how preventable they were. And that sort of led me to the point where, well, if mistakes are predictable, they're also preventable. And if only people knew the theory behind it, I think they would be much more equipped and enabled to prevent some of those errors.

  • Speaker #2

    Could you share with us now whether you have any mentors who had a particularly strong influence on you? Do you have maybe any researchers or other people who have played an influential role in your professional career and in your interest in behavioral science?

  • Speaker #1

    Quite a few. I mean, you know, well, you've interviewed most of them on this podcast. There are so many researchers, but I will say that in the beginning I attended a course at Harvard with Jennifer Lerner, who is the expert on emotion and many other things, but she teaches people how to be a decision architect. That was probably the first one that I had attended, and that was before I did the master's at the LSE. So I sort of attribute that to the start and her influence there. And in fairness, she's been a supporter ever since, and she now sits on our board at GAABS as well. But I would also pick out the LSE faculty, all of them. You know, I won't embarrass them by naming them, but they know who they are. And I think that group in particular, between them, they sparked or they created the spark, if you like. There are so many in this field, as you know, and they're all different. They're all really talented and they're experts in a particular area, which is probably why I've got 500, you know, references in the book. There are so many of them to pick out. But I think, whereas those individuals might have created this market, COVID provided the opportunity, believe it or not. So the timing and the chance to actually do it after so many years of being in the field, and actually then applying the theory to people's reality.

  • Speaker #0

    Fantastic. So, Nuala, as I mentioned earlier, your recent book, Tune In: How to Make Smarter Decisions in a Noisy World, was just published. Before we discuss the content of it, can you tell us more about the inspiration behind writing it? How did the idea for the book come to you?

  • Speaker #1

    Sure. Well, Suzanne, I actually always wanted to write a book. I just wasn't sure what kind of book I was ever going to write. And people do say that there's a book in all of us. And I do think there is a book in all of us. We certainly all have enough stories and enough experience and education, if you like, to teach other people lessons about what we get right and what we get wrong. The output wasn't what I thought. Before I had done this master's, I was probably going to write a book along the lines of Mark McCormack's What They Don't Teach You at Harvard Business School, because that's the first one that I read when I started my career. And I always loved it and thought it was great. And I used it as a bit of a manual. So in my head, I always said, oh, I'll teach people the 50 tips and all the tricks and what you get wrong. But when I did the master's, I learned something different. I guess it opened my mind to something different. So the output, the result, is of course something that's completely rooted in science now, rather than me and my experience, which might have been the earlier book, if you like. And then, when I saw all those excellent researchers and experts in this field, I guess I wanted to contribute in some way. And that's why I thought: there are a lot of books on decision making. There are a lot of books on judgment. And that's why, since you ask how the idea came, I deliberately went out of my way to try and find something that I thought was different. And I think this idea of deaf spots has not been covered elsewhere in the field. And I wanted to focus on something that wasn't just a list of biases or something that was just an academic reference. I wanted it to be really practical, something that, yes, was rooted in science, but that people would be able to use as a mnemonic and refer to pretty easily when they were in these high-stakes situations. So the idea, the actual final output, of course, is never exactly as you start out. So the refinement of this idea of being tuned in or tuned out came from that. And on all of the stories and the examples I have, I had to put a filter on at the very end, or at least halfway through, to say, was this person tuned in or tuned out? Can I attribute this to deaf ear syndrome or a deaf spot, or they didn't listen, or they were motivated to mishear or hear something? And that was sort of my lens for, you know, keeping people in and taking them out. And that led me to this idea that tuning out, you know, is a hidden source of misinformation, but tuning in is a hidden source of opportunity. So it is quite, you know, binary, polarised in that way. And, you know, it came from that really. And then it was easy. It was easy to evidence why it mattered because, you know, these mistakes cost the average Fortune 500 company 250 million a year, at least. I can give you a really long list. We could be here for an hour talking about the mistakes and the evidence as to why people's bad decisions are so critical, particularly when people hold power.

  • Speaker #0

    Absolutely. Yeah. So that title, Tune In, makes so much sense. Can you tell us more about your research method? You talked a lot about all of your references, but also all of your experiences. Can you tell us how you combine those two?

  • Speaker #1

    Well, I didn't do RCTs, and somebody once asked me, you know, is this all based on primary research? Well, it isn't primary research in that sense of RCTs. It is a blend of primary and secondary. Primary is my own research, but it's deliberately a spectrum. There's a heavy reliance on practical case studies, as in real-life case studies, supplemented by all of these behavioural science experiments. And that's why I do have an index that spans 25 pages, because, as I think Eric said earlier on, it is very rich in terms of its content. So the conceptual framework is rooted in science, and so are all of the key concepts. So it was a blend. I was more comfortable with that rather than going out, starting and making it based on one particular experiment and then exploding it into a theory. I wanted it to be a compendium of the best of other people's thinking as well, but for me to add a slightly different slant to it, if that makes sense.

  • Speaker #0

    Absolutely. That makes total sense. And I know you've given us a little bit of a preview of some of the important concepts, but before Eric asks you for more details about all your great frameworks, can you tell our listeners: what is the one key takeaway? What would the headline of your book be?

  • Speaker #1

    Probably that what you hear is as important as what you see. And that we're more at risk than ever of tuning out the important voices and rushing to judgment. But if you at least remember the PERIMETERS traps and the need to tune in, and consciously think about who and what you're tuning into, I think you stand out in your field rather than lose out or miss out.

  • Speaker #2

    Okay, Nuala, now it's time to delve into the challenge of decision-making from an individual perspective. But before that, I'd like us to discuss what you call, I think, the external context within which our decisions are made. And first, can you tell us why it's so tough to make good decisions in what you call today's noisy world? What are the main factors that shape our judgment?

  • Speaker #1

    Sure. Well, and I think even just before we mention that, Eric, it's probably just useful to let people know that tuning in is the solution to the problem of tuning out. And it is a combination of two things that make us tune in or tune out. One is the context, the external context, and the second thing is our cognition. So looking at the external environment, I point to four factors. Now, I'm sure there are more, but I think these four in particular are increasing our vulnerability to error. The first one is that it's a speedy world. And I think we all know that we live in this fast-paced, frantic lifestyle, but that accelerates our speed of making decisions and our short-term thinking. And this whole ecosystem that we operate in today amplifies that probability of making fast choices, which aren't necessarily the best choices. The second one is data. And we know that there is excessive data, and we also know that it's overwhelming. Yet it is our new normal, and we're forced to make decisions in that noisy sort of context. There's plenty of evidence of that. I mean, Microsoft finds that, I think it's 68% of employees who just don't even have enough uninterrupted time to do their jobs. There's a third layer which I've introduced, which is quite different, and I don't think people particularly think about it. And that's the difference between what we see and what we hear. So I argue it's a very visual world, more so than before. And because it's visual, what we see dominates and skews our interpretation. You only need to think about Instagram or first impressions: we see what we want and we don't consider enough what we hear. And, you know, one example of that can be stereotyping, from hiring to refereeing matches or even awarding loans. So that's a problem when it comes to misjudgment. And the last one is, you know, binary thinking. We live in a polarised world. Half the world is going to vote this year and people are becoming more and more entrenched in their thinking. So we think in these polarised ways. And some of these binary classifications are just embedded in our systems and structures. I mean, if you're a marketeer, you segment your customers, and it's very sensible, but we still do it. If you look at your colleagues, you know, in the BVA Nudge Unit, do you think of your colleagues as introverts or extroverts, high or low potential, sporty or academic? But when we do that, as you well know, we narrow our perspective when we think in this way. So despite the advances of the 21st century, with data and all of these things, the odds are stacked against good judgment in this high-speed visual world. So when I work with my clients, as I'm sure you do, I find that these factors just diminish the time that people have available to devote to tuning in and, you know, reinterpreting different conversations, and not making this time, you know, is a mistake and a judgment killer.

  • Speaker #2

    Yeah. There is this question: you mentioned these judgment killers. Could you explain for our listeners what I think are the three main judgment killers, which you call blind spots, deaf spots and dumb spots?

  • Speaker #1

    Well, indeed, Eric, and I've combined them and called them the trilogy of error. Sometimes I've called them the trilogy of terror, if you get them wrong as well. And you can guess what they are. I mean, you know what the psychological blind spots are; they don't work in isolation on our judgment. So you've heard of those, but people haven't really heard of deaf spots. They do exist, actually, because there was a psychologist who discovered these, or certainly wrote about these, in the 1960s. It refers to the failure to interpret what people say, in the same way that blind spots are about what people see. And then, of course, dumb spots are about what people say or don't say, and this is when people typically don't speak up. Because if someone remains silent, by definition, you can't hear their voices. And again, the challenge is that we just don't think about this trilogy when we're under pressure, or in uncertainty or crisis. But I thought it was quite a nice way to pull together, you know, the fact that there is this trilogy out there. And even if people think in their own way that these exist, again, it's a simple way for people to think twice and try to at least pause and make more measured decisions when they are under pressure or in crisis or uncertainty. So the skill is to reinterpret what people hear.

  • Speaker #2

    Yes, we were lucky to interview Amy Edmondson some weeks ago, for The Human Advantage but also for the podcast, on psychological safety, which is, I think, one key point that makes it very difficult to make smart decisions within an organization.

  • Speaker #1

    Absolutely, and she is the leader in that field, and she's absolutely right. Because when there is no psychological safety, yes, there are dumb spots, but actually, when there's no safety, people tune out. So people tune out of the voices that matter and they don't speak up. So they develop this combination of a blind spot and a deaf spot. Sometimes they come together, sometimes it might be one that dominates more than the other, but together they are lethal, and they are these judgment killers.

  • Speaker #2

    You also highlight that interpretation is a challenge because we can't trust what we hear. Can you tell us more about this big gap, which is classical in behavioral science, between what is said and what is heard? Generally in behavioral science it is between intent and behavior; here it is between what is said and what is heard.

  • Speaker #1

    Yes, and this could be a very long answer as well, Eric, because you can bring into this, you know, cultural factors, what's heard versus what's said, language, euphemisms, tone of voice. So I was very conscious when I was writing this bit that it could be very long. What I was keen to make sure of was that people didn't think of this book as a book about listening, because it's not about listening. It's about interpreting what you hear. It's a second-order effect. We can all listen and hear, but we don't always choose to reinterpret. We take things at face value, we jump to conclusions, we misinterpret what we see. We encode, we don't decode. So that's an important differentiation. This is about the difference between what's heard and what's meant, and also the extent to which you're willing to spend some bit of time decoding. It could be people's agendas, it could be anything, it could be people's intent, as you mentioned earlier on. But when you don't do it, people just take things at face value. I mean, you can look at genocide, you can look at misinterpreted military instructions, because people didn't think about what they heard, they just did and obeyed. Or, you know, mishearing. I think there were some very powerful examples of people who genuinely just accidentally misheard, you know, air traffic controllers. Now, that's not a case where I'd say pause and reinterpret, because you might be in an emergency situation and you might not have time to do that. But in many cases, there is time. So you might not diagnose an unusual illness; you've just jumped to conclusions and rushed to judgment, because we live in this noisy, fast-paced, data-filled, overwhelming world, and we just don't have time to slow down. I mean, aren't we all still catching up with yesterday and the day before? People can barely keep up with yesterday, and it's just this cycle that keeps going. And that's why I outline a number of different solutions to boost interpretation, which are all in the third part of the book.

  • Speaker #0

    Excellent. So we've covered the external challenges for cognition. Now let's talk about the internal. You mention in your book that there are structural traps that lead us to making poor decisions, and you've come up with a great acronym to summarize these traps, which is PERIMETERS. For our listeners, I'm going to walk through that acronym real quick. Each letter is a specific trap or bias. The P of PERIMETERS is for power. E is for ego. R is risk. I is identity. M is memory. E, ethics. T, time. E, emotion. R, relationships. And S is stories. So before we get into the details of some of these traps, Nuala, could you give us an overview of the major levels that you describe, which are individual biases, organizational traps, and those related to society?

  • Speaker #1

    Yes. And to be honest, really the point there is that this can affect people at every level. It can be individual, organizational or societal. So an individual bias is probably fairly obvious. It relates to the ones you are responsible for. It could be ego, it could be any of these, but ego is one that particularly refers to an individual. An organizational bias affects the collective. You talked about Amy Edmondson; it could be a conservative or a risk-taking culture. So risk can be an individual bias and it can be an organizational bias, based on the culture. And then, from a societal perspective, the traps can be cause and effect. Stories is one, for example. So a legend and a folklore of a particular country, you know, the Loch Ness Monster or crying statues: whether it's true or not, people believe them. And this affects your judgment. And then the traps can hit all three levels. So, for example, if you take following the crowd: an individual does it, an organisation certainly does it, and societies and different groups also do it as well. So the point is that I deliberately interspersed in the book examples that hit all three, so that people wouldn't just think this only applies to an individual's decision making. It can be national leaders making decisions for countries, or organisation leaders. And then it's the collective, because the crowd decides as well, in many respects, individuals in the crowd, but the crowd as a collective, as you know.

  • Speaker #0

    For sure. And so why do you think each of these traps is dangerous for making smart decisions?

  • Speaker #1

    Because the evidence is there. I mean, individually, they're dangerous, but collectively, they're lethal, because we repeat them. And I think the evidence speaks for itself, whether you're talking about scams, whether you're talking about scandals, missed signals, whether it's 9/11, human error. There are so many examples, fines, misconduct, miscarriages of justice, because the problem is we don't notice or reinterpret what we hear. So you have all of these examples, and as we've said, it's a combination of the environment and us. They're so hidden, we don't notice them. We're so biased, we don't think we make them. We think we're great decision makers; we don't even question whether we're making mistakes. Yet, as I said, history repeats itself. We still get scammed, businesses fail, ego traps predominate all the time, and we miss these signals and tune out. So, by definition, it's always going to be a problem for people in their decision making.

  • Speaker #0

    And then, for each of those traps in the book, you explain the specific mechanics that can lead us to making those poor decisions. So, one example for our listeners: for the first trap, you describe the power-based traps, and you mention six different biases that come into play, which include the authority bias, halo effect, champion bias, the contrast effect, and the just world hypothesis. Can you explain some of those biases and illustrate them for our listeners with some real-life examples?

  • Speaker #1

    Yes, and each of them does have quite a few. I was never short of an example, Suzanne. So is that a good thing or a bad thing? It was probably a good thing if you're a writer; it's a bad thing for reality. So even if I just take narrow focus. Narrow focus occurs when we're too goal-oriented and we're in search of power. And I should say that power-related traps don't mean you have power or haven't got power. These are traps that occur when you're chasing power or afraid of losing power. They're power-related, rather than being about I have power or I don't have power. Anyone at any level can have a narrow focus when they're too goal-oriented in search of power or trying to protect the power that they already have. And they make dangerously damaging decisions. So even if you look back at, I don't know if you remember, in 2016, Wells Fargo, when they knowingly fabricated these millions of fake customer accounts. The leaders were goal-oriented, had a narrow focus in search of industry power, and they instructed their teams to achieve eight accounts per customer. And the CEO, when he was interviewed by the Senate Banking Committee, just explained that eight rhymes with great. And, compounded by ego, he blamed his employees, not that toxic culture that we were talking about earlier. So leadership in that situation turned a deaf ear to these stressed sales teams. And, you know, at the end of the day, it cost the bank three billion and a CEO's career, because people accept the story that they want to hear and they turn a deaf ear to the rest. Another one which is pretty obvious is maybe authority bias, when we obey the person in charge and don't speak up enough. Lots of examples there. You could take Theranos with Elizabeth Holmes, where people knew that the Edison finger-prick technology was based on a lie and the majority stayed silent. Some did speak up, and that's how it came to light, but the majority for too long didn't speak up. And then you have maybe the just world hypothesis, which happens when we trust that good people get rewarded and evil punished. And when you do that, you typically lose power. It doesn't bear out in reality, because we often over-trust the system. An example there might be the wrongfully convicted, or the British Post Office example, where employees trusted that justice would serve them. But yet it didn't, because they believed in a just world. And when you do that, you are naive. Your view is slightly narrow. You might also see it in M&A, where the hardworking employee thinks that they're going to get noticed or rewarded or promoted. But, you know, they naively overlook and don't decode the messages in that situation. So they trust what they hear and they don't reinterpret sentences, statements, promises, you know, or politics. So we know that thinking is hard. And I think, as Carl Jung says, that's why most people judge. So these occur all the time. But the power-related ones, I think, are fascinating, because we all seek power to some extent. And again, it depends on your definition of power. But, you know, you could be a parent and seek power in the house or over the children, or a professor or a politician or a president. So this one is really pervasive.

  • Speaker #0

    And unfortunately, we don't have time to go into each of the 10 traps in PERIMETERS, but we definitely encourage our listeners to read your book to learn about all of them in detail. But Nuala, according to you, which one do you consider the most dangerous and most common for creating decision misinformation? Well, I might disappoint you, Suzanne, because I actually don't consider one the most dangerous. They're all common. They're all dangerous. They're also interrelated. So if you take the risk-based traps, they are a function of ego and emotion and time. We mentioned power a minute ago. So power-based traps are a function of ego. You see the crowd in there, the crowd effect. You see ethics in there. So you could take any scandal and analyse it in terms of the different weightings of these traps, if you like, whether it's the Ukraine war or 9/11, pick anything, and you can absolutely translate it in terms of these. But the good news, of course, is that opportunity exists: once you know about these, you at least have half a chance of mitigating error and trying to prevent some of these. So it's not all doom and gloom. There is hope, if you choose to spend the time and rethink what you're doing.

  • Speaker #1

    And read the book to find out about all 10.

  • Speaker #0

    Well.

  • Speaker #2

    Yes, Nuala, we have understood, and maybe it's unfortunate, I don't know, why it is so difficult to make a smart decision, a good decision, because of the external factors, because of our internal cognition and all these biases, which are all important and dangerous. But hopefully you have some solutions. And now it's time to talk about your solutions for making good decisions. So we understand that it is a really strong challenge, and our psychology, in addition to the external context, does not help us, with many of the traps you have summarized in this fantastic acronym PERIMETERS. But you suggest, you propose, you offer a roadmap and very concrete solutions, I think, to become what you call a decision ninja. So before discussing these concrete solutions, could you define what you mean by a decision ninja? Is it a superhuman?

  • Speaker #0

    Well, I think it's a superpower. I don't know about a superhuman, but it's someone who uses their superpower. It's very simply, Eric, somebody who's tuning in to the voices that really matter so that they stand out, as I said, rather than lose out. So it is someone who's intentional about interpretation, someone who consciously notices what's said and what's not said. So it is a little bit aspirational, but it does encourage people to be more thoughtful and more responsible for the decisions they make. And lots of people do it. If you're a journalist or an investigator cracking cases, people do it. So yes, people get it wrong, but a lot of people also get it right.

  • Speaker #2

    Do you have in mind an example in real life of someone who is, from your perspective, a decision ninja?

  • Speaker #0

    I don't have one in particular, but in the book, I made absolutely sure that I included at least one at the end of every chapter, so that people could learn from other people. So, for people who are facing these PERIMETERS traps: what did they do and how did they do it? You've got a lot of examples of people who get it right in there as well.

  • Speaker #2

    Okay, thanks. So first, your first solution, I would say, is about adopting a specific mindset that you call, again, and congrats for this, the AAA mindset, I don't know if that is the right pronunciation, for outcome anticipation, outcome attitude, and outcome acceptance. Could you explain each of these three factors that we need to control to make better decisions?

  • Speaker #0

    Yeah, I actually called it AAA, Eric, because it was a bit of a pun on my investment background. So that was the three A's. As you can see, I'm clearly trying to get people to remember some of this stuff by making it easy. Another mnemonic for people, but AAA was easy. So all it really says is that you control your own mindset: if you go into your decision making thinking about the fact that you can't control everything, but you can control your anticipation, you know, your attitude to what happens, and you can control whether you accept it or not, you minimize the liability, if you like, when you do control these. And becoming this decision ninja relies on, you know, this being intentional, et cetera. So it's really just putting people into the right frame of mind. If you decide, woe is me and I have no control here, you're not really going to get very far anyway. So really, I was just trying to get people into a more positive mindset before they make the decision, before they adopt any of the approaches. So that's the suggestion with the triple-A mindset, just an attitude of, you know, acceptance and anticipation. So predict what's going to happen, control your own attitude, and then, when it happens, it happens, and deal with the consequences.

  • Speaker #2

    But it's not only a question of state of mind or mindset. You also suggest what you call the SONIC strategy. Again, to help your readers remember each of the five key actions: S for slow down, O for organize your attention, N for navigate novel perspectives, I for interrupt mindsets, and C for calibrate situation, stranger and strategy. Could you briefly summarize each of these five key objectives?

  • Speaker #0

    Yeah, and they are very, very brief, Eric. And you would do it as a process. So I deliberately chose them. I mean, SONIC was deliberate because it plays on deaf spots, obviously. But you do it as a process. So first of all, you need to just slow down to enable yourself to even challenge your thinking. If you don't do that, you're kind of not even at the races. Then you organise your attention: in this frantic, noisy world, I'm suggesting you do that so that you can actually be in the best position to make a decision. And there are three or four different strategies for each of these, all science-based, all of which should be familiar to people. Then, thirdly, you navigate these novel perspectives so that you don't rush to judgment and you at least take into consideration other people's views. Then, interrupting mindsets. I did quite like this one, because you're mentally putting on the brakes so that you again avoid that rush to judgment or the assumption of validity. So you interrupt your own mindset and you interrupt other people's as well. And again, two or three techniques you can use for that. And then the last one, recalibrating situations and strangers and strategies, is the last stop to getting it right. And that's using checklists and implementation intentions, and, you know, techniques that would certainly be familiar to people in the behavioural science field. But again, really, really easy. And people have the choice to try some and, you know, use these as befits the particular decision scenario.

  • Speaker #2

    Yeah, it's funny, because you mention, I think, friction and sludge, and the value of putting friction into our decision-making process.

  • Speaker #0

    That's exactly right. And decision friction, so the SONIC strategies are based on decision friction. And a lot of people don't like friction, as we know. But if you deliberately slow down, this is like a speed bump for the mind: you slow down in the initial cases so that you can at least make a better decision. If you keep hurtling along, you might make a great decision. I mean, some people make great decisions at speed, but you're more likely to make a mistake if you don't at least reconsider it. So you're absolutely right. I'm using decision friction in a positive sense rather than the negative sense.

  • Speaker #2

    Yeah. It's funny, because we interviewed Cass Sunstein some months ago on one of his new books. Cass tends to publish a new book every four months, which is incredible. And this was a book about sludge and the danger of sludge, in his case. What is really interesting, I think, Nuala, is that for each of these objectives, it's not only the SONIC strategies that you recommend: you also recommend very specific tools or processes to be successful at implementing the SONIC judgment strategy. Could you share one or two, maybe three, examples of the tools or processes you suggest for our listeners?

  • Speaker #0

    Yes, and again, they're specific to any situation, and it depends whether you're on the S, O, N, I, or C. So I'll give you an S and a C, the beginning and the end of one. And again, people like different ones. I like the Five Whys, because the Five Whys is based on a 1970s idea from Toyota, the car manufacturer. They developed a very simple approach to solving problems that's also now used in Six Sigma. I have merely adapted this for decision making in this case. So, you know, Toyota assumes that most technical problems are based on a human problem, and basically they ask themselves why five times to ascertain the underlying cause. And I think this is useful for, you know, probing false reasoning or checking your assumptions. And it does mitigate against biases like probability neglect or loss aversion or commitment escalation. So any combination is possible here. Why this? Why that? Why the other? It's the same as the 'and then, and then, and then'. So when you do that, it's easy to remember, but you might do it when you're feeling uncertain about a particular decision: well, what if I do that? And so what? And so what? And so what? And I think when you do that, it de-escalates the stress that you're under when you're actually making a decision. But in this case, it introduces decision friction and it slows you down. If you ask, you have to wait till you get to five, so you should be slowed down by three or four at least, by the time you've thought of the answers. So it's a way to slow you down. Another one, which is one of my favorites, is at the end. This is a way to get people to make sure that they actually do this. So initially, you need to commit: if you want to be a decision ninja, well, you need to at least commit to making an effort in the space. This is to overcome low implementation intentions and low willpower when people are deciding fast. So if you're a really fast decision maker and you've decided, I want to get this right and this really matters, well, then it's that if-then plan that addresses the behaviour-intention gap. And we know that this works, from appointment keeping to voting. So we know that pre-planning and commitment works. So if you pre-commit to making a better decision, it helps.

  • Speaker #2

    Yeah, I remember that Suzanne and I interviewed Todd Rogers just one month ago. I think he ran a wonderful experiment to demonstrate the power of implementing a plan to encourage action and not just intent. In his case, it was about voting.

  • Speaker #0

    Yes, absolutely right. And the fantastic studies they have done. And again, this is just adapting the same principle. So if you commit to using the PERIMETERS checklist, and what I've done is I've put all the biases together under that PERIMETERS checklist, and that's available to people. Or you make a decision rule that says, I'm always going to distrust the information but verify, kind of a play on Ronald Reagan's 'trust, but verify' idea. Then it becomes a habit. And we all know about the power of habits. And then you have increased your probability of reflecting more and reducing error.

  • Speaker #2

    Nuala, now we have delved into the details, and again, as Suzanne mentioned before, we do recommend our listeners read your book, because there are so many concrete recommendations to help us make better decisions. But could you help us take a step back now and, in a nutshell, summarize for our listeners your main advice for making smart decisions?

  • Speaker #0

    I'll go back to the one that I mentioned earlier on, Eric, and that is: tuning out is a hidden source of misinformation, but tuning in is a hidden source of opportunity. And there's a fantastic Cherokee proverb that summarizes that, which is: if you listen to the whispers, you won't have to hear the screams. And that's the same as saying diagnose the information before you prescribe. And I think when people tune in to others, not only do they hear the right messages and hear what other people don't, giving them an advantage, but other people will tune in to them. And we do know that, because you relate better, you'll understand other people more, and by definition you win. So tune in to win; tune out, lose out. And I think if people remember that, that's pretty easy.

  • Speaker #2

    Okay. Just before my last question, an additional question. I know that you work with organization leaders, CEOs and so on. It is what we are also fortunate to do at BVA Nudge Consulting. Do you have a specific recommendation for leaders, organization leaders, to make smarter decisions?

  • Speaker #0

    I think it's in the book, Eric. I think that they should remember the PERIMETERS traps, if they choose to, because there are so many there. And again, it's too easy to just say, listen better or tune in differently. But you should slow down and strategically reconsider the voices that you listen to and the voices that you don't. We know that there's a problem of unheard voices. We know that there's a polarised society. We know that most people feel unheard, whether you're an employee, whether you're a citizen, whether you're a customer. People are feeling unheard now, way more than ever before. As a decision maker, as a leader, when you choose to listen to voices differently, you know, whether it's ego, conscience, the voice of comfort, familiarity, when you reconsider the voices you listen to, you will make better decisions.

  • Speaker #2

    Now my final question for you, as we are at the end of our really insightful conversation. You have conducted a really brilliant analysis of the external and internal causes that explain why it's so difficult to make decisions. You also propose very relevant solutions. But my question, in the end, is this: aren't you suggesting something that's very challenging, namely that humans become System 2 decision makers when we are intrinsically System 1 beings? How do you suggest we balance our imperfect human side with our best side?

  • Speaker #0

    It goes back to the commitment, Eric. If you hold a position of power, it is your moral duty to get it right. When you are in a position of power impacting other people's lives, the onus is on you, the responsibility is on you, to do it. And that requires a commitment to at least trying to make better decisions. Now, most of the time you might get them right, but the consequences are too high when you get them wrong. And I think that's what it is. Look at the consequences when people get it wrong. And that's the problem. Do you want to be on the right side of history or the wrong side of history? I think misjudgment is not entirely your fault in this noisy world. You know, we've got our internal problems, as you've just said, internal misinformation, external disinformation. But judgment is a choice, and you choose to get this right. You choose to be a decision ninja or you don't. But when you hold positions of power, other people's welfare, lives, livelihoods, happiness depend on you and your choices. I think it's incumbent on you to do it.

  • Speaker #1

    Yes. Thank you so much again, Nuala. This was a fantastic conversation and we're so happy that you joined us today. Is there anything you would like to leave our audience with, such as where they can find out more about you and your work?

  • Speaker #0

    Sure. You can obviously get the book, which is available on Amazon and all leading outlets. But in terms of the actual work, over a hundred articles and a lot more information on, you know, different speaking events and things that I've done are on the website. And that is www.nualawalsh.com. So N-U-A-L-A-W-A-L-S-H dot com.

  • Speaker #2

    Thanks a lot, Nuala. It was a wonderful conversation. You are a very engaging speaker. And again, Suzanne and I do recommend reading your book, because it is 350 pages of concrete and helpful recommendations. Thanks a lot.

  • Speaker #0

    Well, thank you both very much. It's been lots of fun.

    Be Good, a podcast by the BVA Nudge Unit.

Chapters

  • Introduction and Career

    01:02

  • Inspiration for the New Book, Tune In

    06:20

  • Unpacking Research Techniques

    09:48

  • Four Key Factors that Affect our Judgement

    11:45

  • Understanding the Trilogy of Error

    15:21

  • Structural Traps in Human Decision Making

    20:32

  • How to Tune into the Voices that Matter

    29:19

Description

šŸŽ§ In this episode we're excited to welcome Nuala Walsh author of the new book, Tune In: How to Make Smarter Decisions in a Noisy World. Named among the 100 most influential women in finance with over three decades of experience in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen, she brings a wealth of knowledge to today's conversation. Currently serving as the CEO at Mind Equity, she advises organizations on behavior change, culture, and reputation.


Join us as we delve into her journey exploring the nuances of decision-making in a complex world and uncover strategies to navigate the noise and make smarter choices āœØ


During this conversation, weā€™ll explore:Ā 

  • šŸ“£ Four key factors that influence our ability to tune inĀ 

  • šŸ§  Cognitive challenges that affect our judgment and decision-making abilitiesĀ 

  • šŸ”Ž Practical case studies and behavioral science research for improving our judgementĀ 

Ā 

To learn more about Walsh's work visit www.nualawalsh.com.


šŸ‘‰Join the conversation and share your thoughts about this podcast on Twitter @BVANudgeConsult. Donā€™t have social media? Our inbox is always open at contact@bvanudgeconsulting.com.Ā Ā 


Hosted by Ausha. See ausha.co/privacy-policy for more information.

Transcription

  • Speaker #0

    Welcome to the Be Good Podcast,

  • Speaker #1

    where we explore the application of behavioral economics for good in order to nudge better business and better lives.

  • Speaker #2

    Hi and welcome to this episode of Be Good, brought to you by BVA Notch Consulting, a global consultancy specializing in the application of behavioral science for successful behavior change. Every month we get to speak with a leader in the field of behavioral science, psychology and neuroscience in order to get to know more about them, their work and its application to emerging issues. My name is Eric Singler, founder and CEO of BVN Notch Consulting, and with me is my colleague, Suzanne Kirkendall, CEO of BVN Notch Consulting North America. Hi, Suzanne.

  • Speaker #0

    Hi, Eric. I'm very happy to be back for another episode and delighted to be introducing today's guest, Nuala Walsh. Nuala is an award-winning non-executive director, behavioral scientist, TEDx speaker, and author. Named among the 100 most influential women in finance, she spent three decades in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen as Chief Marketing Officer. Today, as CEO at MindEquity and a founding director of the Global Association of Behavioral Scientists, she advises on behavior change, culture, and reputation. Noala holds multiple appointments industry-wide. She is non-executive director at British and Irish Lions, president of Harvard Club of Ireland, chair of the Innocence Project London, council member at the Football Association, and former vice chair at UN Women UK. Her insights have been published in Forbes, Inc., Psychology Today, Harvard Business Review, the Financial Times, Fox Business, and BBC. And very excitingly, Nuala has just published an amazing book called Tune In, How to Make Smarter Decisions in a Noisy World, which is going to be at the heart of our conversation today. Nuala, welcome to our Be Good podcast.

  • Speaker #1

    Thank you very much. It is a huge pleasure to be here.

  • Speaker #2

    So thanks a lot, Nuala, again for being with us today. Before talking about your amazing book, we would like to know a little more about... You and your career! I think you received a master degree in behavioral science and business studies alongside a degree in philosophy. Can you tell us about how you came to be interested in behavioral science in general, and maybe specifically about your interest in decision-making process?

  • Speaker #1

    Sure, Eric. You're absolutely right. I've actually always been interested in human behavior. I just took a slightly circuitous route to get there. Because when I finished my first degree in Trinity, I actually studied forensic psychology straight after that. So that was always at the heart of what I wanted to do and was interested to do. And then I just had a 30 year career in the meantime before I went back to study at the London School of Economics to do the master's in behavioural science. But my interest, I think, was always there. So and it's since then that I've really applied the the insights that I learned there to business. And my thesis was actually on whistleblowing and the bystander effect. So that is not quite forensic psychology, but there is an element there, I think of that criminology. And then, as you say, when I set up my own consultancy, where I do advise firms, as you mentioned earlier on, and I sit on boards, I was able to see the mistakes that people made and they were preventable mistakes. So all of the theory, if you like, suddenly came into vogue and it became very clear why people were making these decisions and more importantly, how preventable they were. And that sort of led me to the point where, well, if mistakes are predictable, they're also preventable. And if only people knew the theory behind it. I think they would be much more equipped and enabled to prevent some of those errors.

  • Speaker #2

    Could you share now with us if you have any mentors that had a particularly strong influence on you? Do you have maybe any researchers or other people who have played an influential role in your professional career and in your interest in behavioral science?

  • Speaker #1

    quite a few I mean you know well you've interviewed most of them on this podcast I think there are so many researchers but I will say that in the beginning I attended a course in Harvard with Jennifer Lerner who is the expert on emotion and many other things but she teaches she teaches people how to be a decision architect and that was probably the first one that I had attended and and that was before I did the the master's in in the LSE so I sort of attribute that to the start and her influence there. And in fairness, she's been a supporter ever since, and she now sits on our board in Gabs as well. But I would also, I would pick out the LSE faculty, all of them, you know, I won't embarrass them by naming them, but they know who they are. And I think that group in particular, I think between them, they sparked or they created the spark, if you like. And that sort of, there are so many in this field, as you know, and they're all different. they're all really talented and they're experts in a particular area, which is probably why I've got 500, you know, references in the book. There's so many of them to pick out. But I think, whereas those individuals might have created this market, COVID provided the opportunity, believe it or not. So the timing and the chance to actually do it after so many years of being in the field and actually then applying the theory to people's reality.

  • Speaker #0

    Fantastic. So, Nuala, as I mentioned earlier, your recent book, Tune In, How to Make Smarter Decisions in a Noisy World, was just published. Before we discuss the content of it, can you tell us more about the inspiration behind writing it? How did the idea for the book come to you?

  • Speaker #1

    Sure. Well, Suzanne, I actually always wanted to write a book. I just wasn't sure what kind of book I was ever going to write. And people do say that there's a book in all of us. And I do think there is a book in all of us. We certainly all have enough stories and enough experience. education, if you like, to teach other people or lessons what we get right and what we get wrong. The output wasn't what I thought. Before I had done this master's, I was probably going to write a book along the lines of Mark McCormack's, what they didn't teach you in Harvard Business School, because that's the first one that I read when I started my career. And I always loved it and thought it was great. And I used it as a bit of a manual. So in my head, I always said, oh, I'll teach people the 50 tips and all the tricks and what you get wrong. So but when I did the master's, I learned something different. I guess it opened my mind to something different. So the output, the result is, of course, something that's completely rooted in science now, rather than me and my experience, which might have been the earlier book, if you like. So and then when I did that, when I saw all those excellent researchers and experts in this field, I guess I wanted to contribute in some way. So and that's why I thought. there are a lot of books on decision making. There are a lot of books on judgment. And that's why I suppose you ask, how did the idea come? That's why I deliberately went out of my way to try and find something that I thought was different. And I think this idea of deaf spots has not been covered elsewhere in the field. And I wanted to focus on something that wasn't just a list of biases or something that was just an academic reference. I wanted it to be really practical that people could, yes, it was rooted in science, but would be able to... use as a mnemonic and refer to it pretty easily when they were in these high stakes situations. So the idea, the actual final output, of course, it's never exactly as you start out. So the refinement. of this idea of being tuned in or tuned out came from that. And I, on all of the stories and the examples I have, I had to put a filter on at the very end, or at least halfway through to say, was this person tuned in or tuned out? Can I attribute this to deaf ear syndrome or a deaf spot, or they didn't listen, or they were motivated to mishear or hear something. And that was sort of my lens for, you know, keeping people in and taking them out. And that led me to this idea that tuning out, you know, is a hidden source of misinformation, but tuning in is a hidden source of opportunity. So it is quite, you know, binary, polarised in that way. And, you know, it came from that really. And then it was easy. It was easy to evidence why it mattered because, you know, these mistakes cost the average Fortune 500. company, 250 million a year, at least, I can give you a really long list. We could be here for an hour talking about the mistakes and the evidence as to why people's bad decisions are so critical, particularly when people hold power.

  • Speaker #0

    Absolutely. Yeah. So that title, Tuning In, makes so much sense. Can you tell us more about your research method? You talked a lot about all of your references, but also all of your experiences. Can you tell us how you combine those two?

  • Speaker #1

    Well, I didn't do RCTs and somebody once asked me, you know, is this all based on primary research? Well, it isn't primary research in that sense of RCTs. It is a blend of primary and secondary. Primary is my own research, but it's deliberately a spectrum. There's a heavy reliance on practical case studies, as in real life case studies, supplemented by all of these behavioural science experiments. And so that's why I do have an index that spans 25 pages, because it is, I think Eric said earlier on, it is very rich in terms of its content. So the conceptual framework is rooted in science. And so are all of the key concepts. So it was a blend. So I was more comfortable with that rather than going out, starting and making it based on one particular. experiment and then I exploded it into a theory. I wanted it to be a compendium of the best of other people's thinking as well, but for me to add a slightly different slant to it, if that makes sense.

  • Speaker #0

    Absolutely. That makes total sense. And I know you've given us a little bit of a preview of some of the important concepts, but before Eric asks you for more details about all your great frameworks, can you tell our listeners: what is the one key takeaway? What would the headline of your book be?

  • Speaker #1

    Probably that what you hear is as important as what you see, and that we're more at risk than ever of tuning out the important voices and rushing to judgment. But if you at least remember the PERIMETERS traps and the need to tune in, and consciously think about who and what you're tuning into, I think you stand out in your field rather than lose out or miss out.

  • Speaker #2

    Okay, Nuala, now it's time to delve into the challenge of decision-making from an individual perspective. But before that, I'd like us to discuss what you call, I think, the external context within which our decisions are made. First, can you tell us why it's so tough to make good decisions in what you call today's noisy world? What are the main factors that shape our judgment?

  • Speaker #1

    Sure. Well, even before we get to that, Eric, it's probably useful to let people know that tuning in is the solution to the problem of tuning out. And it is a combination of two things that make us tune in or tune out: one is the context, the external context, and the second is our cognition. So looking at the external environment, I point to four factors. Now, I'm sure there are more, but I think these four in particular are increasing our vulnerability to error. The first one is that it's a speedy world. We all know that we live this fast-paced, frantic lifestyle, but that accelerates our speed of making decisions and our short-term thinking. And this whole ecosystem that we operate in today amplifies the probability of making fast choices, which aren't necessarily the best choices. The second one is data. We know that there is excessive data, and we also know that it's overwhelming. Yet it is our new normal, and we're forced to make decisions in that noisy sort of context. There's plenty of evidence for that; Microsoft finds that, I think it's 68% of employees just don't have enough uninterrupted time to do their jobs. There's a third layer, which I've introduced, which is quite different, and I don't think people particularly think about it: the difference between what we see and what we hear. I argue it's a very visual world, more so than before, and because it's visual, what we see dominates and skews our interpretation. You only need to think about Instagram or first impressions: we see what we want and we don't consider enough what we hear. One example of that can be stereotyping, from hiring to refereeing matches or even awarding loans. So that's a problem when it comes to misjudgment. And the last one is binary thinking. We live in a polarised world. Half the world is going to vote this year, and people are becoming more and more entrenched in their thinking. So we think in these polarised ways, and some of these binary classifications are just embedded in our systems and structures. If you're a marketeer, you segment your customers, and it's very sensible, but we still do it. If you look at your colleagues in the BVA nudge unit, do you think of them as introverts or extroverts, high or low potential, sporty or academic? When we do that, as you well know, we narrow our perspective. So despite the advances of the 21st century, with data and all of these things, the odds are stacked against good judgment in this high-speed visual world. When I work with my clients, as I'm sure you do, I find that these factors just diminish the time people have available to devote to tuning in and reinterpreting different conversations, and not making this time is a mistake and a judgment killer.

  • Speaker #2

    Yeah. There is this question: you mentioned these judgment killers. Could you explain for our listeners the three main judgment killers that you call the blind spot, the deaf spot and the dumb spot?

  • Speaker #1

    Well, indeed, Eric, and I've combined them and called them the trilogy of error. Sometimes I've called them the trilogy of terror, if you get them wrong. And you can guess what they are. You know what psychological blind spots are; they don't work in isolation on our judgment. So you've heard of those, but people haven't really heard of deaf spots. They do exist, actually, because a psychologist discovered these, or certainly wrote about these, in the 1960s. It refers to the failure to interpret what people say, in the same way that blind spots are about what people see. And then, of course, dumb spots are about what people hear or don't hear, and this is when people typically don't speak up, because if someone remains silent, by definition, you can't hear their voice. Again, the challenge is that we just don't think about this trilogy when we're under pressure or in uncertainty or crisis. But I thought it was quite a nice way to pull together the fact that there is this trilogy out there. And even if people already sense in their own way that these exist, it's a simple way for people to think twice and at least try to pause and make more measured decisions when they are under pressure or in crisis or uncertainty. So the skill is to reinterpret what people hear.

  • Speaker #2

    Yes, we were lucky to interview Amy Edmondson some weeks ago, for the Human Advantage but also for the podcast, on psychological safety, which is, I think, one key point that makes it very difficult to make a smart decision within an organization.

  • Speaker #1

    Absolutely, and she is the leader in that field, and she's absolutely right, because when there is no psychological safety, yes, there are dumb spots, but actually, when there's no safety, people tune out. People tune out of the voices that matter and they don't speak up, so they develop this combination of a blind spot and a deaf spot. Sometimes they come together, sometimes one dominates more than the other, but together they are lethal, and they are these judgment killers.

  • Speaker #2

    You also highlight that interpretation is itself a challenge, because we can't trust what we hear. Can you tell us more about this big gap, which is classical in behavioral science, between what is said and what is heard? Generally in behavioral science it is between intent and behavior; here it is between what's said and what's heard.

  • Speaker #1

    Yes, and this could be a very long answer as well, Eric, because you can bring into this cultural factors, what's heard versus what's said, language, euphemisms, tone of voice. So I was very conscious when I was writing this bit that it could be very long. What I was keen to make sure of was that people didn't think of this book as a book about listening, because it's not about listening. It's about interpreting what you hear. It's a second-order effect. We all can listen and hear, but we don't always choose to reinterpret. We take things at face value, we jump to conclusions, we misinterpret what we see. We encode, we don't decode, and that's an important differentiation. So this is about the difference between what's heard and what's meant, and also the extent to which you're willing to spend some bit of time decoding. It could be people's agendas, it could be anything, it could be people's intent, as you mentioned earlier on. When you don't do it, people just take things at face value. You can look at genocide, you can look at misinterpreted military instructions, because people didn't think about what they heard, they just obeyed. Or mishearing: I think there were some very powerful examples of people who genuinely just accidentally misheard air traffic controllers. Now, that's not to say you should always pause and reinterpret, because you might be in an emergency situation and not have time to do that. But in many cases, there is time. Or you might not diagnose an unusual illness because you jumped to conclusions and rushed to judgment, because we live in this noisy, fast-paced, data-filled, overwhelming world, and we just don't have time to slow down. I mean, aren't we all still catching up with yesterday and the day before? People can barely keep up, and it's just this cycle that keeps going. And that's why I outline a number of different solutions to boost interpretation, which are all in the third part of the book.

  • Speaker #0

    Excellent. So we've covered the external challenges for cognition. Now let's talk about the internal. You mention in your book that there are structural traps that lead us to making poor decisions, and you've come up with a great acronym to summarize these traps, which is PERIMETERS. For our listeners, I'm going to walk through that acronym real quick; each letter is a specific trap or bias. The P of PERIMETERS is for power. E is for ego. R is risk. I is identity. M is memory. E, ethics. T, time. E, emotion. R, relationships. And S is stories. So before we get into the details of some of these traps, Nuala, could you give us an overview of the major levels that you describe, which are individual biases, organizational traps, and those related to society?

  • Speaker #1

    Yes. And to be honest, really the point there is that this can affect people at every level; it can be individual, organizational or societal. An individual bias is probably fairly obvious: it relates to the ones for which you are responsible. It could be any of these, but ego is one that particularly relates to the individual. An organizational bias affects the collective. You talked about Amy Edmondson; it could be a conservative or a risk-taking culture. So risk can be an individual bias, and it can be an organizational bias based on the culture. And then from a societal perspective, the traps can be cause and effect. Stories is one, for example: the legends and folklore of a particular country, you know, the Loch Ness Monster or crying statues. Whether they're true or not, people believe them, and this affects your judgment. And then the traps can hit all three levels. For example, if you take following the crowd: an individual does it, an organisation certainly does it, and societies and different groups do it as well. So that's the point, and I deliberately interspersed examples in the book that hit all three, so that people wouldn't just think this only applies to an individual's decision making. It can be national leaders making decisions for countries, or organisation leaders. And then it's the collective, because the crowd decides as well; in many respects it's individuals in the crowd, but also the crowd as a collective, as you know.

  • Speaker #0

    For sure. And so why do you think each of these traps is dangerous for making smart decisions?

  • Speaker #1

    Because the evidence is there. I mean, individually they're dangerous, but collectively they're lethal, because we repeat them. And I think the evidence speaks for itself, whether you're talking about scams, scandals, missed signals, whether it's 9/11, human error. There are so many examples: fines, misconduct, miscarriages of justice, because the problem is we don't notice or reinterpret what we hear. So you have all of these examples, and as we've said, it's a combination of the environment and us. They're so hidden we don't notice them; we're so biased we don't think we make them. We think we're great decision makers; we don't even question whether we're making mistakes. Yet, as I said, history repeats itself: we still get scammed, businesses fail, ego traps predominate all the time, and we miss these signals and tune out. So by definition, it's always going to be a problem for people in their decision making.

  • Speaker #0

    And then for each of those traps in the book, you explain the specific mechanics that can lead us to making those poor decisions. As one example for our listeners: for the first trap, you describe the power-based traps, and you mention six different biases that come into play, which include authority bias, the halo effect, champion bias, the contrast effect, and the just-world hypothesis. Can you explain some of those biases and illustrate them for our listeners with some real-life examples?

  • Speaker #1

    Yes, and each of them has quite a few. I was never short of an example, Suzanne. Is that a good thing or a bad thing? It's probably a good thing if you're a writer and a bad thing for reality. So let me just take narrow focus. Narrow focus occurs when we're too goal-oriented and in search of power. I should say that power-related traps don't mean you have power or haven't got power; these are traps that occur when you're chasing power or afraid of losing power. They're power-related, rather than being about "I have power" or "I don't have power". Anyone at any level can have a narrow focus when they're too goal-oriented in search of power, or trying to protect the power they already have, and they make dangerously damaging decisions. Look back at Wells Fargo in 2016, when they knowingly fabricated millions of fake customer accounts. The leaders were goal-oriented, had a narrow focus in search of industry power, and they instructed their teams to achieve eight accounts per customer. The CEO, when he was interviewed by the Senate Banking Committee, just explained that eight rhymes with great. And, compounded by ego, he blamed his employees, not that toxic culture we were talking about earlier. Leadership in that situation turned a deaf ear to these stressed sales teams, and at the end of the day it cost the bank three billion and a CEO's career, because people accept the story they want to hear and turn a deaf ear to the rest. Another one, which is pretty obvious, is maybe authority bias, when we obey the person in charge and don't speak up enough. Lots of examples there. You could take Theranos with Elizabeth Holmes, where people knew that the Edison finger-prick technology was based on a lie and the majority stayed silent. Some did speak up, and that's how it came to light, but for too long the majority didn't. And then you have maybe the just-world hypothesis, which happens when we trust that good people get rewarded and evil punished. When you do that, you typically lose power, because it doesn't bear out in reality; we often over-trust the system. An example there might be the wrongfully convicted, or the British Post Office case, where employees trusted that justice would serve them. Yet it didn't, because they believed in a just world, and when you do that, you are naive; your view is slightly narrow. You might also see it in M&A, where the hardworking employee thinks they're going to get noticed or rewarded or promoted, but they naively overlook and don't decode the messages in that situation. They trust what they hear and don't reinterpret sentences, statements, promises, you know, or politics. We know that thinking is hard, and I think, as Carl Jung said, that's why most people judge. So these occur all the time. But the power-related ones, I think, are fascinating, because we all seek power to some extent. And again, it depends on your definition of power: you could be a parent and seek power in the house or over the children, or a professor or a politician or a president. So this one is really pervasive.

  • Speaker #0

    And unfortunately, we don't have time to go into each of the 10 traps in PERIMETERS, but we definitely encourage our listeners to read your book to learn about all of them in detail. But Nuala, according to you, which one do you consider the most dangerous and most common for creating decision misinformation? Well, I might disappoint you, Suzanne, because I actually don't consider one the most dangerous. They're all common, they're all dangerous, and they're also interrelated. If you take the risk-based traps, they are a function of ego and emotion and time. We mentioned power a minute ago; power-based traps are a function of ego. You see the crowd in there, the crowd effect. You see ethics in there. So you could take any scandal and analyse it in terms of the different weightings of these traps, if you like, whether it's the Ukraine war or 9/11. Pick anything, and you can absolutely translate it in terms of these. But the good news, of course, is that opportunity exists: once you know about these, you at least have half a chance of mitigating error and trying to prevent some of them. So it's not all doom and gloom. There is hope if you choose to spend time and rethink what you're doing.

  • Speaker #1

    And read the book to find out about all 10.

  • Speaker #0

    Well.

  • Speaker #2

    Yes, Nuala, we have understood, perhaps unfortunately, why it is so difficult to make a smart decision, a good decision: because of the external factors, because of our internal cognition and all these biases, which are all important and dangerous. But hopefully you have some solutions, and now it's time to talk about your solutions for making good decisions. So we understand that it is a really strong challenge, and our psychology, in addition to the external context, does not help us, with many of the traps you have summarized in this fantastic acronym PERIMETERS. But you suggest, you propose, you offer a roadmap and, I think, very concrete solutions to become what you call a decision ninja. Before discussing these concrete solutions, could you define what you mean by a decision ninja? Is it a superhuman?

  • Speaker #0

    Well, I think it's a superpower. I don't know about a superhuman, but it's someone who uses their superpower. It's very simply, Eric, somebody who's tuning in to the voices that really matter so that they stand out, as I said, rather than lose out. So it is someone who's intentional about interpretation, someone who consciously notices what's said and what's not said. It is a little bit aspirational, but it does encourage people to be more thoughtful and more responsible for the decisions they make. And lots of people do it: if you're a journalist or an investigator cracking cases, you do it. So yes, people get it wrong, but a lot of people also get it right.

  • Speaker #2

    Do you have in mind an example in real life of someone who is, from your perspective, a decision ninja?

  • Speaker #0

    I don't have one in particular, but in the book I made absolutely sure that I included at least one at the end of every chapter, so that people could learn from other people. So for people who faced these PERIMETERS traps: what did they do and how did they do it? You've got a lot of examples in there of people who get it right as well.

  • Speaker #2

    Okay, thanks. So your first solution, I would say, is about adopting a specific mindset that you call, and congrats for this, the AAA mindset, I don't know if that is the right pronunciation, for outcome anticipation, outcome attitude, and outcome acceptance. Could you explain each of these three factors that we need to control to make better decisions?

  • Speaker #0

    Yeah, I actually called it AAA, Eric, because it was a bit of a pun on my investment background. So those are the three A's. As you can see, I'm clearly trying to get people to remember some of this stuff by making it easy; another mnemonic for people, but AAA was easy. All it really says is that you control your own mindset: if you go into your decision making recognising that you can't control everything, you can still control your anticipation, your attitude to what happens, and whether you accept it or not. So you minimise the liability, if you like, when you do control these. And becoming this decision ninja relies on this being intentional, et cetera. So it's really just putting people into the right frame of mind. If you decide "woe is me, I have no control here", you're not really going to get very far anyway. So really, I was just trying to get people into a more positive mindset before they make the decision, before they adopt any of the approaches. That's the suggestion with the triple-A mindset: just an attitude of acceptance and anticipation. Predict what's going to happen, control your own attitude, and then when it happens, it happens, and deal with the consequences.

  • Speaker #2

    But it's not only a question of state of mind or mindset. You also suggest what you call the SONIC strategy. Again, to help your readers remember, there are five key actions: S for slow down, O for organize your attention, N for navigate novel perspectives, I for interrupt mindsets, and C for calibrate situations, strangers and strategies. Could you briefly summarize each of these five key objectives?

  • Speaker #0

    Yeah, and they are very, very brief, Eric, and you would do it as a process. I chose them deliberately; I mean, SONIC was deliberate because it plays on deaf spots, obviously. But you do it as a process. First of all, you need to just slow down to enable yourself to even challenge your thinking. If you don't do that, you're kind of not even at the races. Then you organise your attention, because in this frantic, noisy world I'm suggesting you do that so that you can actually be in the best position to make a decision. And there are three or four different strategies for each of these, all science-based, and all should be familiar to people. Thirdly, you navigate these novel perspectives so that you don't rush to judgment and you at least take other people's views into consideration. Then there's interrupting mindsets. I did quite like this one, because you're mentally putting on the brakes so that you again avoid that rush to judgment, or the assumption of validity. So you interrupt your own mindset, and you interrupt other people's as well, and again, there are two or three techniques you can use for that. And the last one, recalibrating situations and strangers and strategies, is the last stop to getting it right. That's using checklists and implementation intentions, techniques from the behavioural science field that people would be familiar with. But again, really, really easy, and people have the choice to try some and use these as befits the particular decision scenario.

  • Speaker #2

    Yeah, it's funny, because you mention, I think, friction and sludge, and the interest of putting friction into our decision-making process.

  • Speaker #0

    That's exactly right. The SONIC strategies are based on decision friction, and a lot of people don't like friction, as we know. But if you deliberately slow down, it's like a speed bump for the mind: you slow down in the initial stages so that you can at least make a better decision. If you keep hurtling along, you might make a great decision; I mean, some people make great decisions at speed, but you're more likely to make a mistake than if you at least reconsider it. So you're absolutely right. I'm using decision friction in a positive sense rather than the negative sense.

  • Speaker #2

    Yeah. It's funny, because some months ago we interviewed Cass Sunstein on one of his new books, and Cass publishes a new book every four months, which is incredible. That one was about sludge and, in his case, the danger of sludge. What is really interesting, Nuala, is that for each of these objectives it's not only the SONIC strategies that you recommend; you also recommend very specific tools or processes for implementing the SONIC judgment strategy successfully. Could you share one or two, maybe three, examples of the tools or processes you suggest for our listeners?

  • Speaker #0

    Yes, and again, they're specific to the situation, and it depends whether you're on the S, O, N, I, or C. So I'll give you an S and a C, the beginning and the end. People like different ones, but I like the five whys, which is based on a 1970s idea from Toyota, the car manufacturer. They developed a very simple approach to solving problems that's also now used in Six Sigma; I have merely adapted it for decision making. Toyota assumes that most technical problems are rooted in a human problem, and basically they ask themselves "why" five times to ascertain the underlying cause. I think this is useful for probing false reasoning or checking your assumptions, and it does mitigate against biases like probability neglect or loss aversion or commitment escalation. Any combination is possible here: why this? why that? why the other? It's the same as asking "and then? and then? and then?". It's easy to remember, and you might do it when you're feeling uncertain about a particular decision: well, what if I do that? And so what? And so what? I think when you do that, it de-escalates the stress you're under when you're actually making a decision, but in this case it introduces decision friction and it slows you down. If you ask, you have to wait till you get to five, so you should be slowed down by three or four at least by the time you've thought of the answers. So it's a way to slow you down. Another one, which is one of my favorites, is at the end, and it's a way to make sure people actually do this. If you want to be a decision ninja, you need to at least commit to making an effort in this space. This is to overcome low implementation intentions, to overcome low willpower when people are deciding fast. So if you're a really fast decision maker and you've decided "I want to get this right and this really matters", well, then it's that if-then plan that addresses the behaviour-intention gap. And we know that this works, from appointment keeping to voting. We know that pre-planning and commitment work. So if you pre-commit to making a better decision, it helps.

  • Speaker #2

    Yeah, I remember that one month ago, with Suzanne, we interviewed Todd Rogers. He ran a wonderful experiment demonstrating the power of an implementation plan to encourage action, not just intent. In his case, it was about voting.

  • Speaker #0

    Yes, absolutely right, and those are fantastic studies they have done. Again, this is just adapting the same principle. So if you commit to using the PERIMETERS checklist, and what I've done is put all the biases together under that PERIMETERS checklist, which is available to people, or you make a decision rule that says "I'm always going to distrust the information but verify it", kind of a play on Ronald Reagan's "trust, but verify" idea, then it becomes a habit. We all know about the power of habits. And then you have increased your probability of reflecting more and reducing error.

  • Speaker #2

    Nuala, we have delved into the details and, as Suzanne mentioned before, we do recommend that our listeners read your book, because there are so many concrete recommendations to help us make better decisions. But could you help us take a step back now and, in a nutshell, summarize for our listeners your main advice for making smart decisions?

  • Speaker #0

    I'll go back to the one that I mentioned earlier on, Eric: tuning out is a hidden source of misinformation, but tuning in is a hidden source of opportunity. And there's a fantastic Cherokee proverb that summarizes that: if you listen to the whispers, you won't have to hear the screams. That's the same as saying diagnose the information before you prescribe. And I think when people tune in to others, not only do they hear the right messages and hear what other people don't, giving them an advantage, but other people will tune in to them. And we do know that, because you relate better, you'll understand other people more, and by definition you win. So tune in to win; tune out, lose out. And I think if people remember that, that's pretty easy.

  • Speaker #2

    Okay. Before my last question, an additional one. I know that you work with organization leaders, CEOs and so on; it is what we are also fortunate to do at BVA Nudge Consulting. Do you have a specific recommendation for organization leaders to make smarter decisions?

  • Speaker #0

    I think it's in the book, Eric. I think they should remember the PERIMETERS traps, or whichever of them they choose, because there are so many there. And again, it's too easy to just say listen better or tune in differently. But slow down and strategically reconsider the voices that you listen to and the voices that you don't. We know that there's a problem of unheard voices, we know that there's a polarised society, we know that most people feel unheard, whether you're an employee, a citizen or a customer. People are feeling unheard now, way more than ever before. As a decision maker, as a leader, when you choose to listen to voices differently, whether it's ego, conscience, the voice of comfort or familiarity, when you reconsider the voices you listen to, you will make better decisions.

  • Speaker #2

    Now my final question for you, as we are at the end of our really insightful conversation. You have conducted a really brilliant analysis of the external and internal causes that explain why it's so difficult to make good decisions, and you propose very relevant solutions. But my question, in the end, is this: aren't you suggesting something that's very challenging, namely that humans become System 2 decision makers when we are intrinsically System 1 beings? How do you suggest we balance our imperfect human side with our best side?

  • Speaker #0

    It goes back to the commitment, Eric. If you hold a position of power, it is your moral duty to get it right. When you are in a position of power, impacting other people's lives, the onus is on you, the responsibility is on you, to do it. And that requires a commitment to at least trying to make better decisions. Now, most of the time you might get them right, but the consequences are too high when you get them wrong. And I think that's what it is: look at the consequences when people get it wrong. That's the problem. Do you want to be on the right side of history or the wrong side of history? I think misjudgment is not entirely your fault in this noisy world; we've got our internal problems, as you've just said, internal misinformation and external disinformation. But judgment is a choice, and you choose to get this right. You choose to be a decision ninja or you don't. But when you hold positions of power, other people's welfare, lives, livelihoods and happiness depend on you and your choices. I think it's incumbent on you to do it.

  • Speaker #1

    Yes. Thank you so much again, Nuala. This was a fantastic conversation, and we're so happy that you joined us today. Is there anything you would like to leave our audience with, such as where they can find out more about you and your work?

  • Speaker #0

    Sure. You can obviously get the book, which is available on Amazon and all leading outlets. But in terms of the actual work, over a hundred articles and a lot more information on different speaking events and things I've done are on the website. And that is www.nualagwalsh.com, so N-U-A-L-A-G-W-A-L-S-H dot com.

  • Speaker #2

    Thanks a lot, Nuala. It was a wonderful conversation, and you are a very engaging speaker. And again, Suzanne and I do recommend reading your book, because it is 350 pages of concrete and helpful recommendations. Thanks a lot.

  • Speaker #0

    Well, thank you both very much. It's been lots of fun. Be Good, a podcast by the BVA Nudge Unit.

Chapters

  • Introduction and Career

    01:02

  • Inspiration for the New Book, Tune In

    06:20

  • Unpacking Research Techniques

    09:48

  • Four Key Factors that Affect our Judgement

    11:45

  • Understanding the Trilogy of Error

    15:21

  • Structural Traps in Human Decision Making

    20:32

  • How to Tune into the Voices that Matter

    29:19

Share

Embed

You may also like

Description

šŸŽ§ In this episode we're excited to welcome Nuala Walsh author of the new book, Tune In: How to Make Smarter Decisions in a Noisy World. Named among the 100 most influential women in finance with over three decades of experience in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen, she brings a wealth of knowledge to today's conversation. Currently serving as the CEO at Mind Equity, she advises organizations on behavior change, culture, and reputation.


Join us as we delve into her journey exploring the nuances of decision-making in a complex world and uncover strategies to navigate the noise and make smarter choices āœØ


During this conversation, weā€™ll explore:Ā 

  • šŸ“£ Four key factors that influence our ability to tune inĀ 

  • šŸ§  Cognitive challenges that affect our judgment and decision-making abilitiesĀ 

  • šŸ”Ž Practical case studies and behavioral science research for improving our judgementĀ 

Ā 

To learn more about Walsh's work visit www.nualawalsh.com.


šŸ‘‰Join the conversation and share your thoughts about this podcast on Twitter @BVANudgeConsult. Donā€™t have social media? Our inbox is always open at contact@bvanudgeconsulting.com.Ā Ā 


Hosted by Ausha. See ausha.co/privacy-policy for more information.

Transcription

  • Speaker #0

    Welcome to the Be Good Podcast,

  • Speaker #1

    where we explore the application of behavioral economics for good in order to nudge better business and better lives.

  • Speaker #2

    Hi and welcome to this episode of Be Good, brought to you by BVA Notch Consulting, a global consultancy specializing in the application of behavioral science for successful behavior change. Every month we get to speak with a leader in the field of behavioral science, psychology and neuroscience in order to get to know more about them, their work and its application to emerging issues. My name is Eric Singler, founder and CEO of BVN Notch Consulting, and with me is my colleague, Suzanne Kirkendall, CEO of BVN Notch Consulting North America. Hi, Suzanne.

  • Speaker #0

    Hi, Eric. I'm very happy to be back for another episode and delighted to be introducing today's guest, Nuala Walsh. Nuala is an award-winning non-executive director, behavioral scientist, TEDx speaker, and author. Named among the 100 most influential women in finance, she spent three decades in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen as Chief Marketing Officer. Today, as CEO at MindEquity and a founding director of the Global Association of Behavioral Scientists, she advises on behavior change, culture, and reputation. Noala holds multiple appointments industry-wide. She is non-executive director at British and Irish Lions, president of Harvard Club of Ireland, chair of the Innocence Project London, council member at the Football Association, and former vice chair at UN Women UK. Her insights have been published in Forbes, Inc., Psychology Today, Harvard Business Review, the Financial Times, Fox Business, and BBC. And very excitingly, Nuala has just published an amazing book called Tune In, How to Make Smarter Decisions in a Noisy World, which is going to be at the heart of our conversation today. Nuala, welcome to our Be Good podcast.

  • Speaker #1

    Thank you very much. It is a huge pleasure to be here.

  • Speaker #2

    So thanks a lot, Nuala, again for being with us today. Before talking about your amazing book, we would like to know a little more about... You and your career! I think you received a master degree in behavioral science and business studies alongside a degree in philosophy. Can you tell us about how you came to be interested in behavioral science in general, and maybe specifically about your interest in decision-making process?

  • Speaker #1

    Sure, Eric. You're absolutely right. I've actually always been interested in human behavior. I just took a slightly circuitous route to get there. Because when I finished my first degree in Trinity, I actually studied forensic psychology straight after that. So that was always at the heart of what I wanted to do and was interested to do. And then I just had a 30 year career in the meantime before I went back to study at the London School of Economics to do the master's in behavioural science. But my interest, I think, was always there. So and it's since then that I've really applied the the insights that I learned there to business. And my thesis was actually on whistleblowing and the bystander effect. So that is not quite forensic psychology, but there is an element there, I think of that criminology. And then, as you say, when I set up my own consultancy, where I do advise firms, as you mentioned earlier on, and I sit on boards, I was able to see the mistakes that people made and they were preventable mistakes. So all of the theory, if you like, suddenly came into vogue and it became very clear why people were making these decisions and more importantly, how preventable they were. And that sort of led me to the point where, well, if mistakes are predictable, they're also preventable. And if only people knew the theory behind it. I think they would be much more equipped and enabled to prevent some of those errors.

  • Speaker #2

    Could you share now with us if you have any mentors that had a particularly strong influence on you? Do you have maybe any researchers or other people who have played an influential role in your professional career and in your interest in behavioral science?

  • Speaker #1

    quite a few I mean you know well you've interviewed most of them on this podcast I think there are so many researchers but I will say that in the beginning I attended a course in Harvard with Jennifer Lerner who is the expert on emotion and many other things but she teaches she teaches people how to be a decision architect and that was probably the first one that I had attended and and that was before I did the the master's in in the LSE so I sort of attribute that to the start and her influence there. And in fairness, she's been a supporter ever since, and she now sits on our board in Gabs as well. But I would also, I would pick out the LSE faculty, all of them, you know, I won't embarrass them by naming them, but they know who they are. And I think that group in particular, I think between them, they sparked or they created the spark, if you like. And that sort of, there are so many in this field, as you know, and they're all different. they're all really talented and they're experts in a particular area, which is probably why I've got 500, you know, references in the book. There's so many of them to pick out. But I think, whereas those individuals might have created this market, COVID provided the opportunity, believe it or not. So the timing and the chance to actually do it after so many years of being in the field and actually then applying the theory to people's reality.

  • Speaker #0

    Fantastic. So, Nuala, as I mentioned earlier, your recent book, Tune In, How to Make Smarter Decisions in a Noisy World, was just published. Before we discuss the content of it, can you tell us more about the inspiration behind writing it? How did the idea for the book come to you?

  • Speaker #1

    Sure. Well, Suzanne, I actually always wanted to write a book. I just wasn't sure what kind of book I was ever going to write. And people do say that there's a book in all of us. And I do think there is a book in all of us. We certainly all have enough stories and enough experience. education, if you like, to teach other people or lessons what we get right and what we get wrong. The output wasn't what I thought. Before I had done this master's, I was probably going to write a book along the lines of Mark McCormack's, what they didn't teach you in Harvard Business School, because that's the first one that I read when I started my career. And I always loved it and thought it was great. And I used it as a bit of a manual. So in my head, I always said, oh, I'll teach people the 50 tips and all the tricks and what you get wrong. So but when I did the master's, I learned something different. I guess it opened my mind to something different. So the output, the result is, of course, something that's completely rooted in science now, rather than me and my experience, which might have been the earlier book, if you like. So and then when I did that, when I saw all those excellent researchers and experts in this field, I guess I wanted to contribute in some way. So and that's why I thought. there are a lot of books on decision making. There are a lot of books on judgment. And that's why I suppose you ask, how did the idea come? That's why I deliberately went out of my way to try and find something that I thought was different. And I think this idea of deaf spots has not been covered elsewhere in the field. And I wanted to focus on something that wasn't just a list of biases or something that was just an academic reference. I wanted it to be really practical that people could, yes, it was rooted in science, but would be able to... use as a mnemonic and refer to it pretty easily when they were in these high stakes situations. So the idea, the actual final output, of course, it's never exactly as you start out. So the refinement. of this idea of being tuned in or tuned out came from that. And I, on all of the stories and the examples I have, I had to put a filter on at the very end, or at least halfway through to say, was this person tuned in or tuned out? Can I attribute this to deaf ear syndrome or a deaf spot, or they didn't listen, or they were motivated to mishear or hear something. And that was sort of my lens for, you know, keeping people in and taking them out. And that led me to this idea that tuning out, you know, is a hidden source of misinformation, but tuning in is a hidden source of opportunity. So it is quite, you know, binary, polarised in that way. And, you know, it came from that really. And then it was easy. It was easy to evidence why it mattered because, you know, these mistakes cost the average Fortune 500. company, 250 million a year, at least, I can give you a really long list. We could be here for an hour talking about the mistakes and the evidence as to why people's bad decisions are so critical, particularly when people hold power.

  • Speaker #0

    Absolutely. Yeah. So that title, Tuning In, makes so much sense. Can you tell us more about your research method? You talked a lot about all of your references, but also all of your experiences. Can you tell us how you combine those two?

  • Speaker #1

    Well, I didn't do RCTs and somebody once asked me, you know, is this all based on primary research? Well, it isn't primary research in that sense of RCTs. It is a blend of primary and secondary. Primary is my own research, but it's deliberately a spectrum. There's a heavy reliance on practical case studies, as in real life case studies, supplemented by all of these behavioural science experiments. And so that's why I do have an index that spans 25 pages, because it is, I think Eric said earlier on, it is very rich in terms of its content. So the conceptual framework is rooted in science. And so are all of the key concepts. So it was a blend. So I was more comfortable with that rather than going out, starting and making it based on one particular. experiment and then I exploded it into a theory. I wanted it to be a compendium of the best of other people's thinking as well, but for me to add a slightly different slant to it, if that makes sense.

  • Speaker #0

    Absolutely. That makes total sense. And I know you've given us a little bit of a preview of some of the important concepts, but before Eric asks you more details about all your great frameworks, can you tell our listeners, what is the one key takeaway? What would the headline be of your book?

  • Speaker #1

    Probably that what you hear is as important as what you see. And that we're more at risk than ever of tuning out the important voices and rushing to judgment. But if you at least remember the perimeter is traps and the need to tune in and to consciously think about who and what you're tuning into, I think you stand out in your field rather than lose out or miss out.

  • Speaker #2

    Okay, Nuala, now it's time to delve into the challenge of decision-making from an individual perspective. But before, I'd like us to discuss what you call, I think, the external context within which our decisions are made. And first, can you tell us why it's so tough to make good decisions in what, again, you call today's noisy world? What are the main factors that shape our judgment?

  • Speaker #1

    Sure. Well, and I think even just before we mention that, Eric, it's probably just useful to let people know that tuning in is the solution to the problem of tuning out. And it is a combination of two things that make us tune in. or tune out. One is the context, the external context, and the second thing is our cognition. So looking at the external environment, I point to four factors. Now, I'm sure there are more, but I think these four in particular are increasing our vulnerability to error. The first one is it's a speedy world. And I think we all know that we live in this fast-paced, frantic lifestyle, but that accelerates our speed of making decisions and our short-term our short-term thinking. And this whole ecosystem that we operate in today amplifies that probability of making fast choices, which aren't necessarily the best choices. The second one is data. And we know that there is excessive data, and we also know that it's overwhelming. Yet it is our new normal, and we're forced to make decisions in that noisy sort of context. So there's plenty of evidence to that. I mean, Microsoft find that, I think it's 68%. employees just don't even have enough uninterrupted time for them to do their jobs. There's a third layer of which I've introduced, which is quite different. And I don't think people particularly think about it. And that's the difference between, you know, the arrow words, what we see and what we hear. So I argue it's a very visual world, more so than before. And because it's visual, what we see dominates and skews our interpretation. So you only need to think about Instagram or first impressions. we see what we want and we don't consider enough what we hear. And, you know, one example of that can be stereotyping from, you know, hiring to refereeing matches or, you know, even awarding loans. So that's a problem when it comes to misjudgment. And the last one is the fact that, you know, binary thinking. We live in a polarised world. Half the world is going to vote this year and people are becoming more and more entrenched in their thinking. So we think in these polarised ways. And some of these binary classifications are just embedded in our systems and structures. I mean, if you're a marketeer, you segment your customers and it's very sensible, but we still do it. If you look at your colleagues, you know, in the BVA nudge unit, do you think of your colleagues as introverts or extroverts or high or low potential, you know, sporty or academic? But when we do that, as you well know, we narrow our perspective when we think in this way. So despite... the advances of the 21st century with data and all of these things, the odds are stacked against good judgment in this high-speed visual world. So when I work with my clients, as I'm sure you do, I find that these factors just diminish the time that the people have available to devote to tuning in and, you know, reinterpreting different conversations and not making this time, you know, is a mistake and is a judgment killer.

  • Speaker #2

    Yeah. There is this question, you mentioned this judgment. killers. And could you again explain for our listeners, I think it is three main judgment killers that you call blind spot, deaf spot and dumb spot.

  • Speaker #1

    Well, indeed, Eric, and I've combined them and I've called them the trilogy of error. Sometimes I've called them the trilogy of terror, if you get them wrong as well. And you can guess what they are. I mean, you know what the psychological blind spots are. they don't work in isolation on our judgment. And so you've heard of those, but people haven't really heard of death spots, but they do exist actually, because there's been a psychologist. that discovered these or certainly wrote about these in the 1960s. But it refers to the failure to interpret what people say. In the same way, blind spots is about what people see. And then, of course, dumb spots is about what people hear or don't hear. And this is when people typically don't speak up. Because if someone remains silent, by definition, you can't hear their voices. And again, the challenge is that we just don't think about this trilogy when we're under pressure. or in uncertainty or crisis. But I thought it was quite a nice way to pull together, you know, the fact that there is this trilogy out there. And even if people think in their own way that these exist, again, it's a simple way for people to think twice and try and at least pause and make more measured decisions when they are under pressure or in crisis or uncertainty. So the skill is to reinterpret what people hear.

  • Speaker #2

    Yes, we have been lucky to interview some weeks ago for the Human Advantage, but also for the podcast, Amy Edmondson on cycle digital safety, which is, I think, one key point which makes within an organization very difficult to make a smart decision.

  • Speaker #1

    absolutely and and and she she is the the leader in that field and she's absolutely right because when there is no psychological safety it yes there are dumb spots but actually if when there's no safety people tune out so people tune out of the voices that matter and they and they don't speak up so they develop this combination of a blind spot and and a death spot so it's a combination so sometimes they come together sometimes it might be one that dominates more than the other but together they are lethal and they are these judgment killers

  • Speaker #2

    You also highlight that interpretation is also a challenge because we can't trust what we hear. Can you tell us more about this big gap, which is classical in behavioral science, between what is said and what is heard? Generally in behavioral science it is between intent and behavior, here it is between said and heard.

  • Speaker #1

    Yes, and this could be a very long answer as well, Eric, because you can bring into this, you know, cultural factors, you know, what's heard versus what's said, language, euphemisms, tone of voice. So I was very conscious when I was writing this bit that it could be very long. So what I was keen to make sure that people didn't think of this book as a book about listening, because it's not about listening. It's about interpreting what you hear. It's a second order effect. So we all can listen. and here but we don't always choose to reinterpret we take things at face value we um jump to conclusions we misinterpret you know what we see we encode we don't decode so that's an important differentiation so this is about the difference between what you see what you sorry what's heard and and what's what's what's meant is also the extent to which you're willing to spend some bit of time decoding and it could be people's agendas it could be anything it could be you know people people's intent as you mentioned earlier on. But when you don't do it, you only have to think of, you know, people just take things at face value. I mean, you can look at genocide, you can look at misinterpreted military instructions because people didn't think about what they heard, they just did and obeyed. Or, you know, mishearing. I think there were some very powerful examples of people who genuinely just accidentally misheard, you know, air traffic controllers. Now that's not... I should pause and reinterpret because you might be in an emergency situation and you might not have time to do that. But in many cases, there is time. So you might not diagnose an unusual illness. You just have jumped to conclusions and rushed to judgment because we live in this. noisy, fast-paced, data-filmed, overwhelming world, and we just don't have time to slow down. I mean, aren't we all still catching up with yesterday and the day before? People can barely keep up with yesterday, and it's just this cycle that keeps going. And that's why I outline a number of different solutions to boost interpretation, which are all in the third part of the book.

  • Speaker #0

    Excellent. So we've covered the external challenges for cognition. Now let's talk about the internal. You mentioned in your book that there are structural traps that lead us to making poor decisions, of course. And you mentioned the great acronym that you've come up to summarize these traps, which is perimeters. And for our listeners, I'm going to walk you through that acronym real quick. So each letter is a specific trap or bias. P of perimeters is for power. E is for ego. R is risk. I is identity. M is memory. E, ethics. T, time. E, emotion. R, relationships. And S is stories. So before we get into the details of some of these traps, Nuala, could you give us an overview around the major levels that you describe, which are individual biases, organizational traps, and those related to society?

  • Speaker #1

    Yes. And to be honest, really the point there is that this can affect people at every level. It can be individual, organizational or societal. So an individual bias is probably fairly obvious. It relates to the ones for which you are responsible for. And it could be ego, it could be any of these, but ego is one that is particularly referred to an individual. So an organization bias affects the collective. So you talked about Amy Edmondson, it could be a conservative or a risk-taking culture. So, you know, risk, but risk can also be an individual bias and it can be an organizational bias based on the culture. So and then from a societal perspective, the traps can be cause and effect. So stories is one, for example. So a legend and a folklore of a particular country and, you know, the Loch Ness Monster or crying statues, whether it's true or not, people believe them. And this affects your judgment. And then the traps can hit all three levels. So, for example, if you take following the crowd. You know, that's an individual, does it? An organisation certainly does it. And societies and different groups also do it as well. So the point is that. And I deliberately interspersed in the book examples that hit all three, so that people wouldn't just say or wouldn't just think this only applies to an individual's decision making. It can be, you know, a national, well, national leaders making decisions for countries or organisation leaders. And then it's the collective because the crowd decides as well, in many respects, individuals in the crowd, but the crowd as a collective, as you know.

  • Speaker #0

    For sure. And so why do you think each of these traps is dangerous for making smart decisions?

  • Speaker #1

    Because the evidence is there. I mean, individually they're dangerous, but collectively they're lethal, because we repeat them. And I think the evidence speaks for itself, whether you're talking about scams, scandals, missed signals, whether it's 9/11, human error. There are so many examples: fines, misconduct, miscarriages of justice, because the problem is we don't notice or reinterpret what we hear. And, as we've said, it's a combination of the environment and us. The traps are so hidden we don't notice them; we're so biased we don't think we make them. We think we're great decision makers, so we don't even question whether we're making mistakes. Yet, as I said, history repeats itself: we still get scammed, businesses fail, ego traps predominate all the time, and we miss these signals and tune out. So by definition it's always going to be a problem for people in their decision making.

  • Speaker #0

    And then for each of those traps in the book, you explain the specific mechanics that can lead us to making those poor decisions. One example for our listeners: for the first trap, you describe the power-based traps, and you mention six different biases that come into play, including authority bias, the halo effect, champion bias, the contrast effect, and the just-world hypothesis. Can you explain some of those biases and illustrate them for our listeners with some real-life examples?

  • Speaker #1

    Yes, and each of them does have quite a few. I was never short of an example, Suzanne. Is that a good thing or a bad thing? It's probably a good thing if you're a writer and a bad thing for reality. So even if I just take narrow focus: narrow focus occurs when we're too goal-oriented and in search of power. I should say that power-related traps don't mean you have power or haven't got power. These are traps that occur when you're chasing power or afraid of losing power. They're power-related, rather than being about I have power or I don't have power. Anyone at any level can have a narrow focus when they're too goal-oriented in search of power, or trying to protect the power that they already have, and they make dangerously damaging decisions. So even if you look back at, I don't know if you remember, Wells Fargo in 2016, when they knowingly fabricated millions of fake customer accounts. The leaders were goal-oriented, had a narrow focus in search of industry power, and they instructed their teams to achieve eight accounts per customer. And the CEO, when he was interviewed by the Senate Banking Committee, just explained that eight rhymes with great. Compounded by ego, he blamed his employees, not that toxic culture that we were talking about earlier. So leadership in that situation turned a deaf ear to these stressed sales teams. And at the end of the day it cost the bank three billion and a CEO's career, because people accept the story that they want to hear and they turn a deaf ear to the rest. Another one, which is pretty obvious, is maybe authority bias, when we obey the person in charge and don't speak up enough. Lots of examples there. You could take Theranos with Elizabeth Holmes, where people knew that the Edison finger-prick technology was based on a lie and the majority stayed silent. Some did speak up, and that's how it came to light, but for too long the majority didn't. And then you have maybe the just-world hypothesis, which happens when we trust that good people get rewarded and evil punished. When you do that, you typically lose power. It doesn't bear out in reality, because we often over-trust the system. An example there might be the wrongfully convicted, or the British Post Office example, where employees trusted that justice would serve them. But it didn't, because they believed in a just world, and when you do that you are naive; your view is slightly narrow. You might also see it in M&A, where the hardworking employee thinks they're going to get noticed or rewarded or promoted, but they naively overlook and don't decode the messages in that situation. They trust what they hear and they don't reinterpret sentences, statements, promises, you know, or politics. So we know that thinking is hard, and I think, as Carl Jung says, that's why most people judge. So these occur all the time. But the power-related ones, I think, are fascinating, because we all seek power to some extent. And again, it depends on your definition of power. You could be a parent and seek power in the house or over the children, or a professor or a politician or a president. So this one is really pervasive.

  • Speaker #0

    And unfortunately, we don't have time to go into each of the 10 traps in PERIMETERS, but we definitely encourage our listeners to read your book to learn about all of them in detail. But Nuala, according to you, which one do you consider the most dangerous and most common for creating decision misinformation?

    Well, I might disappoint you, Suzanne, because I actually don't consider one the most dangerous. They're all common, they're all dangerous, and they're also interrelated. If you take the risk-based traps, they are a function of ego and emotion and time. We mentioned power a minute ago; power-based traps are a function of ego. You see the crowd effect in there, you see ethics in there. So you could take any scandal and analyse it in terms of the different weightings of these traps, whether it's the Ukraine war or 9/11, pick anything, and you can absolutely translate it in terms of these. But the good news, of course, is that opportunity exists: once you know about these, you at least have half a chance of mitigating error and preventing some of them. So it's not all doom and gloom. There is hope if you choose to spend time and rethink what you're doing.

  • Speaker #1

    And read the book to find out about all 10.

  • Speaker #0

    Well.

  • Speaker #2

    Yes, Nuala, we have understood, and maybe it's unfortunate, I don't know, why it is so difficult to make a smart decision, a good decision, because of external factors and because of our internal cognition and all these biases, which are all important and dangerous. But hopefully you have some solutions, and now it's time to talk about your solutions for making good decisions. So we understand that it is a really strong challenge, and our psychology, in addition to the external context, does not help us, with many of the traps you have summarized in this fantastic acronym, PERIMETERS. But you suggest, you propose, you offer a roadmap and very concrete solutions to become what you call a decision ninja. So before discussing these concrete solutions, could you define what you mean by a decision ninja? Is it a superhuman?

  • Speaker #0

    Well, I think it's a superpower. I don't know about a superhuman, but it's someone who uses their superpower. It's very simply, Eric, somebody who's tuning in to the voices that really matter so that they stand out, as I said, rather than lose out. So it is someone who's intentional about interpretation, someone who consciously notices what's said and what's not said. It is a little bit aspirational, but it does encourage people to be more thoughtful and more responsible for the decisions they make. And lots of people do it: journalists, investigators cracking cases. So yes, people get it wrong, but a lot of people also get it right.

  • Speaker #2

    Do you have in mind an example in real life of someone who is, from your perspective, a decision ninja?

  • Speaker #0

    I don't have one in particular, but in the book I made absolutely sure that I included at least one at the end of every chapter, so that people could learn from others: people who faced these PERIMETERS traps, what did they do and how did they do it? So you've got a lot of examples in there of people who get it right as well.

  • Speaker #2

    Okay, thanks. So your first solution, I would say, is about adopting a specific mindset that you call, again, and congrats for this, the AAA mindset, I don't know if that is the right pronunciation, for outcome anticipation, outcome attitude, and outcome acceptance. Could you explain each of these three factors that we need to control to make better decisions?

  • Speaker #0

    Yeah, I actually called it AAA, Eric, because it was a bit of a pun on my investment background. So that was the three A's. As you can see, I'm clearly trying to get people to remember some of this stuff by making it easy; it's another mnemonic, but AAA was easy. All it really says is that you control your own mindset: if you go into your decision making thinking about the fact that you can't control everything, but you can control your anticipation, your attitude to what happens, and whether you accept it or not, then you minimize the liability, if you like, when you do control these. And becoming this decision ninja relies on this being intentional, et cetera. So it's really just putting people into the right frame of mind. If you decide, woe is me, I have no control here, you're not really going to get very far anyway. So really, I was just trying to get people into a more positive mindset before they make the decision, before they adopt any of the approaches. That's the suggestion with the triple-A mindset: just an attitude of acceptance and anticipation. Predict what's going to happen, control your own attitude, and then when it happens, it happens; deal with the consequences.

  • Speaker #2

    But it's not only a question of state of mind or mindset. You also suggest what you call the SONIC strategy. Again, to help your readers remember, the five key actions are: S for slow down, O for organize your attention, N for navigate novel perspectives, I for interrupt mindsets, and C for calibrate situations, strangers and strategy. Could you briefly summarize each of these five key objectives?

  • Speaker #0

    Yeah, and I'll keep them very, very brief, Eric. You would do it as a process. I deliberately chose them; I mean, SONIC was deliberate because it plays on deaf spots, obviously. But you do it as a process. First of all, you need to slow down to enable yourself to even challenge your thinking; if you don't do that, you're kind of not even at the races. Then you organise your attention; in this frantic, noisy world, I'm suggesting you do that so that you can actually be in the best position to make a decision. And there are, I should say, three or four different strategies for each of these, all science-based, all of which should be familiar to people. Thirdly, you navigate these novel perspectives so that you don't rush to judgment and you at least take other people's views into consideration. Then interrupting mindsets: I did quite like this one, because you're mentally putting on the brakes so that you again avoid that rush to judgment or the assumption of validity. So you interrupt your own mindset and you interrupt other people's as well, and again there are two or three techniques you can use for that. And then the last one, calibrating situations and strangers and strategies, is the last stop to getting it right. That's using checklists and implementation intentions, techniques that people in the behavioural science field would certainly be familiar with. But again, really, really easy, and people have the choice to try some and use these as befits the particular decision scenario.

  • Speaker #2

    Yeah, it's funny, because you mention, I think, friction and sludge, and the value of putting friction into our decision-making process.

  • Speaker #0

    That's exactly right. The SONIC strategies are based on decision friction. And a lot of people don't like friction, as we know. But if you deliberately slow down, it's like a speed bump for the mind: you slow down at the initial stage so that you can at least make a better decision. If you keep hurtling along, you might make a great decision, I mean, some people make great decisions at speed, but you're more likely to make a mistake if you don't at least reconsider it. So you're absolutely right: I'm using decision friction in a positive sense rather than the negative sense.

  • Speaker #2

    Yeah. It's funny, because some months ago we interviewed Cass Sunstein on one of his new books; Cass publishes a new book every four months or so, which is incredible. And that one was about sludge and, in his case, the danger of sludge. What is really interesting, Nuala, is that for each of these objectives it's not only the SONIC strategies that you recommend; you also recommend very specific tools or processes to be successful at implementing the SONIC judgment strategy. Could you share one or two, maybe three, examples of the tools or processes you suggest for our listeners?

  • Speaker #0

    Yes, and again, they're specific to the situation, and it depends whether you're on the S, O, N, I, or C. So I'll give you an S and a C, the beginning and the end of the process. People like different ones, but I like the five whys, because the five whys is based on a 1970s idea from Toyota, the car manufacturer. They developed a very simple approach to solving problems that's also now used in Six Sigma, and I have merely adapted it for decision making in this case. Toyota assumes that most technical problems are based on a human problem, and basically they ask themselves why five times to ascertain the underlying cause. I think this is useful for probing false reasoning or checking your assumptions, and it does mitigate against biases like probability neglect or loss aversion or commitment escalation. Any combination is possible here: why this? why that? why the other? It's the same as asking "and then? and then? and then?". It's easy to remember, and you might do it when you're feeling uncertain about a particular decision: well, what if I do that? And so what? And so what? I think when you do that, it de-escalates the stress that you're under when you're actually making the decision. But in this case it introduces decision friction and it slows you down: if you ask, you have to wait till you get to five, so you should be slowed down by three or four, at least by the time you've thought of the answers. So it's a way to slow you down. Another one, which is one of my favorites, is at the end, and it's a way to get people to make sure that they actually do this. Initially you need to commit: if you want to be a decision ninja, you need to at least commit to making an effort in this space. So this is implementation intentions, to overcome low willpower when people are deciding fast. If you're a really fast decision maker and you've decided, I want to get this right and this really matters, well, then it's that if-then plan that addresses the behaviour-intention gap. We know that this works, from appointment keeping to voting; we know that pre-planning and commitment work. So if you pre-commit to being a better decision maker, it helps.
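To make the mechanics of the five-whys probe concrete, here is a minimal sketch in Python of how such a questioning loop might be structured. It is an illustration only: the function name, prompts, and the idea of capturing answers in a list are assumptions for this sketch, not something specified in the book or the conversation.

```python
# Illustrative sketch of a "five whys" decision probe.
# The prompts and structure are assumptions for illustration only;
# the technique is described conversationally in the interview, not as code.

def five_whys(decision: str, ask=input) -> list[str]:
    """Ask 'why' five times about a pending decision and return the chain
    of answers. The forced wait until the fifth 'why' is the decision friction."""
    chain = []
    question = f"Why do I want to do this: '{decision}'?"
    for step in range(1, 6):
        answer = ask(f"Why #{step}: {question} ")
        chain.append(answer)
        # Each answer becomes the subject of the next 'why',
        # digging toward an underlying cause.
        question = f"Why is it true that '{answer}'?"
    return chain

if __name__ == "__main__":
    reasons = five_whys("approve the acquisition this week")
    print("Candidate underlying cause:", reasons[-1])
```

The point of the sketch is simply that the loop cannot finish before five answers have been given, which is the slowing-down effect described above.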

  • Speaker #2

    Yeah, I remember that one month ago, with Suzanne, we interviewed Todd Rogers. I think he ran a wonderful experiment to demonstrate the power of implementation plans to encourage action and not just intention. In his case, it was about voting.

  • Speaker #0

    Yes, absolutely right, and fantastic studies they have done. Again, this is just adapting the same principle. So if you commit to using the PERIMETERS checklist, and what I've done is put all the biases together under that PERIMETERS checklist, which is available to people, or you make a decision rule that says, I'm always going to distrust the information but verify it, a kind of play on the Ronald Reagan "trust, but verify" idea, then it becomes a habit. And we all know about the power of habits. And then you have increased your probability of reflecting more and reducing error.
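As a purely hypothetical sketch, an if-then pre-commitment and a trap checklist of the kind described here could be represented as follows. The trap names follow the PERIMETERS acronym from the conversation, but the rule wording, function name, and review flow are assumptions added for illustration.

```python
# Illustrative only: a pre-committed "if-then" decision rule plus a simple
# PERIMETERS-style checklist. The rule text is an assumption, not a quotation.

PERIMETERS = ["Power", "Ego", "Risk", "Identity", "Memory",
              "Ethics", "Time", "Emotion", "Relationships", "Stories"]

IF_THEN_RULES = {
    # trigger -> pre-committed response (closing the behaviour-intention gap)
    "high-stakes decision": "pause and run the PERIMETERS checklist",
    "unverified information": "distrust first, then verify the source",
}

def checklist_review(decision: str) -> dict[str, bool]:
    """Walk through each trap and record whether it was considered."""
    return {trap: input(f"{decision}: considered the {trap} trap? (y/n) ").strip().lower() == "y"
            for trap in PERIMETERS}

if __name__ == "__main__":
    answers = checklist_review("sign the vendor contract")
    missed = [trap for trap, done in answers.items() if not done]
    print("Traps still to think about:", missed or "none")
```

The design intent mirrors the habit idea above: the rule is decided in advance, so in the moment the only effort is to follow it.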

  • Speaker #2

    Now, Nuala, we have delved into the details, and again, as Suzanne mentioned before, we do recommend our listeners read your book, because there are so many concrete recommendations to help us make better decisions. But could you help us take a step back now and, in a nutshell, summarize for our listeners your main advice for making smart decisions?

  • Speaker #0

    I'll go back to the one that I mentioned earlier on, Eric, and that is tuning out. Tuning out is a hidden source of misinformation, but tuning in is a hidden source of opportunity. And there's a fantastic Cherokee proverb that summarizes that: if you listen to the whispers, you won't have to hear the screams. That's the same as saying diagnose the information before you prescribe. And I think when people tune in to others, not only do they hear the right messages and hear what other people don't, giving them an advantage, but other people will tune in to them. And we do know that, because you relate better, you'll understand other people more, and by definition you win. So tune in to win; tune out, lose out. And I think if people remember that, that's pretty easy.

  • Speaker #2

    Okay. Before my last question, just one additional question. I know that you work with organization leaders, CEOs and so on; it is what we also are fortunate to do at BVA Nudge Consulting. Do you have a specific recommendation for leaders, organization leaders, to make smarter decisions?

  • Speaker #0

    I think it's in the book, Eric. I think that if they remember the PERIMETERS traps, if they choose to, because there are so many in there. And again, it's too easy to just say, listen better or tune in differently. But if you slow down and strategically reconsider the voices that you listen to and the voices that you don't: we know that there's a problem of unheard voices, we know that there's a polarised society, we know that most people feel unheard, whether you're an employee, a citizen or a customer. People are feeling unheard now way more than ever before. As a decision maker, as a leader, when you choose to listen to voices differently, whether it's ego, conscience, the voice of comfort or familiarity, when you reconsider the voices you listen to, you will make better decisions.

  • Speaker #2

    Now my final question for you, as we are at the end of our really insightful conversation. You have conducted a really brilliant analysis of the external and internal causes that explain why it's so difficult to make decisions, and you also propose very relevant solutions. But my question, in the end, is this: aren't you suggesting something very challenging, namely that humans become System 2 decision makers when we are intrinsically System 1 beings? How do you suggest we balance our imperfect human side with our best side?

  • Speaker #0

    It goes back to the commitment, Eric. If you hold a position of power, it is your moral duty to get it right. When you are in a position of power, impacting other people's lives, the onus is on you, the responsibility is on you, to do it. And that requires a commitment to at least trying to make better decisions. Now, most of the time you might get them right, but the consequences are too high when you get them wrong. And I think that's what it is: look at the consequences when people get it wrong. That's the problem. Do you want to be on the right side of history or the wrong side of history? I think misjudgment is not entirely your fault in this noisy world; as you've just said, we've got our internal problems, internal misinformation, external disinformation. But judgment is a choice, and you choose to get this right. You choose to be a decision ninja or you don't. But when you hold positions of power, other people's welfare, lives, livelihoods and happiness depend on you and your choices. I think it's incumbent on you to do it.

  • Speaker #1

    Yes. Thank you so much again, Nuala. This was a fantastic conversation, and we're so happy that you joined us today. Is there anything you would like to leave our audience with, such as where they can find out more about you and your work?

  • Speaker #0

    Sure. You can obviously get the book, which is available on Amazon and all leading outlets. But in terms of the actual work, I have over a hundred articles, and a lot more information on the different speaking events and other things that I've done is on the website. And that is www.nualawalsh.com, so N-U-A-L-A-W-A-L-S-H dot com.

  • Speaker #2

    Thanks a lot, Nuala. It was a wonderful conversation; you are a very engaging speaker. And again, Suzanne and I do recommend reading your book, because it is 350 pages of concrete and helpful recommendations. Thanks a lot.

  • Speaker #0

    Well, thank you both very much. It's been lots of fun.

    Be Good, a podcast by the BVA Nudge Unit.

Chapters

  • Introduction and Career

    01:02

  • Inspiration for the New Book, Tune In

    06:20

  • Unpacking Research Techniques

    09:48

  • Four Key Factors that Affect our Judgement

    11:45

  • Understanding the Trilogy of Error

    15:21

  • Structural Traps in Human Decision Making

    20:32

  • How to Tune into the Voices that Matter

    29:19

Description

šŸŽ§ In this episode we're excited to welcome Nuala Walsh author of the new book, Tune In: How to Make Smarter Decisions in a Noisy World. Named among the 100 most influential women in finance with over three decades of experience in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen, she brings a wealth of knowledge to today's conversation. Currently serving as the CEO at Mind Equity, she advises organizations on behavior change, culture, and reputation.


Join us as we delve into her journey exploring the nuances of decision-making in a complex world and uncover strategies to navigate the noise and make smarter choices āœØ


During this conversation, weā€™ll explore:Ā 

  • šŸ“£ Four key factors that influence our ability to tune inĀ 

  • šŸ§  Cognitive challenges that affect our judgment and decision-making abilitiesĀ 

  • šŸ”Ž Practical case studies and behavioral science research for improving our judgementĀ 

Ā 

To learn more about Walsh's work visit www.nualawalsh.com.


šŸ‘‰Join the conversation and share your thoughts about this podcast on Twitter @BVANudgeConsult. Donā€™t have social media? Our inbox is always open at contact@bvanudgeconsulting.com.Ā Ā 


Hosted by Ausha. See ausha.co/privacy-policy for more information.

Transcription

  • Speaker #0

    Welcome to the Be Good Podcast,

  • Speaker #1

    where we explore the application of behavioral economics for good in order to nudge better business and better lives.

  • Speaker #2

    Hi and welcome to this episode of Be Good, brought to you by BVA Notch Consulting, a global consultancy specializing in the application of behavioral science for successful behavior change. Every month we get to speak with a leader in the field of behavioral science, psychology and neuroscience in order to get to know more about them, their work and its application to emerging issues. My name is Eric Singler, founder and CEO of BVN Notch Consulting, and with me is my colleague, Suzanne Kirkendall, CEO of BVN Notch Consulting North America. Hi, Suzanne.

  • Speaker #0

    Hi, Eric. I'm very happy to be back for another episode and delighted to be introducing today's guest, Nuala Walsh. Nuala is an award-winning non-executive director, behavioral scientist, TEDx speaker, and author. Named among the 100 most influential women in finance, she spent three decades in investment management at BlackRock, Merrill Lynch, and Standard Life Aberdeen as Chief Marketing Officer. Today, as CEO at MindEquity and a founding director of the Global Association of Behavioral Scientists, she advises on behavior change, culture, and reputation. Noala holds multiple appointments industry-wide. She is non-executive director at British and Irish Lions, president of Harvard Club of Ireland, chair of the Innocence Project London, council member at the Football Association, and former vice chair at UN Women UK. Her insights have been published in Forbes, Inc., Psychology Today, Harvard Business Review, the Financial Times, Fox Business, and BBC. And very excitingly, Nuala has just published an amazing book called Tune In, How to Make Smarter Decisions in a Noisy World, which is going to be at the heart of our conversation today. Nuala, welcome to our Be Good podcast.

  • Speaker #1

    Thank you very much. It is a huge pleasure to be here.

  • Speaker #2

    So thanks a lot, Nuala, again for being with us today. Before talking about your amazing book, we would like to know a little more about... You and your career! I think you received a master degree in behavioral science and business studies alongside a degree in philosophy. Can you tell us about how you came to be interested in behavioral science in general, and maybe specifically about your interest in decision-making process?

  • Speaker #1

    Sure, Eric. You're absolutely right. I've actually always been interested in human behavior. I just took a slightly circuitous route to get there. Because when I finished my first degree in Trinity, I actually studied forensic psychology straight after that. So that was always at the heart of what I wanted to do and was interested to do. And then I just had a 30 year career in the meantime before I went back to study at the London School of Economics to do the master's in behavioural science. But my interest, I think, was always there. So and it's since then that I've really applied the the insights that I learned there to business. And my thesis was actually on whistleblowing and the bystander effect. So that is not quite forensic psychology, but there is an element there, I think of that criminology. And then, as you say, when I set up my own consultancy, where I do advise firms, as you mentioned earlier on, and I sit on boards, I was able to see the mistakes that people made and they were preventable mistakes. So all of the theory, if you like, suddenly came into vogue and it became very clear why people were making these decisions and more importantly, how preventable they were. And that sort of led me to the point where, well, if mistakes are predictable, they're also preventable. And if only people knew the theory behind it. I think they would be much more equipped and enabled to prevent some of those errors.

  • Speaker #2

    Could you share now with us if you have any mentors that had a particularly strong influence on you? Do you have maybe any researchers or other people who have played an influential role in your professional career and in your interest in behavioral science?

  • Speaker #1

    quite a few I mean you know well you've interviewed most of them on this podcast I think there are so many researchers but I will say that in the beginning I attended a course in Harvard with Jennifer Lerner who is the expert on emotion and many other things but she teaches she teaches people how to be a decision architect and that was probably the first one that I had attended and and that was before I did the the master's in in the LSE so I sort of attribute that to the start and her influence there. And in fairness, she's been a supporter ever since, and she now sits on our board in Gabs as well. But I would also, I would pick out the LSE faculty, all of them, you know, I won't embarrass them by naming them, but they know who they are. And I think that group in particular, I think between them, they sparked or they created the spark, if you like. And that sort of, there are so many in this field, as you know, and they're all different. they're all really talented and they're experts in a particular area, which is probably why I've got 500, you know, references in the book. There's so many of them to pick out. But I think, whereas those individuals might have created this market, COVID provided the opportunity, believe it or not. So the timing and the chance to actually do it after so many years of being in the field and actually then applying the theory to people's reality.

  • Speaker #0

    Fantastic. So, Nuala, as I mentioned earlier, your recent book, Tune In, How to Make Smarter Decisions in a Noisy World, was just published. Before we discuss the content of it, can you tell us more about the inspiration behind writing it? How did the idea for the book come to you?

  • Speaker #1

    Sure. Well, Suzanne, I actually always wanted to write a book. I just wasn't sure what kind of book I was ever going to write. And people do say that there's a book in all of us. And I do think there is a book in all of us. We certainly all have enough stories and enough experience. education, if you like, to teach other people or lessons what we get right and what we get wrong. The output wasn't what I thought. Before I had done this master's, I was probably going to write a book along the lines of Mark McCormack's, what they didn't teach you in Harvard Business School, because that's the first one that I read when I started my career. And I always loved it and thought it was great. And I used it as a bit of a manual. So in my head, I always said, oh, I'll teach people the 50 tips and all the tricks and what you get wrong. So but when I did the master's, I learned something different. I guess it opened my mind to something different. So the output, the result is, of course, something that's completely rooted in science now, rather than me and my experience, which might have been the earlier book, if you like. So and then when I did that, when I saw all those excellent researchers and experts in this field, I guess I wanted to contribute in some way. So and that's why I thought. there are a lot of books on decision making. There are a lot of books on judgment. And that's why I suppose you ask, how did the idea come? That's why I deliberately went out of my way to try and find something that I thought was different. And I think this idea of deaf spots has not been covered elsewhere in the field. And I wanted to focus on something that wasn't just a list of biases or something that was just an academic reference. I wanted it to be really practical that people could, yes, it was rooted in science, but would be able to... use as a mnemonic and refer to it pretty easily when they were in these high stakes situations. So the idea, the actual final output, of course, it's never exactly as you start out. So the refinement. of this idea of being tuned in or tuned out came from that. And I, on all of the stories and the examples I have, I had to put a filter on at the very end, or at least halfway through to say, was this person tuned in or tuned out? Can I attribute this to deaf ear syndrome or a deaf spot, or they didn't listen, or they were motivated to mishear or hear something. And that was sort of my lens for, you know, keeping people in and taking them out. And that led me to this idea that tuning out, you know, is a hidden source of misinformation, but tuning in is a hidden source of opportunity. So it is quite, you know, binary, polarised in that way. And, you know, it came from that really. And then it was easy. It was easy to evidence why it mattered because, you know, these mistakes cost the average Fortune 500. company, 250 million a year, at least, I can give you a really long list. We could be here for an hour talking about the mistakes and the evidence as to why people's bad decisions are so critical, particularly when people hold power.

  • Speaker #0

    Absolutely. Yeah. So that title, Tuning In, makes so much sense. Can you tell us more about your research method? You talked a lot about all of your references, but also all of your experiences. Can you tell us how you combine those two?

  • Speaker #1

    Well, I didn't do RCTs and somebody once asked me, you know, is this all based on primary research? Well, it isn't primary research in that sense of RCTs. It is a blend of primary and secondary. Primary is my own research, but it's deliberately a spectrum. There's a heavy reliance on practical case studies, as in real life case studies, supplemented by all of these behavioural science experiments. And so that's why I do have an index that spans 25 pages, because it is, I think Eric said earlier on, it is very rich in terms of its content. So the conceptual framework is rooted in science. And so are all of the key concepts. So it was a blend. So I was more comfortable with that rather than going out, starting and making it based on one particular. experiment and then I exploded it into a theory. I wanted it to be a compendium of the best of other people's thinking as well, but for me to add a slightly different slant to it, if that makes sense.

  • Speaker #0

    Absolutely. That makes total sense. And I know you've given us a little bit of a preview of some of the important concepts, but before Eric asks you more details about all your great frameworks, can you tell our listeners, what is the one key takeaway? What would the headline be of your book?

  • Speaker #1

    Probably that what you hear is as important as what you see. And that we're more at risk than ever of tuning out the important voices and rushing to judgment. But if you at least remember the perimeter is traps and the need to tune in and to consciously think about who and what you're tuning into, I think you stand out in your field rather than lose out or miss out.

  • Speaker #2

    Okay, Nuala, now it's time to delve into the challenge of decision-making from an individual perspective. But before, I'd like us to discuss what you call, I think, the external context within which our decisions are made. And first, can you tell us why it's so tough to make good decisions in what, again, you call today's noisy world? What are the main factors that shape our judgment?

  • Speaker #1

    Sure. Well, and I think even just before we mention that, Eric, it's probably just useful to let people know that tuning in is the solution to the problem of tuning out. And it is a combination of two things that make us tune in. or tune out. One is the context, the external context, and the second thing is our cognition. So looking at the external environment, I point to four factors. Now, I'm sure there are more, but I think these four in particular are increasing our vulnerability to error. The first one is it's a speedy world. And I think we all know that we live in this fast-paced, frantic lifestyle, but that accelerates our speed of making decisions and our short-term our short-term thinking. And this whole ecosystem that we operate in today amplifies that probability of making fast choices, which aren't necessarily the best choices. The second one is data. And we know that there is excessive data, and we also know that it's overwhelming. Yet it is our new normal, and we're forced to make decisions in that noisy sort of context. So there's plenty of evidence to that. I mean, Microsoft find that, I think it's 68%. employees just don't even have enough uninterrupted time for them to do their jobs. There's a third layer of which I've introduced, which is quite different. And I don't think people particularly think about it. And that's the difference between, you know, the arrow words, what we see and what we hear. So I argue it's a very visual world, more so than before. And because it's visual, what we see dominates and skews our interpretation. So you only need to think about Instagram or first impressions. we see what we want and we don't consider enough what we hear. And, you know, one example of that can be stereotyping from, you know, hiring to refereeing matches or, you know, even awarding loans. So that's a problem when it comes to misjudgment. And the last one is the fact that, you know, binary thinking. We live in a polarised world. Half the world is going to vote this year and people are becoming more and more entrenched in their thinking. So we think in these polarised ways. And some of these binary classifications are just embedded in our systems and structures. I mean, if you're a marketeer, you segment your customers and it's very sensible, but we still do it. If you look at your colleagues, you know, in the BVA nudge unit, do you think of your colleagues as introverts or extroverts or high or low potential, you know, sporty or academic? But when we do that, as you well know, we narrow our perspective when we think in this way. So despite... the advances of the 21st century with data and all of these things, the odds are stacked against good judgment in this high-speed visual world. So when I work with my clients, as I'm sure you do, I find that these factors just diminish the time that the people have available to devote to tuning in and, you know, reinterpreting different conversations and not making this time, you know, is a mistake and is a judgment killer.

  • Speaker #2

    Yeah. There is this question, you mentioned this judgment. killers. And could you again explain for our listeners, I think it is three main judgment killers that you call blind spot, deaf spot and dumb spot.

  • Speaker #1

    Well, indeed, Eric, and I've combined them and I've called them the trilogy of error. Sometimes I've called them the trilogy of terror, if you get them wrong as well. And you can guess what they are. I mean, you know what the psychological blind spots are. they don't work in isolation on our judgment. And so you've heard of those, but people haven't really heard of death spots, but they do exist actually, because there's been a psychologist. that discovered these or certainly wrote about these in the 1960s. But it refers to the failure to interpret what people say. In the same way, blind spots is about what people see. And then, of course, dumb spots is about what people hear or don't hear. And this is when people typically don't speak up. Because if someone remains silent, by definition, you can't hear their voices. And again, the challenge is that we just don't think about this trilogy when we're under pressure. or in uncertainty or crisis. But I thought it was quite a nice way to pull together, you know, the fact that there is this trilogy out there. And even if people think in their own way that these exist, again, it's a simple way for people to think twice and try and at least pause and make more measured decisions when they are under pressure or in crisis or uncertainty. So the skill is to reinterpret what people hear.

  • Speaker #2

    Yes, we have been lucky to interview some weeks ago for the Human Advantage, but also for the podcast, Amy Edmondson on cycle digital safety, which is, I think, one key point which makes within an organization very difficult to make a smart decision.

  • Speaker #1

    absolutely and and and she she is the the leader in that field and she's absolutely right because when there is no psychological safety it yes there are dumb spots but actually if when there's no safety people tune out so people tune out of the voices that matter and they and they don't speak up so they develop this combination of a blind spot and and a death spot so it's a combination so sometimes they come together sometimes it might be one that dominates more than the other but together they are lethal and they are these judgment killers

  • Speaker #2

    You also highlight that interpretation is also a challenge because we can't trust what we hear. Can you tell us more about this big gap, which is classical in behavioral science, between what is said and what is heard? Generally in behavioral science it is between intent and behavior, here it is between said and heard.

  • Speaker #1

    Yes, and this could be a very long answer as well, Eric, because you can bring into this, you know, cultural factors, you know, what's heard versus what's said, language, euphemisms, tone of voice. So I was very conscious when I was writing this bit that it could be very long. So what I was keen to make sure that people didn't think of this book as a book about listening, because it's not about listening. It's about interpreting what you hear. It's a second order effect. So we all can listen. and here but we don't always choose to reinterpret we take things at face value we um jump to conclusions we misinterpret you know what we see we encode we don't decode so that's an important differentiation so this is about the difference between what you see what you sorry what's heard and and what's what's what's meant is also the extent to which you're willing to spend some bit of time decoding and it could be people's agendas it could be anything it could be you know people people's intent as you mentioned earlier on. But when you don't do it, you only have to think of, you know, people just take things at face value. I mean, you can look at genocide, you can look at misinterpreted military instructions because people didn't think about what they heard, they just did and obeyed. Or, you know, mishearing. I think there were some very powerful examples of people who genuinely just accidentally misheard, you know, air traffic controllers. Now that's not... I should pause and reinterpret because you might be in an emergency situation and you might not have time to do that. But in many cases, there is time. So you might not diagnose an unusual illness. You just have jumped to conclusions and rushed to judgment because we live in this. noisy, fast-paced, data-filmed, overwhelming world, and we just don't have time to slow down. I mean, aren't we all still catching up with yesterday and the day before? People can barely keep up with yesterday, and it's just this cycle that keeps going. And that's why I outline a number of different solutions to boost interpretation, which are all in the third part of the book.

  • Speaker #0

    Excellent. So we've covered the external challenges for cognition. Now let's talk about the internal. You mentioned in your book that there are structural traps that lead us to making poor decisions, of course. And you mentioned the great acronym that you've come up to summarize these traps, which is perimeters. And for our listeners, I'm going to walk you through that acronym real quick. So each letter is a specific trap or bias. P of perimeters is for power. E is for ego. R is risk. I is identity. M is memory. E, ethics. T, time. E, emotion. R, relationships. And S is stories. So before we get into the details of some of these traps, Nuala, could you give us an overview around the major levels that you describe, which are individual biases, organizational traps, and those related to society?

  • Speaker #1

    Yes. And to be honest, really the point there is that this can affect people at every level. It can be individual, organizational or societal. So an individual bias is probably fairly obvious. It relates to the ones for which you are responsible for. And it could be ego, it could be any of these, but ego is one that is particularly referred to an individual. So an organization bias affects the collective. So you talked about Amy Edmondson, it could be a conservative or a risk-taking culture. So, you know, risk, but risk can also be an individual bias and it can be an organizational bias based on the culture. So and then from a societal perspective, the traps can be cause and effect. So stories is one, for example. So a legend and a folklore of a particular country and, you know, the Loch Ness Monster or crying statues, whether it's true or not, people believe them. And this affects your judgment. And then the traps can hit all three levels. So, for example, if you take following the crowd. You know, that's an individual, does it? An organisation certainly does it. And societies and different groups also do it as well. So the point is that. And I deliberately interspersed in the book examples that hit all three, so that people wouldn't just say or wouldn't just think this only applies to an individual's decision making. It can be, you know, a national, well, national leaders making decisions for countries or organisation leaders. And then it's the collective because the crowd decides as well, in many respects, individuals in the crowd, but the crowd as a collective, as you know.

  • Speaker #0

    For sure. And so why do you think each of these traps is dangerous for making smart decisions?

  • Speaker #1

    Because the evidence is there. I mean, individually, they're dangerous, but collectively, they're lethal because we repeat them. And I think the evidence speaks for itself, whether you're talking about scams, whether you're talking about scandals, missed signals, whether it's 9-11, human error. There are so many examples, fines, misconduct, miscarriages of justice, because the problem is we don't notice or reinterpret what we hear. And so you have all of these examples. and as we've said it's a combination of the environment and us they're so hidden we don't notice them we're so biased we don't think we make them we think we're great decision makers we don't even question whether making mistakes yet as I said history repeats itself we still get scammed businesses fail you know ego traps predominate all the time and we miss these signals and tune out so by definition it's always going to be a problem for people in their decision making

  • Speaker #0

    And then for each of those traps in the book, you explain the specific mechanics that can lead us to making those poor decisions. So one example for our listeners, the first trap, you describe the power-based traps, and you mentioned six different biases that come into play, which include the authority bias, halo effect, champion bias, the contrast effect, and the just world hypothesis. Can you explain some of those biases and illustrate them for our listeners with some real-life examples?

  • Speaker #1

    Yes, and each of them do have quite a few. I was never short of an example, Suzanne. So is that a good thing or a bad thing? It was probably a good thing if you're a writer, it's a bad thing for reality. So even if I just take narrow focus. So narrow focus occurs when we're too goal-oriented and we're in search of power. So I should say that power-related traps are all, it doesn't mean you have power or haven't got power. These are traps that occur when you're chasing power, afraid of losing power. They're power-related. rather than being about I have power or I don't have power. Anyone at any level can have a narrow focus when they're too goal-oriented in search of power or trying to protect the power that they already have. And they make dangerously damaging decisions. So even if you look back at, I don't know if you remember in 2016, Wells Fargo, when they knowingly fabricated these millions of fake customer accounts. The leaders were goal-oriented, had a narrow focus in search of industry power, and they instructed their teams to achieve eight accounts. per customer. And the CEO, when he was interviewed by the Senate Banking Committee, you know, it just explained that eight rhymes with great and compounded by ego. He blamed his employees, not that toxic culture that we were talking about earlier. So leadership in that situation turned a deaf ear to these stress sales teams. And, you know, at the end of the day, it cost the bank three billion and a CEO's career because people accept the story that they want to hear and they turn. a deaf ear to the rest. Another one which is pretty obvious is maybe authority bias when we obey the person in charge and don't speak up enough. So lots of examples there. You could take Theranos with Elizabeth Holmes, where people knew that the Edison finger prick technology was based on a lie and the majority stayed silent. Some did speak up and that's how it came to light, but the majority for too long didn't speak up. And then you have maybe the just world hypothesis. Which happens when we trust that good people get rewarded and evil punished. And when you do that, you typically lose power. So it doesn't bear out in reality because we often over trust the system. And an example there might be the wrongfully convicted or the British post office example where employees trust that justice would serve them. But yet it didn't because they believed in a just world. And when you do that, you are naive. Your view is slightly narrow. You might also see it in M&A where the hardworking employee thinks that they're going to get noticed or rewarded or promoted. But, you know, they naively overlook and don't decode the messages in that situation. So they trust what they hear and they don't reinterpret sentences, statements, promises, you know, for politics. So we know that thinking is hard. And I think as Carl Jung says, that's why most people judge. So these occur, these occur all the time. But the power related ones, I think, are fascinating because we all seek power to some extent. And again, it depends on your definition of power. But, you know, you could be a parent and seek power in the house or over the children, or a professor or a politician or a president. So this one is really pervasive.

  • Speaker #0

    And unfortunately, we don't have time to go into each of the 10 traps and perimeters, but we definitely encourage our listeners to read your book to learn about all of them in detail. But Nuala, according to you, which one do you consider the most dangerous and most common for creating decision misinformation? Well, I might disappoint you, Suzanne, because I actually don't consider one the most dangerous. They're all common. They're all common. They're all dangerous. They're also also interrelated. So if you take the risk based traps, which, you know, they are a function of ego and emotion and time. We mentioned power a minute ago. So power based traps are a function of ego. You see the crowd in there, the crowd effect. You see ethics in there. So you could take any scandal and analyse it in terms of the different weightings of these traps, if you like, whether it's the Ukraine war or 9-11, pick anything, and you can absolutely translate it in terms of these. But the good news is, of course, is that opportunity exists once you know about these, you at least have half a chance of mitigating error and trying to prevent some of these. So it's not all doom and gloom. There is hope if you choose to spend time. and rethink about what you're doing.

  • Speaker #1

    And read the book to find out about all 10.

  • Speaker #0

    Well.

  • Speaker #2

    Yes, Nuala, we have understood. And maybe it's unfortunate, I don't know, why it is so difficult to make a smart decision, good decision because of external factor, because our internal cognition and all this bias, which are all... important and dangerous, but hopefully you have some solution. And now it's time to talk about your solution for making good decisions. So we understand that it is a really strong challenge, and our psychology, in addition to the external context, does not help us with many of the traps you have summarized in these fantastic acronym perimeters. But you suggest, you propose, you offer a roadmap and very, I think, concrete solution to become what you call a decision ninja. So before discussing this concrete solution, could you define what you mean by a decision ninja? Is it a superhuman?

  • Speaker #0

    Well, I think it's a superpower. I don't know about a superhuman, but it's someone who uses their superpower. It's very simply, Eric, somebody who's tuning in to the voices that really matter so that they stand out, as I said, rather than lose out. So it is someone who's intentional about interpretation. someone who consciously notices what's said and what's not said. So it is a little bit aspirational, but it does encourage people to be more thoughtful and more responsible for the decisions they make. And lots of people do it. So if you're a journalist or an investigator cracking cases, people do it. So yes, people get it wrong, but a lot of people also get it right.

  • Speaker #2

    Do you have in mind an example in real life of someone who is, from your perspective, a decision ninja?

  • Speaker #0

    I don't have one in particular, but in the book, I decided in every chapter, I made absolutely sure that I included at least one at the end of every chapter so that people could learn from other people. So people who are facing these perimeter traps, what did they do and how did they do it? So you've got a lot of examples of people who get it right in there as well.

  • Speaker #2

    Okay, thanks. So first, your first solution, I would say, it's about adopting a specific mindset that you call, again, and congrats for this, AAA, I don't know if it is the right pronunciation, mindset for outcome anticipation, outcome attitude, and outcome acceptance. Could you explain each of these three factors that we need to control to make better decisions?

  • Speaker #0

    Yeah, I've called it, I actually called it AAA, Eric, because it was a bit of a pun on my investment background. So that was the three A's. As you can see, I'm clearly trying to get people to remember some of this stuff by making it easy. Another mnemonic for people, but AAA was easy. So all it really says is that you control your own mindset, that if you go into your decision making, thinking about the fact that you can. you can't control everything, but you can control your anticipation, you know, your attitude to what happens and you can control whether you accept it or not. So you minimize the liability, if you like, when you do control these. So, and becoming this decision ninja relies on, you know, this being intentional, et cetera. So it's really just putting people into the right frame of mind. If you decide, woe is me and I have no control here. you're not really going to get very far anyway. So really, I was just trying to get people into a more positive mindset before they make this decision, before they adopt any of the approaches. So that's the suggestion with the triple A mindset, just an attitude of, you know, acceptance and anticipation. So predict what's going to happen, control your own attitude. And then when it happens, it happens and deal with the consequences.

  • Speaker #2

    But it's not only a question of state of mind or mindset. You have also, and you suggest also, what you call the sonic strategy. Again, I need to help your readers to remember each of the five key actions. S for slow down, O for organize your attention, N for navigate novel perspective, I for interrupt mindset, And C for calibrate situation, stranger and strategy. Could you briefly summarize each of these five key objectives?

  • Speaker #0

    Yeah, and they are very, very brief, Eric. And you would do it in a process. So I deliberately chose them. I mean, Sonic was deliberate because it plays on death spots, obviously. But you do it in a process. So first of all, you need to just slow down to enable yourself to even challenge your thinking. If you don't do that. you're kind of not even at the races. So if you don't organise your attention, so in this frantic, noisy world, I'm suggesting you do that so that you can actually be in the best position to make a decision. And there are three or four different, I should say there are three or four different strategies for each of these, all science-based, all should be familiar to people. Then thirdly, you navigate these novel perspectives so that you don't rush to judgment and you at least take into consideration other people's views. interrupting mindsets. I did quite like this one because you're mentally putting on the brakes so that you again avoid that rush to judgment or the assumption of validity. So you interrupt your own mindset and you interrupt other people's as well. And again, two or three techniques you can use for that. And then the last one is, you know, kind of recalibrating situations and strangers and strategies is the last stop to getting it right. And that's using checklists and implementation intentions. And you know, techniques that would be certainly in the behavioural science field that people would be familiar with. But again, really, really easy. And people have the choice to try some and, you know, use these as befits the particular decision scenario.

  • Speaker #2

    Yeah, it's funny because you mentioned, I think, friction and sludge, and the value of putting friction into our decision-making process.

  • Speaker #0

    That's exactly right. The SONIC strategies are based on decision friction. And a lot of people don't like friction, as we know. But if you deliberately slow down, it's like a speed bump for the mind: you slow down in the initial stages so that you can at least make a better decision. If you keep hurtling along, you might make a great decision; I mean, some people make great decisions at speed, but you're more likely to make a mistake than if you at least reconsider it. So you're absolutely right, I'm using decision friction in a positive sense rather than the negative sense.

  • Speaker #2

    Yeah. It's funny because some months ago we interviewed Cass Sunstein on one of his new books, though Cass tends to publish a new book every four months, which is incredible. That one was about sludge and, in his case, the danger of sludge. What is really interesting, Nuala, is that for each of these objectives it's not only the SONIC strategies that you recommend: you also recommend very specific tools or processes for successfully implementing the SONIC judgment strategy. Could you share one, two, maybe three examples of the tools or processes you suggest for our listeners?

  • Speaker #0

    Yes, and again, they're specific to the situation, and it depends whether you're on the S, O, N, I, or C. So I'll give you an S and a C, the beginning and the end. People like different ones, but I like the five whys, which is based on a 1970s idea from Toyota, the car manufacturer. They developed a very simple approach to solving problems that's also now used in Six Sigma, and I've merely adapted it for decision making. Toyota assumes that most technical problems are based on a human problem, and basically they ask themselves why five times to ascertain the underlying cause. I think this is useful for probing false reasoning or checking your assumptions, and it mitigates against biases like probability neglect, loss aversion or commitment escalation. Any combination is possible here: why this? why that? why the other? It's the same as asking, and then, and then, and then. It's easy to remember, and you might do it when you're feeling uncertain about a particular decision: what if I do that? And so what? And so what? And so what? When you do that, it de-escalates the stress you're under when you're actually making the decision. But in this case it also introduces decision friction and it slows you down, because if you ask, you have to wait till you get to five; you should be slowed down by three or four, at least, by the time you've thought of the answers. So it's a way to slow you down. Another one, one of my favorites, is at the end; this is a way to get people to make sure they actually follow through. Initially, if you want to be a decision ninja, you need to at least commit to making an effort in this space. This uses implementation intentions to overcome low willpower when people are deciding fast. So if you're a really fast decision maker and you've decided, I want to get this right and this really matters, then it's that if-then plan that addresses the behaviour-intention gap. We know this works for everything from appointment keeping to voting; we know that pre-planning and commitment work. So if you pre-commit to being a better decision maker, it helps.

  • Speaker #2

    Yeah, I remember that, with Suzanne, we interviewed Todd Rogers just one month ago. I think he ran a wonderful experiment to demonstrate the power of an implementation plan to encourage action, not just intention. In his case, it was about voting.

  • Speaker #0

    Yes, absolutely right, and the studies they have done are fantastic. And again, this is just adapting the same principle. So if you commit to using the PERIMETERS checklist, and what I've done is put all the biases together under that PERIMETERS checklist, which is available to people, or you make a decision rule that says, I'm always going to distrust the information but verify it, a kind of play on Ronald Reagan's trust-but-verify idea, then it becomes a habit. And we all know about the power of habits. And then you have increased your probability of reflecting more and reducing error.

  • Speaker #2

    Now, Nuala, we have delved into the details, and again, as Suzanne mentioned before, we do recommend that our listeners read your book because there are so many concrete recommendations to help us make better decisions. But could you help us take a step back now and, in a nutshell, summarize for our listeners your main advice for making smarter decisions?

  • Speaker #0

    I'll go back to the one that I mentioned earlier on, Eric, and that is tuning out. Tuning out is a hidden source of misinformation, but tuning in is a hidden source of opportunity. And there's a fantastic Cherokee proverb that summarizes that: if you listen to the whispers, you won't have to hear the screams. That's the same as saying diagnose the information before you prescribe. And I think when people tune in to others, not only do they hear the right messages and hear what other people don't, giving them an advantage, but other people will tune in to them. And we do know that because you relate better, you'll understand other people more, and by definition you win. So tune in to win; tune out, lose out. And I think if people remember that, that's pretty easy.

  • Speaker #2

    Okay. Before my last question, an additional one. I know that you work with organization leaders, CEOs and so on, which is what we are also fortunate to do at BVA Nudge Consulting. Do you have a specific recommendation to help leaders, organization leaders, make smarter decisions?

  • Speaker #0

    I think it's in the book, Eric. I think it's remembering the PERIMETERS traps, whichever ones they choose, because there are so many there. And again, it's too easy to just say, listen better or tune in differently. But slow down and strategically reconsider the voices that you listen to and the voices that you don't. We know that there's a problem of unheard voices. We know that there's a polarised society. We know that most people feel unheard, whether you're an employee, a citizen or a customer; people are feeling unheard now, way more than ever before. As a decision maker, as a leader, when you choose to listen to voices differently, whether it's ego, conscience, the voice of comfort or familiarity, and reconsider the voices you listen to, you will make better decisions.

  • Speaker #2

    Now my final question for you, as we are at the end of this really insightful conversation. You have conducted a brilliant analysis of the external and internal causes that explain why it's so difficult to make good decisions, and you also propose very relevant solutions. But my question, in the end, is this: aren't you suggesting something very challenging, namely that humans become System 2 decision makers when we are intrinsically System 1 beings? How do you suggest we balance our imperfect human side with our best side?

  • Speaker #0

    It goes back to the commitment, Eric. If you hold a position of power, it is your moral duty to get it right. When you are in a position of power impacting other people's lives, the onus is on you, the responsibility is on you, to do it. And that requires a commitment to at least trying to make better decisions. Now, most of the time you might get them right, but the consequences are too high when you get them wrong. And I think that's what it is: look at the consequences when people get it wrong. That's the problem. Do you want to be on the right side of history or the wrong side of history? I think misjudgment is not entirely your fault in this noisy world; as you've just said, we've got our internal problems, internal misinformation, external disinformation. But judgment is a choice, and you choose to get this right. You choose to be a decision ninja or you don't. When you hold positions of power, other people's welfare, lives, livelihoods and happiness depend on you and your choices. I think it's incumbent on you to do it.

  • Speaker #1

    Yes. Thank you so much again, Nuala. This was a fantastic conversation and we're so happy that you joined us today. Is there anything you would like to leave our audience with, such as where they can find out more about you and your work?

  • Speaker #0

    Sure. You can obviously get the book, which is available on Amazon and all leading outlets. In terms of the actual work, I have over a hundred articles, plus a lot more information on different speaking events and things I've done, on the website. And that is www.nualawalsh.com, so N-U-A-L-A-W-A-L-S-H dot com.

  • Speaker #2

    Thanks a lot, Nuala. It was a wonderful conversation; you are a very engaging speaker. And again, Suzanne and I do recommend reading your book, because it is 350 pages of concrete and helpful recommendations. Thanks a lot.

  • Speaker #0

    Well, thank you both very much. It's been lots of fun. Be Good, a podcast by the BVA Nudge Unit.

Chapters

  • Introduction and Career

    01:02

  • Inspiration for the New Book, Tune In

    06:20

  • Unpacking Research Techniques

    09:48

  • Four Key Factors that Affect our Judgement

    11:45

  • Understanding the Trilogy of Error

    15:21

  • Structural Traps in Human Decision Making

    20:32

  • How to Tune into the Voices that Matter

    29:19
