221. Aristotle Project - Psychological Safety

August 31, 2021

Safety in an office environment might not be something most people think about. Unlike working in a job that requires manual labor, dangers in an office are less visible. That does not make them any less serious. In today’s episode, we discuss psychological safety and what this entails. There is a range of definitions for the concept, but it should be underpinned by mutual respect and acceptance. We discuss what happens when this is lacking in the workplace and then also delve into what you can do to foster it. Being open, vulnerable, and willing to show that you do not know everything goes a long way. Tune in to hear it all!

Key Points From This Episode:

  • What psychological safety means for everyone.
  • Some of the dangers that people who work in offices face.
  • When you feel unsafe sharing your thoughts, you are wasting a lot of mental energy.
  • Questions to answer to know whether you have psychological safety in your workplace.
  • If it is difficult to ask other people in the organization for help, it hinders creativity and productivity.
  • How we can foster psychological safety in our workplaces.


If you are a software developer or technology leader looking to stay on top of the latest news in the software development world, or just want to learn actionable tactics to improve your day-to-day job performance, this podcast is for you.

Apple Podcasts | Spotify

Transcript for Episode 221. Psychological Safety

[0:00:01.8] MN: Hello, and welcome to The Rabbit Hole, the definitive developers podcast, living large in New York. I’m your host, Michael Nunez. Our producer today –

[0:00:09.0] WJ: William Jeffries.

[0:00:10.7] MN: And our intrepid guest.

[0:00:12.3] SC: Sophie Creutz.

[0:00:13.5] MN: Today, we’re talking about the Aristotle Project and specifically psychological safety. I will say that the intro is a little different. Dave is out at the moment enjoying a lengthy vacation before he gets married. Which, congratulations to him. Shout out, some claps to him. Today, we’re talking about psychological safety. Sophie had the idea of pulling up this article about the Aristotle Project, which I had never heard of before. We’ve done an episode on psychological safety, Episode 31 if you want to go back and listen to that, but we were babies at the time. What did we know about psychological safety? Now we know a whole lot more.

Sophie, tell us how you learned about the Aristotle Project, how did it come to you and whatnot.

[0:00:59.7] SC: Yeah. It’s funny, it’s something that I think we’re all going to be learning more and more about as we go on. I don’t think there’s necessarily any point where you can say, “I’ve learned enough.” I think this is an evolving process. But yeah, I think I heard about it initially, maybe when I was going to Fullstack Academy, and they spoke about psychological safety as being the most important factor in determining how effective a software team is. I thought that was pretty compelling, and I was glad that they had actually done that research to prove that that was the case. But yeah, maybe we should talk about what we think that means and what Google thinks psychological safety means.

[0:01:45.5] MN: William, do you have any thoughts on what psychological safety means to you?

[0:01:49.0] WJ: I think in the manufacturing industry, there was kind of a revolution around safety as a first-class citizen. The idea that making a workplace more physically safe for workers made everybody more productive, made output higher, it was really good for profits. Naturally, it would be wonderful if we could take that same advice and apply it to software development, but people sort of overlook it because there are no real present physical dangers to anybody who’s working in an office. I mean, okay, sure, there are a couple, but it’s not like you’re going to lose a limb. Yes.

[0:02:31.5] MN: Yeah. I mean, I remember walking into my college advisor’s office – the person who tries to figure out what you want to be when you grow up. The advice I gave was, “Yeah, I want to be behind a desk where all my limbs are safe,” which meant that I didn’t want to be a computer engineer anymore. I didn’t want to deal with the soldering iron where I could burn myself by accident and have to wear an apron to do the job. No, no, no. I want to be super safe in front of my computer where I’m punching keys. The thing that will kill me is arthritis. Arthritis is what’s going to get me, not me losing my wrist.

[0:03:05.3] WJ: It’s a painful way to die.

[0:03:08.5] MN: Arthritis. 

[0:03:09.4] WJ: But I think that there is real value in trying to make the workplace safe. People who work in an office have different kinds of dangers that make them less productive. I think they’re primarily social. There is real danger when you are excluded from the group or ostracized in some way. That triggers a really basic reaction in people and often makes them much more closed off. I think you can see how that would make a team less productive, the same way that not having proper guardrails in place, or having people out because of real workplace injury, makes teams less productive.

[0:03:41.9] SC: Right. But perhaps it’s more subtle in some ways, so harder to address.

[0:03:46.7] WJ: Yeah, absolutely. I mean, it’s very obvious when somebody gets their hair caught in a machine, that that is a scary and dangerous thing. It’s like less obvious when people aren’t talking in meetings.

[0:03:58.5] SC: When someone feels like – right, you’re in a meeting and someone feels like they can’t contribute, or they don’t feel like they can contribute in the way that they want, to bring up the elephants in the room, the paper tigers, and that kind of thing, because it isn’t psychologically safe to do so. There might be consequences, that kind of thing.

[0:04:17.8] MN: Or even another level. It’s like, suppose you did some work that has bugs in it, and then you’re outed because you introduced that into production and it was your fault that everyone had to wake up at 3:00 AM to fix those changes.

[0:04:33.3] SC: Well, I would argue that there’s rarely ever one single point of failure in that regard. You know what I mean? We work –

[0:04:41.4] WJ: It was Bobby’s fault. I mean, we all know.

[0:04:43.3] MN: Clearly, it was Bobby’s fault. That guy needs to write better tests. That’s what he needs to do. Think about edge cases. Man, get it together, Bob. That probably will not feel safe at all for Bobby. Bobby is going to think twelve steps ahead and be super anxious about the work that he picks up. He’s probably not going to push his limits, or be uncomfortable in parts of the codebase that he’s unfamiliar with, and Bob is going to be a mess. He’s just going to be miserable all the time.

[0:05:14.4] SC: Poor Bobby, he’s just going to be unmotivated.

[0:05:16.7] MN: Unmotivated.

[0:05:16.8] SC: He’s going to think, why should I put in the effort?

[0:05:20.2] WJ: And everybody who’s afraid of being like Bobby is going to stop taking risks.

[0:05:25.2] SC: Yeah. Don’t be like Bobby. That’s exactly it. He becomes like the scapegoat, he becomes the warning, et cetera, et cetera. That was a lot. 

[0:05:34.7] MN: We know that psychological safety is important. How do we – I mean, I have the article here pulled up, where it talks about how you foster psychological safety. I can go down the list – ask the question and we can agree or disagree as we go through them. The idea is that whether you strongly agree or disagree with these statements will determine whether your current workplace is psychologically safe for you to be a part of.

[0:06:06.8] SC: Yeah. Here’s one definition for psychological safety. A team climate characterized by interpersonal trust and mutual respect.

[0:06:17.0] WJ: There’s another definition I have heard or a definition of a psychologically unsafe environment. It’s an environment where everyone has to think a lot before talking. 

[0:06:28.5] SC: Oh! Mental overhead. Additional mental overhead, yes.

[0:06:31.8] WJ: If you’re spending a lot of time worrying about how what you say will be interpreted, then probably you feel unsafe.

[0:06:40.5] SC: Like I said, that’s energy. That’s energy you’re putting into mental overhead, into steps of translation just to communicate with the other members of your team. Whereas, that energy could go many other places, could go into actually writing code, actually delivering solutions.

[0:06:57.9] MN: Right. You feel safe sharing your thoughts without much of that mental overhead, because everyone is in mutual agreement and respect with each other, so you can kind of look past those things.

[0:07:12.9] SC: Right. So then you can get to problem-solving and solutioning sooner, which, I’m sure all of us at Stride – it seems like we really believe in the power of pair programming. I absolutely do. I think that to be an effective pair, you do have to have psychological safety, and what does that look like and how do you develop that? I think that’s a pretty essential thing.

[0:07:39.8] MN: Yeah, because like a lot of times if you have – if you’re not psychologically safe when you’re pairing with someone, you may think that, “Oh! Bobby’s out to get me or he’s going to critique my code as I’m typing.” Like in real-time, you’re being judged when that’s not what pair programming does or is supposed to do.

[0:07:58.0] SC: Right, it’s not about judgment, exactly. It’s not about judgment, it’s not about assessment, it’s about problem-solving together. Part of that problem-solving-together process is going to be communicating ideas. Maybe you disagree about an idea and so you’ll discuss it. You’ll say, “I think this code is working in this way.” Then someone says, “Well, but I think it’s working in this way.” That doesn’t mean that they’re threatening your psychological safety. It doesn’t mean that they’re negating you. It just means that you are coming to a mutual understanding. If psychological safety is not there, and that lack is standing in the way, then I think you’re going to have a harder time getting to that point of mutual understanding and continuing along that path of actually communicating technical concepts with each other in real time and solving these problems together.

[0:08:47.5] MN: How do you foster psychological safety? We got the definition. Thank you, Sophie. Using the documentation – I guess the results of this study – which should be in the show notes. We’ll make sure we add it. There is an organizational-behavioral scientist named Amy Edmondson. Edmondson asked team members how strongly they agree or disagree with these statements, and this will determine whether you have psychological safety in your workplace or not. I’ll start with the first one, and we should have a discussion, because the first one is already pretty massive.

Number one, if you make a mistake on this team, it is often held against you. Right? I think that anyone –

[0:09:29.7] SC: How about that typo? That typo there, Bobby. You don’t know how to type.

[0:09:33.6] MN: Yeah. Bobby, what’s wrong with you? Yeah. Get it together. Like, we should have refactored that. Why aren’t you thinking ahead and refactoring things?

[0:09:43.1] SC: Exactly that.

[0:09:43.5] MN: Right. Like the –

[0:09:44.7] SC: Wow! This commit message –

[0:09:47.4] MN: It sure sucks.

[0:09:48.5] SC: I cannot believe.

[0:09:52.1] MN: The idea that, like, I think you mentioned it before, there are multiple steps and systems in place where mistakes happen. It takes more than just one person to make a mistake when you introduce something into production, right? Because there are so many different steps that exist. But if someone is pointed at and outed as the person who made the mistake, then we know that one may not feel psychologically safe in that environment, because you don’t want to make mistakes.

[0:10:22.7] SC: Right. Exactly. If it’s evident that a scapegoat is needed if something goes wrong, we know we have an issue with psychological safety because we both know if a bug gets all the way into production, there are so many stopgaps along the way. There’s the person who wrote the code, there’s the person who reviewed the code, there’s the other members on the team, there’s the QA, there’s regression. There’s so many places.

[0:10:48.1] MN: Right. There’s demo, the physical act of showing the thing to your team and your stakeholders. That’s another checkpoint among all these different places. But even with those in place, there have been teams where it’s like, “Oh no! That was Bobby who introduced that bug,” as opposed to Lucy, or Daisy, or whoever else may have seen it and said something about that change.

[0:11:14.1] SC: Well, I think a healthy system would then examine how did our process fail here, how did this fail us. Also, even if someone did make a mistake – and we all make mistakes, and that’s part of how we learn, of course – we look at how we’re going to react to that, because I think the appropriate reaction would be something along the lines of, “Okay. What did we learn from this?” If that’s not the reaction, then you don’t qualify for number one here. I think number two is interesting also. Members of this team are able to bring up problems and tough issues.

[0:11:51.5] MN: Yeah. Because I would imagine that if you’re in a psychologically unsafe space, then you may not be able to bring up these problems, because I’ve heard phrases like, “That’s how we’ve always done things,” or, “There’s nothing wrong with the process, everything is fine.”

[0:12:09.9] SC: It’s a dangerous one. That’s how we’ve always done things. Because where else in the world would that be logically sound, right? Just because you’ve always done it that way, right? Like, we didn’t use to have modern medicine and now we do have modern medicine. But if I went to you, Bobby, and I said, “Look. We use leeches here. That’s how we’ve always done it.”

[0:12:33.7] MN: Leeches. If it ain’t broke, don’t fix it. Here’s a leech. Put it on your arm, you’ll be all right.

[0:12:38.5] SC: I mean, Bobby might not be able to say, “You know what, actually, I think perhaps just a Band-Aid would be perfect.”

[0:12:45.8] MN: Some gauze right on the arm.

[0:12:48.0] SC: Yeah, but you might not feel as if you can bring up your problem with the leeches. Then, progress is totally stalled.

[0:12:56.2] MN: Right. Because you’re stuck in that one process not thinking of other options or solutions. And even bringing that up is an issue, where like, that’s like the really big thing.

[0:13:09.0] SC: Perhaps, even if you are thinking of potential problems and solutions, you cannot articulate them. So over time, I would imagine that not being able to articulate these things ends up meaning that you do in fact stop thinking about potential solutions. I think that’s the step that gets really dangerous.

[0:13:27.3] MN: Watch out for that. If, number two, you’re unable to bring up tough problems, tough issues in your team, you definitely want to check that safety. Number three, people on the team sometimes reject others for being different. This could go – there are all different walks of life for software engineers, and I think that, you know, a lot of the time, a psychologically safe environment will be one where there are disagreements across people for different reasons. Like, “Oh! I disagree with implementing a factory pattern in this particular module” versus “Oh no! I’m smarter than you for obvious reasons. Come on!” That’s a completely different kind of disagreement and rejection, if you will. Being rejected for being different is, I can imagine, quite harsh on that individual, who may be deemed the “different” one.

[0:14:25.6] SC: Well, this reminds me of a quote and I’m paraphrasing a little bit here, but this is a quote from Bill Nye. He said that the more diverse a system is, the better it can respond to changes.

[0:14:40.5] MN: Yes, I agree with that, because the more diverse it is, the more options are available. Yeah, the points of view are different. Shout out to Bill Nye, the Science Guy, friend of the show.

[0:14:52.4] SC: Shout out to Bill Nye. Go listen to his podcast. Here’s a plug for Bill Nye. He’s got a podcast called Science Rules.

[0:15:00.2] MN: Yes, and it does for sure. I think, yeah, there is a difference between rejecting – I mean, reject is a hard word too, right? Disagreement is probably the thing that I’m thinking about, versus rejection of something. Rejecting others for being different is quite mean, so don’t do that. It’s no good.

[0:15:21.7] SC: Don’t do it. Yeah, because everyone’s different, so just remember that. A homogenous culture is not a healthy culture, as that quote would imply. If you don’t have diversity, you have a culture that’s homogenous, and therefore you don’t have the flexibility that you need, especially in a changing world like we have in 2021.

[0:15:46.0] MN: Right. That also brings up something in terms of hiring. I’ve been in the hiring part of the space for a while. The term culture fit is often used for a person that we’re hiring. I forget where I picked up the verbiage, but the idea is to not use the term culture fit, but culture add. We’re adding to our culture by bringing this person onto the team, versus this person fits our current culture, so they should be part of it. Adding is important, so you definitely want to do more adding.

[0:16:20.4] SC: I love that. I love changing the language in that way, that’s so well considered.

[0:16:25.4] MN: Right. Don’t reject other people for being different. That’s number three. That, you shouldn’t do that. 

[0:16:30.8] SC: The world is free. Yeah. We got a few more here that we can go through. It’s safe to take a risk on this team. It is difficult to ask the members of this team for help. No one on this team would deliberately act in a way that undermines my efforts. Working with members of this team, my unique skills and talents are valued and utilized.

[0:16:50.5] MN: Yeah. I mean, definitely, it is safe to take risks on the team, very similar to number one, right? Like if you can’t take risks, then when mistakes happen, you’re to blame.

[0:17:00.4] SC: Right. I think this ties back into that idea. If you can’t bring things up, it actually starts to kill the vibrancy of independent thought and creative problem-solving.

[0:17:13.5] MN: And asking for help. Pair programming kind of allows you to do that in real time, because you’re pairing with someone. When it’s difficult to ask other team members for help, it can be a blocker to a lot of things, because it’s like, “Oh! I have my work that I need to do, and if I don’t finish it, I’ll get in trouble, and I’ll be outed if I don’t spend all my energy on this. Because if I introduce a bug, then I’ll be called out for it.” Not being able to help other individuals makes things very, very difficult too.

[0:17:43.4] SC: And it creates silos if people don’t ask each other for help. 

[0:17:47.6] MN: No one on this team would deliberately act in a way that undermines my efforts. I guess, yeah, I would feel psychologically unsafe if I had introduced something and then someone went right into my code, the very next commit, and changed everything. It’s like, “Oh wow! Okay. You’re definitely taking the work that I’ve done.”

[0:18:07.6] SC: Yeah, antithesis of teamwork right there. It’s just the exact opposite.

[0:18:14.4] MN: You definitely want to work at a place where your skills are valued. Where people appreciate you coming into work and bringing in the good work that you do.

[0:18:28.1] SC: Think about this too. If I wanted to actively undermine you, Bobby, that takes effort itself. I would have to plan. I’d get up in the morning and have to think, “All right. Well, I’ve got to do my job, and I have to undermine Bobby and his job.” That’s a whole other task, yeah.

[0:18:46.1] MN: Bobby always introduces bugs into the code, so I have to go and ensure that he doesn’t do that, so let me look at his PR. Oh my God! That’s gross. Let me start making some changes that I feel are better than his. Just steamrolling all of the code just to have it my way.

[0:19:06.7] SC: But a more effective solution might be just to work on pair programming with Bobby and that way, we will have shared knowledge, we will have shared understanding, we will have mutual respect, ultimately, ideally. 

[0:19:20.0] MN: In the end, there’s a TED Talk I guess that goes through some of the questions that we had as well. They talk about three things that one may be able to do to foster psychological safety in your team. The first one being, frame the work as a learning problem, not an execution problem. 

[0:19:38.5] SC: Yes, growth mindset, I think is what this is speaking to.

[0:19:42.8] MN: Right. The idea that, like, hey, we’re all going to learn something new today. It’s not, you need to go and execute this problem swiftly. It’s, “Hey! We have this issue, how do we fix it, and let’s all learn from it.”

[0:19:55.0] SC: Right. I would say, it’s actually more impressive. This is a little bit of an opinion. This is a little bit of an [inaudible 0:20:00.3]. But I think that it is more impressive if someone can demonstrate their ability to learn something new rather than demonstrating something they already know.

[0:20:10.1] MN: Right. Yeah, because like you’re able to show, “Hey! I am learning something new and this is the way that I go about the learning process.” 

[0:20:20.0] SC: How I learn. I’ve demonstrated I can learn, so clearly I’ve demonstrated that I have a good thought process and that, if I don’t know something, I can acquire that knowledge.

[0:20:28.8] MN: Number two is acknowledging your own fallibility. 

[0:20:32.2] SC: Fallibility, yes. I love this one. I love it. I make mistakes.

[0:20:38.1] MN: Yeah. I mean, but this one is a little difficult. I guess it goes in tandem with the first, because one would have to admit their own fallibility, and the team needs to make sure that, because I’m owning up to it, people can’t just point at me for it. There have to be changes across the entire team around fallibility for us to be able to move forward, psychologically safely, I guess, is the word.

[0:21:05.4] SC: Because if I’m the only one that’s fallible, that’s a problem. Everyone has to demonstrate their fallibility and how do we do that? Do we do it all together? Do we do it in sequence? Does the team leader do it first? Because they’re in a position of authority and power and therefore, once they’ve done it, everyone else in the team can also do it. I’d be curious to see what exactly the process is here for integrating fallibility into a team and demonstrating that.

[0:21:38.5] MN: I would think that it would be wise – like, everyone can start doing it, but people definitely move forward when team leads, or people with respect, do it too. Because they’re probably the people who are enforcing the psychologically unsafe, like, environment. That actually is not [inaudible 0:21:58.9]. Whoever is deemed the culprit, I guess, of a lot of the things we discussed before. If they start to realize, “Hey! It’s all right. I failed in introducing this new feature. We’re all going to learn.”

[0:22:12.1] SC: Lead by example.

[0:22:13.9] MN: There you go. It would definitely help. The last one is, model curiosity and ask lots of questions. This one is also I think difficult, because people may think that asking a lot of questions is like, you are –

[0:22:31.2] SC: Demonstrative of your lack of knowledge.

[0:22:33.8] MN: Yes, exactly.

[0:22:35.3] SC: But in fact, it’s not at all.

[0:22:37.0] MN: Right. It could also come off as like, “Oh! You don’t trust me. Why are you asking so many questions?” It’s like the idea too.

[0:22:43.2] SC: Yeah. If you’re asking questions to someone and they don’t feel psychologically safe, they could interpret it in that way. That’s true.

[0:22:50.8] MN: I mean, I personally think that in fostering those three things, you have to do it very tactfully, because I feel like they can work against you, especially those last two. They could work against you in a psychologically unsafe environment. Everyone needs to be down to do those three things.

[0:23:13.3] SC: Everyone’s got to be on board. Everybody’s got to hop on that wagon. Perhaps especially leadership. If you do have a situation where it’s not a super flat hierarchy, then it would be really, really good for the people who do have a little bit more authority to model curiosity, to ask lots of questions, to show themselves making mistakes.

[0:23:38.4] MN: Yeah, and – stop rejecting people for being different. That should be the other one, number four. That’s what you can do to foster psychological safety. Everyone’s different, and you should accept people for who they are, and learn more about them and build that mutual respect amongst your peers.

[0:23:55.9] SC: Yes. Not only that, but in terms of acknowledging other people’s differences, like find their unique strengths. That’s how you get to the unicorn status, I’m pretty sure.

[0:24:06.3] MN: Yeah. I mean, I’m sure that there’s a unicorn project out there. We’ll talk about that sometime in the future. I think that with –

[0:24:13.8] SC: That will be up next.

[0:24:14.5] MN: This one, I think, is the largest bit of the Aristotle Project. The following four, I’ll just go through briefly, and we’ll probably talk more about them in upcoming episodes. We have dependability, we have structure and clarity, we have meaning.

[0:24:32.2] SC: My favorite.

[0:24:33.0] MN: And we have impact. We’ll definitely talk about and dive into more of those. Google has realized over time that it’s not just co-locating your team, or workload size, or seniority that makes a good team. It’s these five different attributes, and we spoke about psychological safety just now. I’m interested in finding more information about the other four.

[0:24:56.3] SC: More to come. Stay tuned.

[0:24:58.9] MN: Stay tuned.

[END OF EPISODE]

[0:25:01.3] MN: Follow us now on Twitter @radiofreerabbit so we can keep the conversation going. Like what you hear? Give us a five-star review and help developers just like you find their way into The Rabbit Hole. Never miss an episode. Subscribe now however you listen to your favorite podcast.

On behalf of our producer extraordinaire, William Jeffries, and my amazing co-host, Dave Anderson, and me, your host, Michael Nunez, thanks for listening to The Rabbit Hole.

[END]

Links and Resources:

Michael Nunez on LinkedIn

Michael Nunez on Twitter
David Anderson on LinkedIn

David Anderson on Twitter
William Jeffries on LinkedIn

William Jeffries on Twitter

The Rabbit Hole on Twitter

Sophie Creutz

Google's Project Aristotle

Amy C. Edmondson

Stride