Bring psychological safety into your test team

Computers may not have feelings, but humans do. Test managers often have to address team conflicts and other personal issues that get in the way of bug fixing.


November 3, 2020
Tamas Cser


There is an age-old adage: Don't shoot the messenger. Unfortunately, businesses have an ingrained habit of ignoring that advice, leaving employees afraid to speak up even when the stakes are considerable.

A recent, tragic example is the Boeing 737 MAX jet. A damning Congressional report earlier this year found that the pressure to keep the plane on schedule overrode designers' concerns about the jet's safety. That led to two crashes, 346 deaths, and hundreds of planes grounded.

Just as tragic were the two Space Shuttle disasters, in 1986 and 2003. In both cases, engineers saw the danger beforehand, but their warnings were muffled or overruled by managers under schedule pressure.

Those were hardware failures, but the same dynamic applies to the software testing world. The consequences are rarely as momentous or tragic, but the problem is the same: testers find defects yet are afraid to take them to management for fear of a shoot-the-messenger response.

This has given rise to a new way of thinking called psychological safety, pioneered by Dr. Amy Edmondson, the Novartis Professor of Leadership and Management at Harvard Business School and author of The Fearless Organization. Psychological safety, a practice championed by Google, is about creating a climate in which people are comfortable being and expressing themselves.

“Basically, psychological safety is about not being afraid to be punished,” says Gitte Klitgaard, an Agile training coach and lecturer on psychological safety in Stockholm, Sweden. “If you make a mistake, it means not hiding things under the table because you're afraid that you'll be fired.” 

In an atmosphere of psychological safety, says Morgan Ahlstrom, an Agile coach also based in Sweden, "in a more conscious way we can create a space where people can take difficult situations, discussions in the workplace, without feeling threatened." The people working on a project don't feel vulnerable to ridicule, or worry about getting fired, when they think differently or disagree with other people.

In work settings without psychological safety, people hide their mistakes. Mistakes fester, and then become even more expensive to fix. You also lose the opportunity to learn from mistakes when you don't share them with other people. 

This phenomenon is true in any business role. But it is particularly important to address psychological safety in software quality assurance and testing, because these are the people whose job it is to point out weaknesses and vulnerabilities.

Levels of psychological safety vary between industries, says Ahlstrom, but also within a company. At one client, for example, employees rated the company's psychological safety as high. Then he realized that everyone was describing their own team. When Ahlstrom instead asked how safe employees felt in the larger organization outside their team, the ratings fell several notches.

“They would feel very safe within the team with their closest workmates,” Ahlstrom says. “But whenever a problem happened, there started to be finger pointing and scapegoating between the teams.”

Where responsibility lies

Harvard professor of psychology David McClelland suggested in the 1960s that 50% to 75% of the variability in team climate traces to the manager's behavior. But psychological safety is all-encompassing: every team member, up and down the hierarchy, must be accountable for it.

Ahlstrom believes management can contribute to a psychologically safe or unsafe work environment. But he adds that safety is ultimately something to which we all contribute. 

“If you ask me a question that I find less intelligent and I started rolling my eyes, that would be a way of creating an unsafe space for you. Will you be more hesitant to ask me a question in the future?” says Ahlstrom. 

Lena Wiberg, engineering manager at Swedish tech firm Blocket and a conference speaker on psychological safety since 2017, also feels that responsibility for psychological safety lies with everyone, not just the managers. But managers are expected to lead. “If it's not clearly a priority from everyone on the top, it will be way harder to make sure it actually works all the way through the organization. This means leaders need to be vocal about it; they need to act when things happen; and they need to work on it from everything from recruiting, onboarding, daily work and long-term strategy,” she says. 

In a really toxic environment, Klitgaard says, expect it to take a long time to change – if that's even possible. You may need to go so far as to clean out some of the staff if they are resistant to change. "Even in this case, where we had two leaders who were really, really into this, it took over six months before we started really seeing results. And some of it didn't happen until a year later," she says.

The first step in addressing the issue is finding out how big a problem you have. Create an anonymous employee survey that asks about the level of safety people are experiencing, says Ahlstrom.
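One way to run such a survey draws on Edmondson's research, which uses a short Likert-scale questionnaire (her widely cited instrument has seven items, some of them reverse-scored). Below is a minimal sketch of how anonymous responses might be aggregated per team; the item names, scale, and scoring are illustrative assumptions, not a prescribed standard.

```python
# Sketch: aggregate anonymous psychological-safety survey responses per team.
# Item names and scoring are illustrative; Edmondson's actual instrument is a
# seven-item Likert scale in which some items are reverse-scored.
from statistics import mean

# Items where agreement indicates LESS safety, so the score must be flipped.
REVERSE_SCORED = {"mistakes_held_against_you", "hard_to_ask_for_help"}

def item_score(item: str, raw: int, scale_max: int = 7) -> int:
    """Flip reverse-scored items so a higher number always means safer."""
    return (scale_max + 1 - raw) if item in REVERSE_SCORED else raw

def team_safety_score(responses: list[dict[str, int]]) -> float:
    """Average score across all respondents and items for one team."""
    per_person = [
        mean(item_score(item, raw) for item, raw in response.items())
        for response in responses
    ]
    return mean(per_person)

# Example: three anonymous respondents on one team, 1 (strongly disagree)
# to 7 (strongly agree).
team_a = [
    {"mistakes_held_against_you": 2, "safe_to_take_risks": 6, "hard_to_ask_for_help": 3},
    {"mistakes_held_against_you": 3, "safe_to_take_risks": 5, "hard_to_ask_for_help": 2},
    {"mistakes_held_against_you": 1, "safe_to_take_risks": 7, "hard_to_ask_for_help": 1},
]
print(f"Team A safety score: {team_safety_score(team_a):.2f} / 7")
```

Comparing these scores per team, and again for the organization as a whole, surfaces exactly the gap Ahlstrom describes: teams that feel safe internally but not across team boundaries.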

Another good starting point is a visible signal from senior management that the topic matters. "This is something we care about setting aside time for, for education, for working on the topic. And then perhaps creating workshops for the teams where you talk about the importance," says Ahlstrom.

Safety in tech

In some ways, the tech community has an advantage: its people have been working in teams, in the true sense of the word, for quite some time. "So it might be less of a problem there than in other industries, where there have been more individual ways of working," Ahlstrom says.

In comparison, Ahlstrom says, “Traditional legal and finance departments will usually have quite low levels of psychological safety. Because you in a legal organization, in a law firm, you just don't make mistakes. If you do, you don't let anyone know about them.” 

However, Wiberg says IT definitely has an insecurity problem. "The more homogenous a group is, the worse I feel like it is," she says. "We are currently, and have been for many years now, seeing and hearing women, black people, LGBTQ people and others speak up about how they are treated. I fear we also have a lot of backlash from the 'norm group,' because when you are used to something you stop seeing it and being reminded about it can feel like a personal attack."

Psychological insecurity also clashes with the tech industry's dependence on constant innovation and trying new things. As Klitgaard points out, "If you don't have that safety, you're going to stick to what you know. So you're not going to try that new automation language, for instance, or you're not going to dive into that code."

Google helped bring psychological safety to Silicon Valley. Its Project Aristotle, launched in 2012, studied some 180 of the company's teams to find out what made the successful ones succeed. Google expected success to reflect a team's level of education or its diversity; what actually made the difference was psychological safety. Because Google started talking about these things, the rest of the industry is now picking up on it.

For QA teams, Klitgaard notes, most testers she knows do some kind of risk-based testing: identify the biggest problems likely to occur, how likely each one is, and what the damage might cost. Living with that much risk makes testers skittish, she says.
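That workflow is, at heart, a scoring exercise. Here is a minimal sketch of the common likelihood-times-impact model Klitgaard alludes to; the feature names and five-point scales are hypothetical, and real teams calibrate their own.

```python
# Sketch: risk-based test prioritization via likelihood x impact.
# Features and scores are hypothetical examples; calibrate scales per team.
from dataclasses import dataclass

@dataclass
class RiskItem:
    feature: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (cosmetic) .. 5 (costly outage or data loss)

    @property
    def risk_score(self) -> int:
        return self.likelihood * self.impact

items = [
    RiskItem("payment checkout", likelihood=3, impact=5),
    RiskItem("profile avatar upload", likelihood=4, impact=2),
    RiskItem("password reset", likelihood=2, impact=4),
]

# Spend test effort on the riskiest areas first.
for item in sorted(items, key=lambda i: i.risk_score, reverse=True):
    print(f"{item.risk_score:>3}  {item.feature}")
```

A ranking like this only works, of course, if testers feel safe putting the true likelihood and impact numbers on the table rather than softening them for management.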

QA people are historically seen as the gatekeepers, the people who criticize and point out weak points. "We have both an easier and a harder time. On one hand I am sure it requires a larger sense of psychological safety to dare to speak up, especially from juniors. On the other hand, it is so much a part of the role perhaps it is easier? It is expected of us, so we are not as harshly judged when we challenge things," Wiberg says.

Psychological safety is present when colleagues trust and respect each other and feel able, even obligated, to be candid, according to Professor Edmondson. Getting there requires leadership, honesty, and a dose of humility at every level of an organization. This is not a tech problem; it's a personnel problem. Fortunately, a healthy assortment of coaches is a Google search away, and not just in Sweden.

You need a leader who makes it safe to point out software vulnerabilities. Is that perhaps a chief quality officer? Our white paper explains the role.

 

by Andy Patrizio

Andy Patrizio has been covering Silicon Valley (and tech everywhere else) for more than 25 years, writing for a range of publications that includes InformationWeek, Network World, Dr. Dobb's Journal, and Ars Technica. He is currently based in Orange County, California.