From the QA trenches: 6 signs of project success or failure

Surely, the software testing staff can see indications of a QA project’s success or failure well before anyone else.

February 19, 2020

The software testing staff can be the proverbial canary in the development coal mine. They see indications of success or failure well before anyone else.

Among the many quotes attributed to the eminently quotable Sun Tzu is the expression, “The victorious army is victorious first and seeks battle later; the defeated army seeks battle first and seeks victory later.”

That applies equally well to the binary battlefield of code development. Project success or failure can come well in advance of shipping the software. 

And if anyone can anticipate the outcome, it may be the software testing staff. QA professionals see the code when it is at its most problematic. While programmers are busy adding features, the testers are taking the time to see what’s wrong with what already exists. 

You need a healthy amount of skepticism to keep a test team focused on the job at hand. Michael Bolton, who trains testers and runs DevelopSense, says that as a tester, it’s his job to anticipate failure. 

“From my perspective as a tester and someone who teaches people how to test software, I observe two types of mindsets,” says Bolton. “The first is the builder mindset, which dominates development. A builder says, ‘I will make things that solve your problems.’ As a tester, I have a different mindset: ‘I will find trouble wherever I look and whenever I look.’”

It’s the tester’s role to be professionally uncertain when everyone around them is sure things will turn out okay. “It’s our job as testers to anticipate the possibility of failure,” says Bolton. “There’s an important psychological reason for that. If you look for success, that’s what you’ll find. It’s the tester’s awkward job to take the designer’s beautiful baby and say, ‘Have you noticed one of the eyes isn’t lining up?’” 

So what are the signs of success or failure? How can the testing department predict smooth sailing ahead or identify problems that ought to be resolved? Test professionals suggest you look at these factors.

1. When testers get involved in the software development process

An immediate indicator of failure is the test team being brought in at the end of the development process, says Stacy Kirk, CEO of QualityWorks Consulting Group, which specializes in placing software test professionals.

“If testers are requested to be in meetings with designers and developers and are in review meetings, there is a higher success in terms of quality,” Kirk says. “But a lot of companies don’t include testers until it’s at the here’s-your-code-and-test-it-now stage.” 

Testers should be involved at the product requirements phase. “Then tests can be derived before code is written, and technical impediments can be surfaced for clarification before the design needs those answers,” says Randy Bloom, a veteran QA engineer who has worked for web development firms such as Yahoo and OpenX. “If testers aren't involved until the technical design, they have to take the requirements on faith and ensure the design is testable, likely referring back to user stories in case of ambiguity.” 

Therefore, a sign of success is the opposite scenario, where the test staff gets involved early – at least before the code has been written! Early involvement matters even more when testers can’t participate in code reviews, for whatever reason; otherwise, tests have to be designed and implemented against a delivered product that may not be well understood, Bloom adds. Those are the projects that are set up to fail, or at least to slip a deadline.
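
To make the requirements-first idea concrete, here is a minimal sketch of what a test derived before any code exists might look like – assuming, purely for illustration, a Python/pytest setup, a hypothetical pricing module with an apply_discount function, and an invented requirement that orders over $100 get a 10% discount. Writing the test early forces the boundary question (is exactly $100 discounted?) to surface before the design needs the answer.

    # Requirement-derived tests, sketched before any implementation exists.
    # The "pricing" module and its apply_discount() function are placeholders,
    # and the discount rule is an invented requirement used for illustration.
    import pytest

    pricing = pytest.importorskip("pricing")  # skips until developers deliver the module

    def test_discount_applied_above_threshold():
        # Requirement: orders over $100 receive a 10% discount.
        assert pricing.apply_discount(150.00) == pytest.approx(135.00)

    def test_no_discount_at_or_below_threshold():
        # Writing this early surfaces an ambiguity: is exactly $100 discounted?
        # The requirement doesn't say – a question to raise before design, not after.
        assert pricing.apply_discount(100.00) == pytest.approx(100.00)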

2. How QA fits into the organization’s structure

How the QA organization operates is more relevant than project complexity, says Bloom. “To succeed consistently, the people in that organization need to understand what they are testing. I couldn't have worked at a one-week cadence (with multiple launches in a given week) if I didn't understand the system's arcane business logic intimately,” he says.

That goes beyond org charts and who reports to whom. Failure is a predictable result when testers’ input isn’t respected, Kirk says. Sometimes that gets said outright – the testers aren’t technical enough – and sometimes testers’ warnings are simply treated as a distraction. Developers who view defects as a process slowdown are apt to miss real problems. “Developers have said to me, ‘Stacy, if you find a problem later on, I’ll fix it. I don’t want to fix it while I’m developing it,’” Kirk says.

3. Oversight of the big picture

Bolton had a formative experience while working as a program manager at Quarterdeck, maker of the famed QEMM memory manager for MS-DOS. The company had developed a product called MagnaRAM, which essentially put a swap file in memory, compressing memory contents instead of writing them out to the slower hard disk.

As Bolton describes it, MagnaRAM worked best for people who didn’t need it (because they had a lot of RAM), but it couldn’t do much for the people who needed it most (because they had very little RAM). That’s a design issue. 

Is that a bug? Not in the technical sense. The software worked as programmed. It just didn’t do what it was supposed to do. 

Some testers fail to take such design flaws into account. 

“A lot of developers think of a bug in terms of coding errors, inconsistency in the product and the spec, or something that stands out as threatening company image. But they don’t necessarily question and re-evaluate the design,” says Bolton. 

“A leading indicator of failure is insufficient critical thinking about the product, a reluctance or refusal to confront the idea there might be problems in the product,” Bolton says. “Success is its opposite: a tolerance and even enthusiasm for finding problems that matter to people. The notion of embracing the possibility of failure.”

4. How comfortable people are about speaking up

So why did Bolton let MagnaRAM ship, knowing the software didn’t do users any good? One word: fear. Fear of management and how they would react to the bad news.

Killing the messenger is no way to lead a company. But the path to failure is littered with messenger corpses. 

“We tend to suppress problems in the moment. We didn’t want to confront the reality of the problems,” Bolton says. “It was a disaster because I didn’t think like a tester, anticipating failure. I was thinking like a builder, envisioning success.”

Kirk concurs, saying that what the team thinks should matter. “I have been on a lot of releases and you can see the tester look anxious, they look almost sick. They don’t look happy that someone on the executive level says ‘I don’t care, let’s go,’” she says. It’s vital for people in the testing organization to be brutally honest, and to be trusted when they report that something is wrong.

5. Who controls the deadlines

A major factor in a project’s success or failure is whether the project has a deadline; if so, how arbitrary that deadline is; and what has been communicated to the public. If a date has been communicated externally, it doesn’t matter how perfect the product is if it comes out after the “promised” date, notes Bloom.

In one case, during an earnings call, the CFO mentioned a release date for a long-running project – a date based on the most optimistic estimate. “Of course the project slipped; the final version was responsible for half of the company’s revenue (hundreds of millions of dollars quarterly), but it shipped late and ‘failed,’” Bloom says.

Typically, aggressive deadlines are an early indicator that a project is in trouble, especially when the people implementing the software know the deadline is unrealistic. That can also be a bit self-fulfilling because a dogged insistence on hitting a date regardless of other factors ruins morale. 

Legendary gaming studio id Software always gave release dates of “when it’s done” during its early, more capricious days. Fans complained about that vagueness, but no one could say their games were buggy on release. Of course, few business developers have the luxury of telling management a product will be done when it’s done. As Bolton experienced, you can often have orders from on high to ship a product, ready or not. 

But if you know the date is a lie? Worry.

6. How many bugs testers encounter

If the project is coming up on its promised ship date, it’s a good sign if you see few bugs and you can run through most or all of your tests without issue. But if the opposite is true – you can’t go five minutes without seeing a bug – you have a problem. Especially if it’s the same bug over and over. “How quickly you find problems is an indication of the overall quality,” Kirk says. Toward the end of the product cycle you don’t want to find bugs. If they keep appearing, it’s time for concern.

If you keep fixing the same bug or type of bug, Kirk adds, or if the same issues recur, testers – and the development team in general – should see it as an indicator that the software was not built right. “There’s something underneath on a foundational level that is broken. And if you keep fixing it again and again there is a problem,” she says.
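
One common way test teams respond to that pattern – a generic practice, not something Kirk prescribes here – is to pin each recurrence with a regression test so the same failure can’t quietly return. Below is a minimal pytest sketch; the datehelpers module, its parse_date function, the ticket numbers, and the two-digit-year bug are all hypothetical, invented for illustration.

    # Regression-test sketch: pin a recurring bug so it cannot silently return.
    # The "datehelpers" module, parse_date(), the DD/MM/YY input format, the
    # ticket numbers, and the two-digit-year bug are all hypothetical examples.
    import pytest

    datehelpers = pytest.importorskip("datehelpers")

    # Each entry records a bug ticket and the input that once broke the parser.
    REGRESSIONS = [
        ("BUG-1412", "01/02/99", (1999, 2, 1)),  # two-digit year once parsed as 0099
        ("BUG-1583", "01/02/00", (2000, 2, 1)),  # same root cause, resurfaced later
    ]

    @pytest.mark.parametrize("ticket,raw,expected", REGRESSIONS)
    def test_recurring_date_bugs_stay_fixed(ticket, raw, expected):
        parsed = datehelpers.parse_date(raw)  # assumed to return a datetime.date
        assert (parsed.year, parsed.month, parsed.day) == expected, f"{ticket} regressed"

A regression list that keeps growing around the same root cause is exactly the foundational signal Kirk describes.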

How many of these situations have you encountered? I dare say it’s been more than one. 

(Full disclosure: I worked at Quarterdeck with Bolton and Bloom during the same time period but in a different department.)

Even if your QA team is dedicated to testing software carefully, you can run into trouble. Read our white paper on the reasons Selenium tests fail.

by Andy Patrizio

Andy Patrizio has been covering Silicon Valley (and tech everywhere else) for more than 25 years, having written for a range of publications that includes InformationWeek, Network World, Dr. Dobb’s Journal, and Ars Technica. He is currently based in Orange County, California.