Iron Man Explains QA Testing

As a frequent and avid consumer of software, I rarely think about where software comes from or how the developers got it to work semi-perfectly for my use. For all I know, someone had an idea one day, sat down at a desk and coded until it was ready. But like most other forms of creative expression, software is never perfect the first time. You have to edit, re-edit, and test it in all kinds of ways to ensure that it does what it is supposed to do. This process is pretty opaque to the non-software developer, but there is a terrific movie analogue to it: Tony Stark and the development of the Iron Man suit in Iron Man (2008). The famous suit didn’t spring like a fully formed Athena from Stark’s imagination; he built one version, tested it, made changes, built a second version, tested it, made changes, and built a third version. This obsessive, methodical testing and retesting is the essence of software quality assurance (QA).

Blowing Smoke

Smoke testing is all about making sure the simpler parts of the application work, manually or by automation. It is one of the first testing steps you take before you get into the more complex tests. When Tony Stark takes the Mark I suit out to do battle with his captors, he is testing only the most basic functionalities. Will it protect him from bullets? Does the flamethrower work? Can it fly? If so, it passes the test; he’ll deal with the details later. This is slightly different from sanity testing, which occurs after a developer has submitted a refined version of a software build and the testers check whether the developer’s changes make sense on a superficial level.
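In code, a smoke test just checks that the basics run at all. Here is a minimal sketch in Python; the `Suit` class and its methods are invented stand-ins for whatever system is under test, not anything from a real testing library.

```python
# Smoke-test sketch: check only that the basic functions run and return
# something sensible. The Suit class here is a made-up stand-in.

class Suit:
    """Hypothetical system under test (the Mark I, so to speak)."""
    def power_on(self):
        return True
    def fire_flamethrower(self):
        return "whoosh"
    def fly(self):
        return 100  # altitude in feet

def smoke_test(suit):
    """Pass/fail on the most basic functionality; details come later."""
    results = {
        "powers_on": suit.power_on() is True,
        "flamethrower": bool(suit.fire_flamethrower()),
        "flies": suit.fly() > 0,
    }
    return all(results.values()), results

ok, details = smoke_test(Suit())
print(ok)  # True: this build is worth testing further
```

If any of these basics fail, there is no point running the deeper test suites on this build.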

When Tony gets home, designs the Mark II, and begins to build it, he conducts unit testing with the individual components; this is what happens when he straps the first repulsor unit onto his arm and fires it, blowing himself across the room by mistake. With unit testing, you break the system down into testable modules and functionalities, and then test each part one by one. The advantage is that when there’s a bug, the test is small enough that you know exactly where it is; since Tony was testing the repulsor and not the whole gauntlet, he knew the problem was in that subsystem and not in some other part of the gauntlet.
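A unit test in Python might look like the sketch below, using the standard library’s `unittest` module. The `repulsor_thrust` function and its numbers are invented for illustration; the point is that each test exercises one small piece in isolation.

```python
# Unit-test sketch with Python's built-in unittest module.
# repulsor_thrust and its linear model are made up for illustration.
import unittest

def repulsor_thrust(power_pct):
    """Return thrust in newtons for a power setting of 0-100%."""
    if not 0 <= power_pct <= 100:
        raise ValueError("power must be between 0 and 100")
    return power_pct * 42.0  # invented linear model

class TestRepulsor(unittest.TestCase):
    def test_half_power(self):
        self.assertEqual(repulsor_thrust(50), 2100.0)

    def test_rejects_overload(self):
        # The setting that blew Tony across the room should be refused.
        with self.assertRaises(ValueError):
            repulsor_thrust(150)

if __name__ == "__main__":
    unittest.main()
```

Because each test targets one function, a failure points straight at the broken subsystem.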

Putting the Pieces Together

The next logical step is integration testing, which in our example means testing the gauntlets. Integration testing is the phase in software testing in which, according to Wikipedia, “individual software modules are combined and test[ed] as a group.” When Tony straps on both boots and both gloves and begins flight testing, this is integration testing at work; the whole suit isn’t ready, but he’s making sure these modules all work when the subsystems are put together and he actually tries to fly.
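As a sketch, integration testing means wiring up modules that each pass their own unit tests and checking that they behave together. The `Boots`, `Gauntlets`, and `FlightSystem` classes below are invented for the analogy.

```python
# Integration-test sketch: two modules that work alone, tested as a group.
# All classes and numbers here are invented for illustration.

class Boots:
    def thrust(self):
        return 500.0  # newtons

class Gauntlets:
    def thrust(self):
        return 250.0  # newtons per hand; there are two hands

class FlightSystem:
    """The 'group': stable flight needs the boots plus both gauntlets."""
    def __init__(self, boots, gauntlets):
        self.boots = boots
        self.gauntlets = gauntlets

    def total_lift(self):
        return self.boots.thrust() + 2 * self.gauntlets.thrust()

    def can_hover(self, weight=900.0):
        return self.total_lift() >= weight

def test_flight_integration():
    system = FlightSystem(Boots(), Gauntlets())
    assert system.total_lift() == 1000.0
    assert system.can_hover()

test_flight_integration()
print("integration test passed")
```

Each module’s thrust was already unit-tested; the integration test catches problems that only appear when the subsystems have to cooperate.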

System testing is more akin to the first test flight, which is about seeing if the entire thing works under controlled conditions – take the Iron Man Mark II out for a spin and see if it works. This is subtly different from functional testing, which is about assessing whether the software or suit that’s been developed fits the business requirements that were given to the developers. In other words, Tony would be checking if the suit he’s developed solves the problems he had, namely a) needing to stay alive and b) having a fantastic robot suit.
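A functional test, in this framing, checks the product against the stated business requirements rather than its internals. The sketch below invents a trivial `build_suit` stand-in and asserts Tony’s two requirements from the paragraph above.

```python
# Functional-test sketch: verify the product against business requirements.
# build_suit and its flags are invented stand-ins for the real product.

def build_suit():
    """Stand-in for the developed product."""
    return {"keeps_pilot_alive": True, "is_fantastic_robot_suit": True}

def test_functional_requirements():
    suit = build_suit()
    assert suit["keeps_pilot_alive"]        # requirement a) staying alive
    assert suit["is_fantastic_robot_suit"]  # requirement b) fantastic robot suit

test_functional_requirements()
print("business requirements met")
```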


Performance testing is an interesting area with lots of sub-fields, all of which involve measuring how the software performs under the kinds of user loads it might see in the real world. The most basic is load testing: testers simulate a normal number of users, then the maximum expected number of users, and then insane numbers of users just to see when the application fails. That last part is stress testing, and if you guessed that that was Tony trying to break the fixed-wing altitude record and nearly wrecking his suit in the process, then you are today’s lucky winner! Please proceed to the ticket booth to receive your free Iron Man super-pacemaker and complimentary wrist lasers.

(There are other kinds that don’t quite fit, too. Soak testing involves saturating the application with high numbers of users for a long time, while spike testing is about seeing what happens when there’s a sudden massive demand on the system. Configuration testing, meanwhile, is about seeing what happens when the system is configured in any number of different ways.)
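The load-then-stress progression can be sketched as a loop that ramps up simulated users until the system breaks. The `handle_requests` service model and its capacity below are invented; real load testing uses dedicated tools, but the idea is the same.

```python
# Load/stress-test sketch: ramp simulated users until the service fails.
# handle_requests is a toy model; the capacity of 1000 is invented.

def handle_requests(concurrent_users, capacity=1000):
    """Toy service: works fine up to capacity, then falls over."""
    if concurrent_users > capacity:
        raise RuntimeError("service crashed")
    return True

def find_breaking_point(start=100, step=100):
    """Increase the load step by step until the service fails."""
    users = start
    while True:
        try:
            handle_requests(users)  # load test at this level
        except RuntimeError:
            return users            # stress test found the limit
        users += step

print(find_breaking_point())  # 1100 with the toy capacity above
```

Everything up to the breaking point is load testing; deliberately pushing past it is the stress test.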

After Tony is nearly killed when the Iron Man suit is frozen at high altitudes, he decides to add an ice-resistant paint coating. It’s not shown onscreen, but one assumes that his AI ran simulations to make sure that the paint wouldn’t interfere with any other essential functionalities of the suit. That’s the gist of regression testing. When anything is changed within an application, either to fix an existing bug or to add a new functionality, there’s always a risk that the changes being made to part of the existing code will throw some other piece of code out of whack and create more bugs. Regression testing accounts for that.
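The discipline of regression testing is simply: after any change, rerun the whole existing suite, not just the test for the new feature. The sketch below invents a dictionary-of-flags suit and two tiny suites to show the shape of it.

```python
# Regression-test sketch: after adding a feature (the anti-icing coating),
# rerun the old tests too. The suit dict and its flags are invented.

def existing_suite(suit):
    """Tests that passed before the change; they must still pass."""
    assert suit["flies"], "regression: flight broke"
    assert suit["stops_bullets"], "regression: armor broke"

def new_feature_test(suit):
    assert suit["ice_resistant"], "new coating not working"

# The change: add the coating without touching the other features.
suit = {"flies": True, "stops_bullets": True, "ice_resistant": True}

existing_suite(suit)    # old behavior still intact
new_feature_test(suit)  # new behavior works
print("no regressions")
```

If the coating had somehow flipped `flies` to `False`, the old suite would catch it immediately, which is exactly the point.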

Testing on the Fly 

End-to-end testing is like system testing, but more so. The system is tested on how well it works with the databases, hardware, and anything else it would work with in the real world. Instead of taking the suit out for a spin in the relatively benign environment of southern California, Tony takes it to (presumably) Afghanistan to fight bad guys, where it has to interact with enemy infantry, armor, and missiles as well as “friendly” American airplanes. Although Tony’s run is for real, the effect is that of an exaggerated test. This is also sort of like ad hoc testing, which is the equivalent of your boss making a surprise inspection. How does the suit respond when a tank is shooting at it? It works? Okay, good. You’re ready for the Big Bad at the end of the movie.

User acceptance testing (UAT) is usually one of the last steps in the process; it’s about testing the software under real-world conditions, presenting it to users and seeing if it does what the users wanted it to do, that is, if it fulfills the business requirements and provides them the appropriate functionalities. In Iron Man, the user is the designer, so this really happened throughout the testing process.

Several of these tests can also be classified as either black-box or white-box. Black-box testing tests the application without looking at the code inside or having any knowledge of how it works; the testers are concerned only with what the application does (i.e., it stops bullets and saves people and flies thousands of miles). Conversely, white-box testing is done with full knowledge of the application’s internal structure, deliberately exercising its code paths rather than judging what it does in the real world; this is like J.A.R.V.I.S.’s internal diagnostics, when he complains that “there are still terabytes of calculations needed before an actual flight” before Tony jams the thing into second gear and screeches down a tunnel.
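The contrast shows up even against the same function. Below, the invented `stabilize` function is tested twice: the black-box assertions only care about inputs and outputs, while the white-box assertions deliberately exercise the internal clamping branch because the tester knows it is there.

```python
# Black-box vs. white-box sketch against one invented function.

def stabilize(roll_deg):
    """Correct the suit's roll; clamps the correction to +/-10 degrees."""
    correction = -roll_deg
    clamped = max(-10, min(10, correction))  # internal branch
    return clamped

# Black-box: judge only inputs and outputs, no peeking inside.
assert stabilize(5) == -5
assert stabilize(90) == -10

# White-box: we know the clamp exists, so we hit it from both sides.
assert stabilize(-90) == 10
assert stabilize(-3) == 3

print("both views pass")
```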

Positive/negative testing is very simple: is the application doing what it’s supposed to do (positive), and is it refraining from doing what it’s not supposed to do (negative)? The Iron Man suit does a great many things that it’s intended to do, like shoot missiles and fly, but if Tony tells it to travel through time, it really should not do that. Doing so would be a failure of the negative test, and probably a sign that Tony has entered another dimension.
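As a sketch, a positive test asserts that a supported command succeeds, and a negative test asserts that an unsupported one is refused. The `execute` function and its command set are invented for the example.

```python
# Positive/negative-test sketch against an invented command interface.

def execute(command):
    """Toy command handler: refuses anything it wasn't built to do."""
    allowed = {"fly", "fire_missile", "land"}
    if command not in allowed:
        raise ValueError(f"unsupported command: {command}")
    return f"executing {command}"

# Positive test: it does what it should.
assert execute("fly") == "executing fly"

# Negative test: it must refuse what it shouldn't do.
try:
    execute("travel_through_time")
    raise AssertionError("negative test failed: time travel accepted")
except ValueError:
    pass  # correctly refused

print("positive and negative tests pass")
```

Note that the negative test fails if the forbidden command *succeeds*; the exception is the behavior we want.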

Patches and Future Suits 

Iron Man doesn’t stop advancing once the movie is over, just as the release of a software product doesn’t mean that the developers are done with it forever. They release patches to fix bugs that show up in the real world and to add functionality. So does Tony. The suit in Iron Man 2 comes in a suitcase, the suit in Avengers can be summoned with magic bracelets and donned literally on the fly, and so on and so on. For Stark and for software developers, the existing product can always be improved upon, even after the product is out the door.

By Andy Tisdel
