Failure is inevitable. Projects fall apart. Clients leave. Sometimes circumstances outside of the team's control doom an effort to failure. But every failure creates a choice. Teams can take the loss at face value, or they can learn from it and get better.
Reflection on failure can be a powerful growth tool for individuals and teams. In one study, participants who spent just 15 minutes a day reflecting on their work improved their performance by 20 percent compared to those who spent the same 15 minutes working. And for teams, the benefits compound.
So, in this article, we'll explore how to reflect on and learn from failure by outlining how to conduct an after action review with your team. This tool was first created by the military, which used it to break down missions after the fact and extract lessons to improve future ones. But its intent and format work in a variety of settings.
This review should generally happen at the end of a project or at a pivotal milestone. And it centers on five key questions that get discussed as a team.
What was our intended result?
The first question to ask during an after action review is "What was our intended result?" Ideally, the team started the project with a set of clearly defined objectives, perhaps even a team alignment meeting. So, this question is really just a review of what was decided at the beginning. But it's important to start here, because the longer a project lasts, the more likely people are to shift the goalposts (accidentally or intentionally) in their minds. To honestly answer the questions that follow, teams need to bring everyone back to the start and reach consensus on the original intent of the project.
What was the actual result?
The second question to ask during an after action review is "What was the actual result?" Whatever metrics the team used to state the intended result, this question asks them to review the actual result according to those same metrics. This matters because, just as people shift goalposts, they also tend to reframe failures as successes by changing how the result is measured. But to get a true comparison, teams need to stay true to the original, intended metrics…