Several months ago we started looking at the Jira Version Report as a means to produce release burn-ups so that we could begin to forecast our deliveries using live Jira data. During the implementation of this chart, questions were raised about what data lies behind it and what events affect the way in which the chart is drawn. Many investigations later, and with great support from Atlassian, I can summarise everything that I have learnt about the Jira Version Report.
How the team velocity is calculated
The velocity of the team is fundamental to the forecast that the chart makes. The calculation divides the total of completed story points by the number of working days to produce the average velocity per day. It counts the days from the start date of the version, not of any sprints.
Explaining the prediction strategy
The process would be as follows:
Count the number of days between the start of this version and the current date (note: the start of the version, not of any sprints)
Sum up all points completed until today
Divide the latter by the former and you will have an estimated velocity (points completed per day).
Simply keep adding this daily figure until the running total reaches the total estimated points in the backlog.
If all weeks have the same number of working days, the prediction will be linear.
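Atlassian doesn't publish the exact implementation, but the steps above can be sketched roughly as follows. The function names and the Monday-to-Friday working week here are my own assumptions (Jira takes its working days from the board configuration), so treat this as an illustration of the arithmetic rather than the real code:

```python
from datetime import date, timedelta

# Assumption: Monday-Friday working week; Jira actually uses the board's
# working-days settings.
WORKING_WEEKDAYS = {0, 1, 2, 3, 4}

def working_days_between(start, end):
    """Count working days from start (exclusive) to end (inclusive)."""
    return sum(1 for i in range(1, (end - start).days + 1)
               if (start + timedelta(days=i)).weekday() in WORKING_WEEKDAYS)

def forecast_completion(version_start, today, points_completed, total_points):
    """Project a completion date from the average velocity per working day.

    Requires points_completed > 0, otherwise the projection never finishes.
    """
    # Step 1: days since the version started (the version, not any sprint).
    elapsed = working_days_between(version_start, today)
    # Steps 2-3: points completed so far divided by elapsed working days.
    velocity = points_completed / elapsed
    # Step 4: keep adding the daily velocity until the backlog total is reached.
    remaining = total_points - points_completed
    day = today
    while remaining > 0:
        day += timedelta(days=1)
        if day.weekday() in WORKING_WEEKDAYS:
            remaining -= velocity
    return day
```

With a constant number of working days per week, the projected line this produces is straight, which matches the typical shape of the chart.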
Setting the start date of the fix version
If the Fix Version in Jira has no start date set, then the graph will begin from the date on which the fix version was first associated with an issue. If this is some time before work began on the version, then the start date of the version should be set to the start date of the first sprint in which issues associated with that fix version were progressed.
Here's an example from one of my teams. We had a Fix Version which was created quite early on by the Product Owner as he planned a new set of features for the team to work on. It was some time before the team actually began progressing any of these stories in a sprint and the Version Report looked like this, with a predicted completion date of 27th Oct:
This chart was using the default start date of the date on which the fix version was first associated with a story. When I changed this to set the start date of the fix version to the date of the first sprint in which we progressed the stories for this version, the chart looked like this:
Our prediction date had changed considerably, to 24th Aug. This is because the first chart had been including the period from February to April, even though the team were not progressing any of the stories. As a result, the prediction was much further out, because Jira was counting several weeks of zero velocity against the version. Once the start date was set to the start of the first sprint which included this version, the prediction adjusted accordingly.
What makes the chart bend
Generally, the prediction line of the chart tends to be straight. However, under some circumstances, the line will bend at 'Today'. This can happen in either direction, inflecting upwards or downwards depending on the event which has caused it. The following are the events which I have found can cause the bend.
1. Non-working days
If there are more non-working days set on one side of “Today” than on the other, then the graph will bend accordingly, as it will take a different number of calendar days in the second half to complete the same number of points as in the first half.
For example, the graph will bend up slightly when there are more 'non-working days' in the first half, prior to “Today”, than in the second half.
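As a rough illustration of why this happens (all figures here are invented), the slope of the line in calendar time is the working-day velocity scaled by the density of working days in each half:

```python
# Illustrative only: a team completing 2 points per working day.
points_per_working_day = 2.0

# First half: 14 calendar days containing only 6 working days (extra holidays).
slope_before = points_per_working_day * 6 / 14   # points per calendar day

# Second half: 14 calendar days with the usual 10 working days.
slope_after = points_per_working_day * 10 / 14   # points per calendar day

# slope_after > slope_before, so the line bends upwards at "Today".
```

The velocity per working day is unchanged; only the calendar-day density of working days differs, which is what produces the kink.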
2. Closing and re-opening an issue
There is an open bug with Atlassian (GHS-11735) whereby, instead of subtracting the story points of a reopened issue, Jira keeps adding them up. This results in an incorrect prediction: the inflated velocity assumes that the team will speed up in the upcoming weeks and reach the goal more quickly. Closing and re-opening issues therefore causes the line to bend upwards.
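The exact mechanics of the bug aren't documented, but its effect can be modelled as a simple double-count. The five-point issue and the event sequence below are invented for illustration:

```python
# Hypothetical model of the GHS-11735 behaviour: an issue worth 5 points
# is closed, re-opened, and closed again.
events = [("closed", 5), ("reopened", 5), ("closed", 5)]

correct = 0  # what the completed total should be
buggy = 0    # what the double-counting produces

for event, points in events:
    if event == "closed":
        correct += points
        buggy += points
    elif event == "reopened":
        correct -= points  # expected: the points are no longer complete
        # buggy: the points are not subtracted, so the next close
        # counts them a second time.

# correct ends at 5, buggy at 10: the inflated completed total raises the
# computed velocity, bending the prediction line upwards.
```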
3. Removing the fix version from a closed issue
Removing the fix version from an issue which is already closed causes the line to bend upwards. (This does not happen with issues that are open, only with those which are already closed when the change is made.)
It's worth being very cautious about these events, as the bend in the chart is often not reversible. Here's an example from one of my teams, which resulted from removing the fix version from a number of closed stories:
How are scope changes handled?
Changes to scope other than those described above seem to be handled in a consistent and expected way. This includes the following:
Increasing / reducing the story points on an issue
Adding an issue to the fix version or removing the fix version from an open issue
When such a change is made, this alters the target scope on the right-hand side of the graph so that the prediction line intersects at an earlier or later point, thus giving an updated forecast date. See the following example, where the scope was reduced, giving an earlier forecast for completion:
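A back-of-the-envelope sketch of how a scope reduction moves the intersection point (all figures invented):

```python
velocity = 1.5      # illustrative: points completed per working day
points_done = 30    # illustrative: points completed so far

def working_days_to_finish(total_points):
    """Working days until the prediction line meets the scope line."""
    return (total_points - points_done) / velocity

original = working_days_to_finish(90)  # original scope: 40 working days left
reduced = working_days_to_finish(75)   # after removing 15 points: 30 days left
```

The velocity (the slope of the line) is untouched; only the target on the right-hand side moves, so the forecast date shifts earlier or later without bending the line.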
Estimating the full scope of the project
The Version Report will only forecast for the issues that have been estimated and, whilst some of the work remains un-sized, the graph will not reach 100% (see above screenshot for an example). There is an open bug with Atlassian for this issue (GHS-12137): the graph keeps increasing the range of story points as more points are committed. In this way, the percentage of progress on the project can never reach 100%, as the chart will always set a range of story points higher than the team has committed in the stories.
I have found the Version Report to be a very useful tool in the prediction of the work that my teams are progressing. However, for the report to be widely adopted in our organisation, I think it's essential that we can understand how it is constructed so that we can trust its output. With very little documentation available online about this chart, I hope that the details that I have collated here can shed a little more light. Happy forecasting!