We like to use percentages to compare performance (“footfall is 12% up”; “retail sales increase by 5%”) – they make good headlines and quickly summarise how well we’ve performed.
However, percentages can be quite misleading, and even well-intentioned analyses can be off the mark. If you are measuring and reporting on your performance, it's important to consider exactly how you arrive at your figures.
A Quick Example
For the purposes of demonstration, let's take the high street in the fictional town of Argleton. Its figures are nice and straightforward: every week is consistent, with a busier Friday and Saturday.
Now, say the town centre manager runs an event between Thursday and Saturday. It’s a success, and draws in extra visitors.
Next, the town centre manager loads up their spreadsheet and looks at the detail. They compare ‘normal’ performance when no event was running, versus footfall recorded during the event.
It might be crude, but this is a reasonably standard procedure for event-holders looking for a quick summary of their efforts. Here’s what they get:
What we see here is that on each of the three event days, footfall rose by 2,000 visitors. The percentage change differs from day to day, because 2,000 extra visitors is a much bigger deal on a quiet Thursday than on a busy Saturday.
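The per-day arithmetic can be sketched in a few lines. The daily baselines below are illustrative assumptions, reconstructed to match the figures the article quotes (a 20% Thursday rise on +2,000 visitors implies a 10,000 baseline; the 70,000 three-day total and 11.7% average pin down the rest):

```python
# Illustrative baselines (assumed, not from the original table)
baseline = {"Thursday": 10_000, "Friday": 20_000, "Saturday": 40_000}
uplift = 2_000  # extra visitors per day during the event

for day, normal in baseline.items():
    with_event = normal + uplift
    pct = (with_event - normal) / normal * 100
    print(f"{day}: {normal} -> {with_event} ({pct:.1f}% up)")
```

The same absolute uplift of 2,000 produces 20%, 10% and 5% daily changes respectively, which is exactly the spread the next sections wrestle with.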
Great! The event has brought in extra visitors, and it’s time to write the press release. The question is – which snappy figure do we quote? What was the ‘actual increase’ of the event?
Take the highest figure?
Thursday has the highest percentage change. 20% makes an impressive headline, but only because a boost of 2,000 visitors is a significant difference on an otherwise quiet day.
Quoting “Event brings 20% increase in footfall” is accurate only for one of the three days, and certainly not representative of the whole event.
Take an average of the changes?
A slightly more sophisticated approach is to sum the three percentages and divide by three (the number of days) to give the average – which is around 11.7%.
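As a quick sketch, this is the calculation being made (the 10% and 5% figures are assumed daily changes consistent with the article's 20% Thursday and 11.7% average):

```python
# Per-day percentage changes: 20% is stated; 10% and 5% are assumed
# values consistent with the quoted 11.7% average.
daily_pcts = [20.0, 10.0, 5.0]
average = sum(daily_pcts) / len(daily_pcts)
print(f"Average of the daily changes: {average:.1f}%")  # -> 11.7%
```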
If you’ve been using this method, you are certainly not alone. To many, this ‘feels’ like a reasonable way of getting to the headline figure and – while it’s closer – it’s still inadequate.
When we apply that claimed increase back to the original figures, we find it is inflated. The event brought in 6,000 extra visits, but an 11.7% increase on the baseline works out at over 8,000. Quite a difference.
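A minimal check makes the inflation obvious, using the 70,000 three-day baseline quoted later in the article:

```python
# Baseline Thursday-Saturday total (from the article) and the
# average-of-percentages figure computed above.
baseline_total = 70_000
claimed_pct = sum([20.0, 10.0, 5.0]) / 3          # ~11.7%
implied_extra = baseline_total * claimed_pct / 100
print(f"Implied extra visitors: {implied_extra:.0f}")  # ~8167, not the 6000 observed
```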
Work out the total change?
The total footfall from Thursday to Saturday when the event is not running is 70,000. With the event it is 76,000, confirming that the event brought in some 6,000 extra visitors.
The percentage from this is around 8.6%. Since we are considering the entirety of the event, across all of the days it ran on, this figure is much closer to ‘the truth’ than the other options.
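This method uses only the two totals stated above, so there is nothing to assume:

```python
# Thursday-Saturday totals from the article
without_event = 70_000
with_event = 76_000

extra = with_event - without_event        # 6,000 extra visitors
pct = extra / without_event * 100         # ~8.6% overall increase
print(f"{extra} extra visitors, a {pct:.1f}% increase")
```

Because the percentage is taken over the whole period's baseline, each day contributes in proportion to its actual footfall, rather than each day's change counting equally as in the averaging approach.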
Using three fairly trivial methods, we’ve arrived at three quite different percentages – 20%, 11.7% and 8.6% – to describe the success of the event.
The last figure is – generally speaking – the closest to reality, but it's also the least attractive of the three. It also fails to reflect the impressive boost on Thursday, something which should not go unrecognised.
If you are analysing figures, be very careful about the methods you use. It’s not at all trivial. At worst, you may end up with numbers that vastly over-sell the performance of events. This could erode trust with levy payers and stakeholders, particularly if their own evidence and experiences fail to agree.
Percentages are useful gauges of progress, but their context is very important – as are the methods used to reach them. In the event reports we produce, for instance, we quote change both as a percentage and as a figure, so readers can see the effects both in proportional and absolute terms.
For the reader, this reduces the ambiguity of the percentages quoted and gives them confidence in the assessments being made. The figures might not be as attractive at first glance, but will pay dividends in the long term.
This post was originally published on LinkedIn in 2015