Just over a month ago the Office for Budget Responsibility (OBR) produced their latest report on the economic and fiscal outlook which said:
“We still expect the economy to avoid a technical recession with positive growth in the first quarter of 2012.”
Last week, less than six weeks after the publication of the OBR report, the Office for National Statistics (ONS) released the UK’s GDP growth figure for Q1 2012: the UK is in recession again.
This demonstrates just how difficult economic forecasting is at the moment. Given everything going on that affects the UK economy, from the euro crisis to the US presidential election to the price of oil, any forecast is almost certainly wrong.
And the OBR know this. So why do they bother?
As Andrew Dilnot, the Chair of the UK Statistics Authority said in a recent interview on the BBC:
“We all know that all forecasts are wrong. So if we could avoid forecasting we’d all love to avoid it. But the trouble is that we also have to plan. Governments have to plan how much public spending there will be and how much tax they’ll raise. Individuals have to know roughly what situation they will be in.”
However, much of what makes a prediction ‘wrong’ is how it is expressed and how it is interpreted.
The OBR does publish a central forecast for GDP growth, which is the number that gets reported, analysed, rubbished and ultimately revised.
However, as Andrew Dilnot went on to explain: “Much of the art comes in explaining how much uncertainty there is”. The OBR provides a ‘fan chart’ that shows the range of probabilities associated with different estimates of growth.
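The idea behind a fan chart can be sketched in a few lines of code: rather than publishing one number, simulate many plausible growth paths and report the percentile bands they fall into at each horizon. This is purely illustrative; the central forecast, volatility, and random-walk structure below are made-up assumptions, not the OBR’s actual model.

```python
import numpy as np

# Illustrative sketch of a fan chart's underlying data.
# All parameter values are assumptions for demonstration only.
rng = np.random.default_rng(42)

central_forecast = 0.8   # assumed central growth estimate (% per quarter)
volatility = 0.5         # assumed standard deviation of quarterly shocks
quarters = 8
n_paths = 10_000

# Each path is the central forecast plus accumulated random shocks.
shocks = rng.normal(0.0, volatility, size=(n_paths, quarters))
paths = central_forecast + np.cumsum(shocks, axis=1)

# The 'fan': percentile bands at each horizon, not a single number.
bands = np.percentile(paths, [10, 30, 50, 70, 90], axis=0)
print(bands.shape)  # five bands across eight quarters
```

Plotting those bands as nested shaded regions produces the familiar fan shape: narrow near today, widening as the horizon extends and uncertainty compounds.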
This is not as satisfying as a single number that can be proved right or wrong but it is the essence of prediction in complex systems such as an economy.
We confront the same situation every day as individuals when we read the weather forecast.
The weather is a persistent topic of conversation in the UK because it is so changeable and many of our plans are dependent on what it might be.
Our frustration with weather forecasting is based on the perception that “they always get it wrong”. The problem is that the weather, like the economy, is a highly complex and dynamic system. You cannot predict exactly what is going to happen.
As such, a weather forecast is expressed in terms of probabilities rather than certainties. However, we and the media tend to interpret an ‘80% chance of rain’ as ‘it will rain’ – and, more significantly, to conclude that if it does not, the forecast was “wrong”.
Physicists can predict with near-absolute certainty where the moon will be tomorrow, next month and next year. We crave that kind of scientific certainty. Unfortunately, the weather and the economy do not work like that and cannot be understood in those terms.
Consumer markets are similarly complex, interdependent and unpredictable.
The simple answer to the question of how many additional sales a particular marketing programme will generate is ‘it depends’. It depends on the economy, on what the competition does, on whether consumers recommend the product, and on a myriad of other interacting factors. It may even depend on the weather.
The impact of any marketing strategy on consumer behaviour is inherently uncertain.
We build models and simulations to narrow that uncertainty. And by doing so we can establish a clearer context in which organisations can make better decisions about their marketing strategy.