The session consisted of the speech itself, followed by panel comments from Sushil Wadhwani (economist and former MPC member), Chris Giles (Economics Editor at the Financial Times) and
Melanie Baker (UK economist at Morgan Stanley), followed by questions from the audience to the speaker and panel.
Martin Weale's speech was fascinating enough in its own right. However, this is not a blog on economics or monetary policy, and so I will not even attempt to do it justice here. (If you're interested, you can read the speech itself here, and the Resolution Foundation's write-up of the event here.) Rather, I will pick up on two related points Sushil Wadhwani made in his remarks, which I think have direct pertinence to business strategy.
1. Having the data is not enough
Martin showed two separate charts: one showing the weakening of the exchange rate, and the other showing the fall and recovery in the FTSE100, and the fall without recovery in the FTSE250, immediately following the referendum. He suggested that the fall in the FTSE250 was more representative of sentiment regarding the UK, because so much of the FTSE100 consists of expected foreign earnings from multinationals with UK listings. He then suggested that the fall in the FTSE250 was not significant enough to allow one to draw conclusions (and did little more than confirm that "prices can go down as well as up").
Sushil countered that a better measure of confidence in the UK economy would be the FTSE250 in dollar terms. This had taken a pounding following the referendum, and painted a much more negative outlook than Martin had suggested. (This effectively combines the two charts.)
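To make Sushil's point concrete, here is a minimal sketch of that combination, using made-up numbers rather than the actual 2016 figures: a sterling-denominated index can look only slightly lower while its dollar value has fallen sharply, because the index move and the currency move compound.

```python
# Illustrative only: hypothetical index levels and GBP/USD rates, not real data.
index_gbp_before, index_gbp_after = 17000.0, 16500.0   # hypothetical FTSE250 levels in sterling terms
gbp_usd_before, gbp_usd_after = 1.48, 1.30              # hypothetical GBP/USD exchange rates

# Re-express the index in dollar terms by applying the exchange rate.
index_usd_before = index_gbp_before * gbp_usd_before
index_usd_after = index_gbp_after * gbp_usd_after

fall_gbp = index_gbp_after / index_gbp_before - 1       # roughly -3% in sterling terms
fall_usd = index_usd_after / index_usd_before - 1       # roughly -15% in dollar terms

print(f"Fall in GBP terms: {fall_gbp:.1%}")
print(f"Fall in USD terms: {fall_usd:.1%}")
```

With these hypothetical figures, the sterling chart alone shows a modest dip, while the dollar-terms view shows a fall roughly five times larger.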
Whether you agree with Martin or Sushil, the exchange was a potent reminder that it's not just what data you have, but also how you analyse it.
It reminded me of a project I worked on some years ago, where the data we were seeing showed a slight decline in the performance of a particular process. Because the decline appeared to be only slight, it was not ringing any alarm bells (yet). However, I knew that the process (a) dealt with six discrete populations and (b) included a natural delay of some months. So I requested the underlying source data, (1) split it into its constituent populations, and (2) ran a batch cohort analysis of each. This analysis revealed a much deeper decline in process effectiveness - up to 50% for some populations. That definitely started the alarm bells ringing!
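For readers who like to see the shape of that breakdown in code, here is a rough sketch of the split-by-population, cohort-by-cohort analysis described above. The file name and column names are hypothetical (the original project's data looked nothing like this), but the idea is the same: group by population and by entry cohort rather than looking at a single blended figure.

```python
# Hypothetical dataset: one row per case, with the population it belongs to,
# the month it entered the process (its cohort), and whether it ultimately succeeded.
import pandas as pd

df = pd.read_csv("process_outcomes.csv")  # illustrative file name, not the real source data

# Success rate per population and per entry cohort, instead of one blended rate.
cohorts = (
    df.groupby(["population", "cohort_month"])["succeeded"]
      .mean()
      .unstack("cohort_month")
      .sort_index(axis=1)
)

# Because the process has a built-in delay of some months, compare the latest cohort
# in the data (in practice, restricted to cohorts old enough to have matured)
# against an earlier baseline cohort, population by population.
baseline = cohorts.iloc[:, 0]
latest = cohorts.iloc[:, -1]
decline = latest - baseline

# A small overall decline can mask a much larger one in individual populations.
print(decline.sort_values())
```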
Yes, a subtly flawed analysis can render data very misleading!
2. You'll never have all of the data or analysis
Sushil further responded to Martin Weale's conclusion that the data was still inconclusive by remarking that:
"You have to form judgments; because you are never as well informed as you would like to be; because the data is simply not there." (Tweet this!)He went on to advise:
"Resist the temptation to wait for more data before acting. There will always be more data to wait for." (Tweet this!)(That is my best recollection of the words that he used, but I cannot guarantee that it is verbatim.)
That is one of the key lessons I remember from the many case studies we did on my MBA programme. (I sometimes think that part of the objective of an MBA programme is to overload you with case studies and then put you on the spot in class, in front of your peers, to draw conclusions from what is inevitably inadequate data, because that is the closest a classroom can get to what it feels like in real life!)
It's a lesson that has stood me in good stead ever since. In any strategic process, there is a time to collect more data, a time to conduct deeper analysis, and a time to accept that what you've got is good enough (or as good as you're going to get) and that it's time to make some decisions and move forward.
If you fail to learn that lesson, you invariably fall into 'paralysis by analysis': where data and analysis snowball and any chance of meaningful action falls by the wayside.
Conclusion
Data is a strategist's friend. It is the bedrock of analysis, reasoned decision making and feedback. But it is not without its pitfalls. Good use of data is as much an art as it is a science. And it is one every strategist does well to study carefully.