The holy grail for most organisations offering executive education interventions is knowing, and being able to justify, whether they really work. Reflecting on what traditionally happens, this paper sets out to propose a more imaginative and relevant way of tracking the success (or otherwise) of executive education interventions. We suggest the focus needs to be more on what is learnt, not how people feel, and more in tune with the means of design, not just the ends.
A recent McKinsey report, “Why Leadership Development Programmes Fail”, highlighted the following research findings:
Professor Jeff Pfeffer of Stanford University, in his article “Getting Beyond the BS of Leadership Literature”, states the US spend could actually be as much as $50bn. He goes on to say: “Leaders aren’t doing a good job for themselves or their workplaces, and things don’t seem to be improving.”
From this evidence, a conscientious Leadership Development professional may be perfectly entitled to conclude that investing in Leadership Development Programmes and interventions is an expensive waste of time and money with questionable returns.
The counter argument is that whilst the cost of Executive Education may appear high, the cost of Executive ignorance is significantly higher. That doesn’t, however, negate the need for the pursuit of more helpful, relevant and accurate measures of success.
In his article “The Corporate Leadership Landscape”, Tim Coburn highlights the complex contextual environment in which today’s executives have to operate. There is no doubt they live in a world where the shelf life of knowledge is getting ever shorter and the need to learn continuously is a must, not a nice-to-have. If executives are to remain valuable, they need to focus on the complex, not the repetitive; the unknown, not the known; and increasingly on the future, not today… because “today” is being standardised, outsourced, automated and digitised.
Whichever report you choose to read, there is little doubt that executive development is crucially important. The growing dilemma facing all those responsible for it is: how do we know if our investment will provide the value and returns we seek? The answer, of course, is that there is no simple answer. Having said that, we would like to show that there are many surrogate measures which can help point the way and which, taken in the round, provide strong evidence of progress.
The Purpose of Executive Education
Given the challenges faced by the leaders of today’s companies, we believe the sole purpose of executive education should be to enable executives to learn, and to learn how to learn, and in doing so to improve themselves and their organisations.
For individual leaders, the ability to learn has already been identified as the strongest factor in determining their potential to succeed. And for organisations, the ability to adapt and change has become critical to their survival.
The Traditional Approach
The most widely used method of measuring the effectiveness of Executive Education Programmes is the classic “Happy Sheet”, or delegate feedback form. These are designed to gather feedback on the quality of content, speakers, learning experiences, the venue and administrative support. Using a combination of closed (rating scale) and open questions, delegates are asked to evaluate and report on their experience.
The fundamental design flaw, and perhaps unintended consequence, of this sort of measurement mechanism is that it invariably forces executives to make one-dimensional judgements (good/bad, yes/no, like/don’t like) rather than more valuable self-reflections on their own learning experience and feelings.
If the purpose of executive education is to create people with the ability to learn how to learn, then reinforcing such a judgemental approach is damaging and counterproductive to that aim.
The evaluation of a delegate’s learning experience is crucial, but in doing so we should be asking questions with a known correlation to improved thinking and behaviour in job performance, such as some of those identified by ABDI’s research:
As useful as these questions are, however, we believe they only go part of the way towards addressing the real purpose of executive education.
A Different Perspective
Given the emphasis on learning – as well as the emphasis on improving performance – knowing whether Executive Education Programmes work becomes a function of both means and ends. By means, we refer to the underpinning pedagogical and design principles and processes of any intervention. The way a programme is designed will dictate whether ‘learning to learn’ and ‘improving yourself and your organisation’ actually happen.
The design principles we believe in at Accelerance ensure we address the purpose of executive education as we see it. In doing so, they also provide an alternative set of measurement criteria that can be used for evaluating impact. These principles include:
Measure: Did the programme include live testing and experimentation? How many experiments, new ideas or new ways of working were tested as a result of the programme?
The list of design principles referenced is by no means exhaustive and is only a sample of those used by Accelerance to construct and deliver its work. When executive education is designed with principles like these, traditional “happy sheet” questions become less helpful when seeking to find out if a programme has achieved its real purpose. Given the importance of Executive Development to the future of every business, it’s important we move away from simplistic and often misleading measurement methods and adopt a more holistic view of what can be measured based on more enlightened design principles and practices.