- Part I - Kirkpatrick and friends and enemies : The current pragmatic 'state of the art' is to apply half of the Kirkpatrick model. Trends and insights within the 50 years that followed the initial model suggest we might rethink evaluation of learning.
- Part II - Divergence: so much to potentially measure : In this post, we'll take a hike along various approaches and models for tracking or proving impact of learning and have a short thought on each of them.
- Part IIIa - Convergence: a suggestion for the pragmatic : I'll sketch out two models based on the findings in Part II and the desire to do better than the current state of the art. One for the pragmatic, which tweaks the dominant Kirkpatrick model,
- Part IIIb - Convergence : a suggestion for the revolutionary : and one for the revolutionary, which throws it away and starts all over: working backwards, holistic and adaptive.
Let us in this final part also start with a quote:
"And that is my problem with... assessment -- it serves people like you, people like me, rather than our learners. Yeah, I said it." (Source: InnovativeSarah)
3.6. The road from El Dorado
Let's illustrate the guiding principles of working backwards, holistic and adaptive with a nice metaphor. I know that metaphors rarely work 100% to illustrate a point, but a mental image like this saves me writing a few thousand words. The metaphor is learning's quest for gold: the mythical city of El Dorado.
3.6.1. Backwards: start at the end
Modern instructional design methods and evaluation schemes (including the updated Kirkpatrick model) advise starting from the end and working back from the desired outcome. That goes for value as well. We often use the term 'value-add' to describe how learning and all other business functions 'add' value to the bottom line. Let's flip that notion. It is not about value-add, it is about value-get. Learning doesn't add any value, because the value is where it has always been: at the end. It is in the behaviour and performance. Learning, if done properly, will GET that value our way. Learning is about 'value enablement'; it is one of the business functions that enables the corporation to capture more value, faster than others.
In our metaphor, it means we are not looking for the road TO El Dorado. In a corporate context, we know where the gold is: in the money and reputation we get by performing. We do not need to find El Dorado, and we do not need to find the road to it. We need to make roads FROM El Dorado to get the gold to us. We work backwards from the gold to where we are now. The harsh statement is that learning doesn't come first. It comes last, if it comes at all.
3.6.2. Holistic: the whole gets the gold
Training isn't the only function that enables a corporation to capture value. The same goes for just about every function: marketing, R&D, etc. And within learning land, it doesn't really matter whether we are talking about single training event x, knowledge article y, or watercooler talk z. It is the whole environment and the whole team that get the gold in the end. Instead of focusing on the individual learning activities or business silos and deriving the sum from the individual components, let us flip that notion: start with the whole in mind, and go into individual correlations and the relative importance of components afterwards. Training is not the only enabler, just as it isn't the sole answer to any business issue. Let us integrate experience, sharing, etc. into the picture. We can make nice schemes and put a lot of effort into measuring each potentially contributing item, but it is easier to flip it and ask: "What if that particular item wasn't there?". The key question on value then is not 'what is the value of class x', but 'what would it mean for you if class x did not exist, all other things being equal?'
Back to our metaphor, 'The road from El Dorado': it all matters. The big roads (curriculums) and small roads (classes), the pointers and road signs (tweets), the maps (your connections), traffic agents (coaches) and previous hiking (experience), etc. How do you single out the value of a road? Flip it: if that road did not exist, how much longer would we need to travel to El Dorado, how much less gold could we transport over it, and would we still get there at all?
3.6.3. Adaptive: continuously adjust to context
I'm not one to spend a day hunting for an exact definition, but I did struggle to illustrate the ongoing, iterative nature of development and its impact. In musical terms I believe it's "da capo al fine". The terms 'cyclic' and 'iterative' suggest there is some kind of predetermined pattern that you loop through time after time. That's not quite right, as the agile times we live in are too unpredictable to allow fixed patterns such as loops. So I settled for the term 'adaptive'. Old-school models, dating from times when predictions carried low risk, make a fixed causal chain all the way to value. What we need in the network age is to continuously adjust our notion of what is valuable today, and of how to get there. Skills have an expiration date, as do business models, and adjustments are part of the daily work.
In our story of El Dorado, it means we should look at El Dorado as a big area for gold digging rather than a single street or building of solid gold. We should not only adapt where exactly we're getting our gold today, but may also need last-minute improvisations to bypass roadblocks that popped up, or to find new shortcuts.
3.7.1. Anything aims at value drivers
Have you noticed how the simplest concepts often get the largest adoption? Especially recent popular technical innovations have a deceptively simple core concept and then throw technology in to achieve complex results. Take Twitter (a short message), RSS feeds (just subscribe to a feed) or tags (a one-word category) and think about all the tools built around those concepts and what they made possible. I'd love to have a similarly simple scheme for learning (impact) that we can open up for technology to do the complex part. Here is the simplified sketch 'learning gets gold', where, simply put, anything that builds competence 'subscribes' via an 'aim link' to the key value drivers as set in KPIs.
- The learning : Following our holistic principle, anything qualifies here. In my book, I split competence building activities into three families: Learn, Do, Share. And by anything I actually also mean anyone. And anything (in)formal, (a)social, (un)documented, etc.
- Gets : This is the relationship between any competence building activity and the value drivers. Previously I selected Key Performance Indicators (KPIs) as excellent value drivers, as they are directly linked to value capturing and yet concrete enough to steer the behaviour and performance of teams and individuals. Instead of making long, complex and ever-changing causal chains, let us simply state that something 'aims' at enabling certain value drivers. It is a bidirectional relationship:
- The initial direction is for an activity to 'aim' to contribute to certain output, KPIs or increased self-efficacy. This can be a predefined aim, or one added by social tagging or mining.
- The reverse direction is a verification of the relative importance of the activity, based around the question "what would be the difference if this activity did not exist?". This can be done by claims (surveys) and facts (statistical correlation).
- The gold : We covered many models for corporate learning impact before, but one of the common findings is that the end goal is action. As value drivers, I suggest listing the Key Performance Indicators that allow your corporation to capture money and reputation. Some are long term and linked to our 'role' or 'function'; others are short term and related to the tasks at hand.
- A mission is close to what we now consider a job 'role'. It is what is expected from us in the mid to long term, and is usually set by our position or job role in the company. For example, my mission might be to sell servers.
- A task is a short term goal that is very concrete and usually set by ourselves. For example, one of my upcoming tasks might be to prepare a business case for my client to replace their servers.
For the 'glue' to bind the whole model together I propose 'competence tags', closely modelled on the successful adoption of tags. As a competence is in essence something you can do, the simplest form of a competence tag is an action verb (e.g. "selling" or "leading"). In a more elaborate form, a competence tag adds contextual information to the action verb (e.g. "selling xseries servers to financial sector clients" or "leading an international team through a business transformation").
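To make that concrete, here is a minimal sketch (in Python, with hypothetical names, not a definitive implementation) of how a competence tag could be represented: an action verb, optionally enriched with context.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CompetenceTag:
    """A competence tag: an action verb, optionally with contextual info."""
    verb: str            # simple form, e.g. "selling" or "leading"
    context: tuple = ()  # elaborate form adds context phrases

    def __str__(self):
        return " ".join((self.verb,) + self.context)

# Simple form: just the action verb
simple = CompetenceTag("selling")

# Elaborate form: the verb plus contextual information
elaborate = CompetenceTag("selling", ("xseries servers", "to financial sector clients"))
print(elaborate)
```

The frozen dataclass makes tags hashable, so they can be used as keys when matching activities to missions and tasks later on.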
So to start with, we can attach competence tags to all items in the corporate LMS and the social media intranet, and do the same with job profiles. The competence tags and aims can be expressed as metadata in HTML or microformats. Technology can then match the need with the competence building activity. Next, bring in the social element by allowing people to broadcast their upcoming tasks and have their peers suggest any enabler towards that goal. If you cannot imagine a technical solution for the above, have a look at the sites Rypple.com (social goals) and http://edison.thinktrylearn.com/ (social experiments) and imagine adding the learning/enabling element in there.
Rypple.com: social goals
Edison.thinktrylearn.com : run your own experiment and ask yourself "what will you do", "how will you test your idea and measure success", "how will you know you are done" and "how will you enjoy the journey".
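The 'aim link' itself can be sketched in a few lines. In this hedged illustration (all activity and KPI names are hypothetical), each competence building activity 'subscribes' to value drivers via its competence tags, and a reverse lookup finds the enablers behind a given KPI:

```python
# A minimal sketch of the 'learning gets gold' aim link: activities
# 'subscribe' to value drivers (KPIs) via competence tags.
# All names below are invented for illustration only.

activities = {
    "class: server sales essentials": {"selling servers"},
    "article: sizing xseries servers": {"selling servers", "configuring servers"},
    "watercooler: transformation war stories": {"leading transformations"},
}

kpis = {
    "server revenue": {"selling servers"},
    "transformation success rate": {"leading transformations"},
}

def enablers_for(kpi_name):
    """Reverse lookup: which activities aim at the tags behind this KPI?"""
    wanted = kpis[kpi_name]
    return sorted(name for name, tags in activities.items() if tags & wanted)

print(enablers_for("server revenue"))
```

Note that the activity list deliberately mixes formal (a class), documented (an article) and informal (watercooler talk) items, in line with the holistic principle that anything qualifies as learning.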
3.7.3. Shaping the ecosystem
How do you include this concept in the ecosystem of a corporation? I see four sets of activities to ensure learning gets gold: mining, enabling, matchmaking and verifying.
- Mining : Where is the gold exactly? In the mining activity, a corporation needs to dig up the value drivers in terms of performance (KPIs) and behavior, expressed as competence tags. One example I blogged about before is how Google mined for valuable leadership behavior. If you're not Google, you might rely on talks with your clients to see what they are willing to pay for, market research, industry trends, lessons learned from previous experience, etc. As everything in the ecosystem gets directed towards these KPIs, setting the wrong ones may have a dramatic impact.
- Enabling : The ecosystem needs a whole collection of competence building activities (learn, do, share) that aim for the gold. If activities are expensive to create, purchase or implement, make a predictive simulation of the contribution/correlation first, based on historical data. For small investments, just trial-and-error your way forward.
- Matchmaking : Matching a competence building activity to the need for it, in the context of a mission or task, relies on the aims. Establish the 'aim' metadata to express the intent of an activity to contribute to a certain competence tag. Most matchmaking can be done on the fly via analytics and search technology that matches the competence verbs tagged in the activities with those of the mission (job role) or tasks. Other matches are suggested manually by the competence guardians, or preferably by peers.
- Verifying : Verify the 'aims' by either 'claims' or 'facts'. Claims come from surveys where we ask people what X enabled them to do and how that would be different had they not used X; this is anecdotal evidence. Facts come from analytics software that essentially runs a statistical correlation to determine the contribution of X. (The simplest statistic is to compare the outcomes of 'those who did' versus 'those who did not', keeping everything else equal.)
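The 'facts' side of verifying can be sketched with invented numbers. The example below (hypothetical KPI data, a deliberately naive method) compares the average outcome of those who did activity X with those who did not:

```python
from statistics import mean

# Hypothetical quarterly KPI outcomes (e.g. sales figures) for two groups.
did = [120, 135, 128, 140]      # people who took class X
did_not = [110, 115, 108, 121]  # comparable people who did not

# The simplest 'fact': the difference in average outcome between the groups.
lift = mean(did) - mean(did_not)
print(f"Average lift associated with X: {lift:.2f}")
```

In the spirit of 'all other things being equal': without a controlled comparison this difference is only a correlation, which is why the model pairs such facts with claims gathered from surveys.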
Compared to the boundaries of today's 'training department', the tasks of mining and verifying in particular are new. Usually training departments get their goals from the business, which is also supposed to do the checks later on. By bringing all of that together, we get an integrated, holistic approach to competence building and its impact.
While we're on our gold metaphor, let me ask you: do you have any idea how companies that deal in 'black gold', aka oil, are organised? They have an 'upstream' and a 'downstream' division. Upstream is all about oil exploration and locating new oil fields. Downstream is the operational division that exploits the oil fields and refines and distributes the oil. As a career tip: you'll want to work for the creative upstream branch rather than the continuously cost-squeezed downstream operations. My point is that network-age companies will also need an 'upstream' team to continuously look for the gold, and the statistical skills to single out correlations to that gold.
To sum up, this has been my 4-part reflection on the impact of corporate learning:
Corporations demand that each and every business function provide evidence of its contribution to the bottom line, and learning is no exception to that rule. Claims like 'the ROI of learning is survival' and 'but surely our people are our most important assets' are weak forms of evidence, if not excuses for the fact that we haven't been very good at providing it.
The current dominant practice in corporate learning measurement is to apply the operational half of the 'Kirkpatrick 4 levels of evaluation model', because that is the part that is easy and cheap to measure and within the ownership of the training department. Unfortunately, it is the half that only really matters for the training department's own operations, not for the business. The model also lacks updates for our current understanding of learning (partly informal, connected, experiential, etc.).
Over the years, a lot of frameworks and methods have become available that suggest ways to gather evidence of corporate learning's impact, and there is a lot to potentially measure. We have discussed the good and bad of a few approaches and 'picked cherries'. Most models are embedded in a particular 'world view' on learning, and my world view is that of an ecosystem. But regardless of the world view, the predetermined nature of a corporation sets the measurement squarely at value unlocked by performance.
In an extreme reduction of potential measurements, only two stand out: self-efficacy on the operational level and Key Performance Indicators (KPI) on the business level.
A pragmatic approach to prove learning's impact is to tweak the dominant Kirkpatrick model. My 'BertPatrick' suggestion turns it upside down and tweaks the levels to make them more aligned with the richness of the current learning landscape.
A more disruptive (revolutionary) approach is based on the principles of working backwards, holistic and adaptive, as illustrated with the road from El Dorado metaphor. In the 'Learning gets gold' model, competence building activities 'subscribe' via an 'aim link' to the key value drivers as set in KPIs. Competence tags glue the four task areas 'mining', 'enabling', 'matchmaking' and 'verifying' together.
- - - - - - - - - -
I'm done with this reflection now.
So learning folks, are we golden?
Hey Bert, I think the ideas are cool. I really like the part about corporate learning development and (I'm writing this comment to the Mika video) how learning is about the end result not about the getting of it. Anyway, I think we're starting to talk about learning environments that are complete, enterprise-wide functions of the business. In other words, learning is totally accessible as a driver to how we manage our working day; it doesn't sit trapped in a piece of the organisation called training.
But honestly, Mika????
Hi Simon, indeed let's see learning as one of the many enabling functions in the enterprise, and not an independent one either.
Of all the things you can comment on :-) - if not Mika, who do you suggest as your favorite 'gold related' song?