Today a colleague received a request to make sure the course he was working on featured Kirkpatrick analysis.
'Well, we can do a "happy sheet" and intra-/post-course testing,' was the group reply, in an attempt to crudely match the good doctor's schema.
But no! The client wanted a survey at the end of the course that addressed all four levels of Kirkpatrick in one go:
- what do you think of this training course?
- have you learnt what you need to know?
- will you change your behaviour as a result of this course?
- do you think your performance will improve?
Sadly, perhaps, what it really made me think about was just how out of the loop I am when it comes to the whole training cycle. For our clients we are simply a means to an end - nothing more than the design phase of the training - so I never get to learn how the training went down; I never see any learner feedback or statistics.
I used to enjoy the thrill of sorting through the post-course sheets (back when I was a classroom trainer), looking for comments (instead of a straight row of satisfactory ticks) and trying to make changes to do better next time. Or the nervous feeling of awaiting the six-month peer review. Still, being an elearning guy now does have its benefits - no more bloody business hotels...