I’ve had a number of people ask me recently what questions we ask in our pilot surveys.
The largest survey we did was for our Unified Communications rollout.
I have started to use some of these questions during other implementations to get a feel for how the training programs impact the adoption of a new tool. We send these surveys out one month after implementation.
By then we get a better idea of whether the solution, and the training that supports it, worked. People have had time to work with the new solution and processes, and patterns are beginning to surface as to what is being used and what isn’t. Plus, I find I get more honest answers from people one month out.
The general questions we ask:
- How helpful did you find the training materials?
  - We then list each individual training object (each type of classroom training, each tutorial, each quick reference)
  - Response options: Helpful / Not Helpful / Did Not View or Attend
- What other resources did you find helpful as you learned how to x?
  - This is a free-text field
- I find x intuitive and easy to use – Yes / Somewhat / Not at all
  - x = the IT application
- As a result of the training, I felt I could use x – Yes / Somewhat / Not at all
  - I have generally found that for Somewhat and Not at all, people will add comments without prompting
- As a result of the training, I could use x to perform y tasks – Yes / Somewhat / Not at all
  - This question can be separated into the different types of tasks if the solution is modular
  - Again, people will generally add comments without prompting
- As a result of the training, I felt I understood (any resulting new process / concept) – Yes / Somewhat / Not at all
  - We add this question when there is an affiliated major process change
  - The question also helps us see whether we did a good job putting the material in context
- The training applied to how I need to use x to do my job – Yes / Somewhat / Not at all
  - Another question asking whether we got the context right
- What features do you find you use most often? (Check all that apply)
  - For new IT applications, this gives us a feel for what people are actually using
  - It also tells us where we need to do more training, or where we should ask why people are not using a particular feature or area we expected them to use
  - Depending on the sensitivity of the audience, this question is best asked with clear identification rather than anonymously. If we are working with multiple audiences, that gives us a better feel for whether we got the audience needs assessment right.
- Rate your understanding of the following subjects as a result of this implementation
  - Here we check how well we covered the individual areas
  - We used a Likert scale ranging from "I understand" to "No understanding," plus a "Does Not Apply to Me" option
  - This shows us what may require more training and whether we captured the appropriate audience
- What topics do you wish were better covered? – Free text
- How can we improve the training and support for implementations like this one? – Free text
- What improvements do you think we can make to (the resulting new process)? – Free text
  - We include one question for each major process affected by the implementation
- Any other comments, or would you like someone to follow up with you? – Free text
These surveys have really helped us see how the training solutions we design help (or hurt) implementations that (hopefully positively) impact the business.
If you see other questions you think we should be asking – please add them to the comments.