Editing and Evaluating Training Programs

About DSA

Darryl L. Sink and Associates, Inc. (DSA) helps organizations design and develop learning and performance solutions that get results. DSA works cooperatively with organizations to:

  • accomplish internal custom projects
  • train and educate their internal staff in Instructional Systems Development.

Check out DSA presenters and consultants at dsink.com.

DSA Tips Newsletter Archive

If you haven’t visited the tips archive lately, check it out – http://dsink.com/dsa-tips-newsletters

Darryl’s tips are now conveniently organized not only by publication date, but also by these topics:

  • Project Management
  • Front End Analysis
  • Design Strategies
  • Instructional Strategies/Techniques
  • Measurement/Evaluation
  • Implementation
  • Professional Development
  • Coaching with DSA Tips

We have lots of great ideas just waiting for you to use!

Tap into DSA’s expertise and experience!

Call me at 831-649-8384 or email me at jane@dsink.com.

Bring our expert presenters on-site with a workshop from DSA. Click here for details. Call or email Jane Sink for help deciding which workshops are right for your group.

Are you almost through the design and development phase of an instructional design project? Are you beginning to think about evaluating the training materials? Here are seven tips we teach in our workshops and have found very useful when we reach this stage with a client on a custom project.

  1. Be sure to gather both goal-based and goal-free information. Goal-based evaluation gathers information about how well the instructional program meets the business need and the learning objectives. Goal-free evaluation gathers information about the use, acceptance, clarity, and efficiency of the training program.
  2. When editing instructional materials, consider editing for only one category at a time. For example, when editing for content, only edit for content. When editing for grammar, only edit for grammar.
  3. When working with other stakeholders in the project who are editing your program for content accuracy or appropriateness to the audience, consider using a checklist to keep them focused on the type of edit they have been requested to conduct.
  4. When trying out material on learners, consider putting one module together and testing it with one or two individuals from the student population before assembling the whole course. This usually provides valuable information that you can feed forward into the development of the remaining modules.
  5. Do not confuse a walk-through review of a program with managers, subject matter experts, and other stakeholders with a try-out on the target audience.
  6. Try out materials on three or four individuals from the target population before running the program for a whole class. Research has shown no significant difference in the quality of revision information gathered from a small sample versus a full class.
  7. When running a trial of your materials, be sure to use a criterion-referenced test as the basis of your goal-based evaluation. For goal-free evaluation of the program, have learners fill out a written questionnaire before being debriefed verbally. This usually results in more objective feedback.

One of the keys to the systematic development of a training intervention — or, for that matter, other non-training interventions — is to try out the intervention prior to implementation. In practical terms, this means trying out your training program before running the first class. Remember: once you have run the first formal class, those same people become either advocates for your program or adversaries.

See you next time,