Testing and Evaluation: The Top Ten List

About DSA

Darryl L. Sink and Associates, Inc. (DSA) helps organizations design and develop learning and performance solutions that get results. DSA works cooperatively with organizations to:

  • accomplish internal custom projects
  • train and educate their internal staff in Instructional Systems Development

Check out DSA presenters and consultants at dsink.com.

DSA Tips Newsletter Archive

If you haven’t visited the tips archives lately, check it out – http://dsink.com/dsa-tips-newsletters

Darryl’s tips are now conveniently organized not only by published date, but by these topics:

  • Project Management
  • Front End Analysis
  • Design Strategies
  • Instructional Strategies/Techniques
  • Measurement/Evaluation
  • Implementation
  • Professional Development
  • Coaching with DSA Tips

We have lots of great ideas just waiting for you to use!

Tap into DSA’s expertise and experience!

Call me at 831-649-8384 or email me at jane@dsink.com.

Bring our expert presenters on-site with a workshop from DSA. Click here for details. Call or email Jane Sink to help you decide which workshops are right for your group.

DSA would like to thank Dr. Barbara Martin for her insights on testing and evaluation (with a wink and a nod to David Letterman). P.S. Dr. Barbara Martin is an active DSA associate and teaches The Instructional Developer Workshop, The Course Developer Workshop, and The Criterion Referenced Testing Workshop. She has written many articles and an award-winning book on designing instruction for affective behaviors.

Many organizations and companies are jumping on the “let’s test after training” bandwagon. And they have good reasons for making the jump. Companies want to know if their training dollars are reaping the intended benefits. It is certainly important to ask if participants have acquired the knowledge, skills, and attitudes needed to help companies meet their business goals. However, testing is a huge undertaking. Writing tests is hard work and expensive. Likewise, checking to see whether transfer of training has occurred is time-consuming and difficult. It’s easy to throw up one’s hands and jump off the bandwagon before even getting on. With that in mind, here is a testing and evaluation Top Ten List to get you thinking about how to maximize your jump onto the testing bandwagon.

  1. If testing were simple, everyone would do it!
    Pat yourself on the back for deciding to start a testing program. Begin by deciding what kind of information will give you the most benefits from testing. Then start small and learn from your experiences.
  2. To address objective-based testing, write a test blueprint or specification sheet.
    A testing blueprint is an overview of the number and types of questions that will be written for each objective. Preparing a blueprint gives you a big picture perspective of how many items you will need to prepare. You may decide that there is not enough time in this life (and the next) to write all the items you would like to. The blueprint lets you make decisions before you ever write the first item.
  3. Checklists are valuable tools for assessing products (e.g., tangible objects) and performances.
    Sometimes the best “test” is a checklist. A good checklist breaks a larger skill into its component tasks and sub-skills, which are often the enabling skills for that larger skill. If you have a good checklist, you may not need to create additional tests.
  4. You don’t have to test everything you teach!
    Use a risk rating diagram to help you determine which objectives are most important to test. Test those objectives that have the “greatest chance of error” and the “greatest consequence of error” for on-the-job performance.
  5. If you only have time to test one objective, test the key or terminal objective.
    Developers often forget to test the most important objective: the key or terminal objective. This objective is the one that most closely mirrors on-the-job performance and should always be tested, even if you cannot test the enabling objectives.
  6. Match the test to on-the-job performance.
    Test items should reflect as closely as possible what participants are expected to do on the job. Test the application of concepts and principles and problem solving before you test recall and recognition of facts.
  7. A well-written objective is the be-all and end-all of writing good tests.
    Since objectives specify what participants will be learning and how they will be performing, there should be a 1-to-1 match between the learning objectives and the test items or checklists. Be sure each objective includes an action, the conditions under which testing will take place, and a standard of performance.
  8. If attitude change is important, give the participants a questionnaire or survey to see if your training is having an impact.
    We want participants to value the content we are teaching, because if they do, they are more likely to use what they have learned back on the job. However, we need to know if we have influenced their attitudes. Administer a questionnaire or survey to check.
  9. Good tests and checklists must be valid and reliable.
    It’s fairly easy to write test items and checklists, but much harder to write valid and reliable ones. Use SMEs to check instruments for content accuracy and then conduct a pilot test to make sure that tests and checklists are valid and reliable.
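If it helps to see the arithmetic, the test blueprint from tip 2 can be roughed out in a few lines. This is only a sketch; the objectives, question types, and item counts below are invented for illustration.

```python
# A hypothetical test blueprint: each objective maps to the number of
# items planned per question type. All names and counts are invented.
blueprint = {
    "Objective 1: Identify safety hazards":     {"multiple_choice": 4, "checklist_steps": 0},
    "Objective 2: Complete an incident report": {"multiple_choice": 2, "checklist_steps": 6},
    "Objective 3: Demonstrate the procedure":   {"multiple_choice": 0, "checklist_steps": 8},
}

# The big-picture number the blueprint gives you before writing a single item.
total_items = sum(sum(counts.values()) for counts in blueprint.values())
print(f"Items to write: {total_items}")  # Items to write: 20
```

Seeing the total up front is the point of the blueprint: it lets you trim the plan before drafting items, not after.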
And the Number 1 Tip...
  1. The only thing that really matters is whether participants can transfer what they have learned in training to their jobs.
    Good tests administered immediately after training ask learners to apply what they have learned; that is, they evaluate whether learners can transfer skills and knowledge back to the job. Once participants are back on the job, data from follow-up surveys and interviews will help companies determine whether participants are performing their jobs correctly.
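The risk rating idea from tip 4 can also be sketched as a quick calculation. The objectives and 1-to-5 ratings below are invented for illustration; the point is simply that ranking by chance of error times consequence of error shows you which objectives to test first.

```python
# Hypothetical objectives rated 1-5 for chance of error and consequence
# of error on the job. All names and ratings are invented.
objectives = [
    ("Calibrate the sensor",  4, 5),  # (objective, chance, consequence)
    ("Log in to the console", 1, 2),
    ("Interpret alarm codes", 3, 4),
]

# Rank by risk = chance * consequence; test the highest-risk objectives first.
ranked = sorted(objectives, key=lambda o: o[1] * o[2], reverse=True)
for name, chance, consequence in ranked:
    print(f"{name}: risk = {chance * consequence}")
```

Even a back-of-the-envelope ranking like this makes it easier to defend which objectives you chose not to test.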

See you next time,