Evaluating Learner Performance

About DSA

Darryl L. Sink and Associates, Inc. (DSA) helps organizations design and develop learning and performance solutions that get results. DSA works cooperatively with organizations to:

  • accomplish internal custom projects
  • train and educate their internal staff in Instructional Systems Development

Check out DSA presenters and consultants at dsink.com.

DSA Tips Newsletter Archive

If you haven’t visited the tips archive lately, check it out: http://dsink.com/dsa-tips-newsletters

Darryl’s tips are now conveniently organized not only by publication date, but also by topic:

  • Project Management
  • Front End Analysis
  • Design Strategies
  • Instructional Strategies/Techniques
  • Measurement/Evaluation
  • Implementation
  • Professional Development
  • Coaching with DSA Tips

We have lots of great ideas just waiting for you to use!

Tap into DSA’s expertise and experience!

Call me at 831-649-8384 or email me at jane@dsink.com.

Bring our expert presenters on-site with a workshop from DSA. Call or email Jane Sink to help you decide which workshops are right for your group.

DSA would like to thank Dr. Barbara Martin for contributing this month’s tip on Evaluating Learner Performance. Dr. Martin teaches DSA’s Criterion Referenced Testing Workshop.

Did you know that a savvy test-taker can pass most multiple choice tests written by novice test-writers by either selecting response choice “c” or choosing the longest response? It’s true. Novice test-writers overuse response “c”, and they also write longer correct answers because they pack more detail into the correct response while leaving incorrect responses shorter and less detailed. Check a few of your own tests to see whether you have overused response choice “c” or written longer correct answers.

Multiple choice tests are the favorite type of test for many training programs because they allow course developers to test large amounts of content in a relatively short timeframe, and they can be machine scored. However, good multiple choice items are difficult to write, and novice test-writers often cue learners to the correct answer without intending to do so.

Most of these cues are “format” flaws, such as overusing one response position or writing longer, more detailed correct answers. The format guidelines are relatively easy to apply once you understand them. There is also a “content” guideline, and it is much more difficult to follow: make all the response choices plausible. It is important to remember that anytime a test-writer uses a response that is not plausible, e.g., the Easter Bunny or something that would never be chosen as a correct response, the test-taker is being cued that this response is incorrect. You are giving the learner a freebie. Just think: if one of the four responses is not plausible, the test-taker only has to choose among three possible answers. This raises the test-taker’s chance of guessing the correct answer from 25% to 33%! This is good if you are the test-taker; bad if you are trying to get an accurate account of what the learner knows.
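The guessing arithmetic generalizes: with n equally plausible choices per item, a pure guesser is right with probability 1/n, so every implausible distractor a test-taker can rule out improves the odds of passing the whole test. Here is a small sketch in Python that makes the point concrete (the function name, 20-item test, and 70% passing cutoff are illustrative assumptions, not from this newsletter):

```python
from math import ceil, comb

def pass_probability(n_items: int, n_choices: int, pass_fraction: float = 0.7) -> float:
    """Chance of passing an n_items multiple choice test by pure guessing,
    when each item has n_choices plausible options (binomial tail sum)."""
    p = 1.0 / n_choices                      # per-item chance of a correct guess
    need = ceil(pass_fraction * n_items)     # items that must be correct to pass
    return sum(
        comb(n_items, k) * p**k * (1 - p) ** (n_items - k)
        for k in range(need, n_items + 1)
    )

# One item, four plausible choices: a guesser is right 25% of the time.
# Rule out one implausible distractor and the odds jump to 1 in 3.
print(pass_probability(1, 4, 1.0))   # 0.25
print(pass_probability(1, 3, 1.0))   # 0.333...

# On a 20-item test with a 70% cutoff, eliminating an implausible
# distractor on every item raises the guesser's chance of passing.
print(pass_probability(20, 3) > pass_probability(20, 4))
```

Either way the passing chance by pure guessing stays small on a full-length test, but the comparison shows why implausible distractors only help the guesser and hurt the measurement.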

Multiple choice items and tests are a great way to evaluate learner knowledge, provided you are not giving away the answers with poorly written items. If you want to evaluate learners’ performance rather than their knowledge, a better strategy is to use a checklist. We will address checklists in a future Tips Newsletter.

Dr. Barbara Martin is an active DSA associate and teaches The Instructional Developer Workshop, The Course Developer Workshop, and The Criterion Referenced Testing Workshop. She has written many articles and an award-winning book on the Cognitive Learning Domain.

See you next time,