Reaction evaluations are helpful for revealing what the learners thought about the workshop, but how can the design team find out if the learners have truly learned anything? The learning evaluation is mapped directly to the objectives and teaching points developed previously in the design process. This is the kind of evaluation we are most familiar with: a knowledge or performance test, during which the learners demonstrate what they have learned. Let’s look at each of these types separately.
Tip
Sometimes the reaction evaluation is also called a “smile sheet” because it was common to use smiley faces in the measurement scales.

Tip
There are many more examples of questions available at http://www.businessballs.com/trainingevaluationtools.pdf. A reaction evaluation is also included in the case study at the end of this chapter.
Performance Test
A performance test challenges the learners to apply the skills they have learned to an actual situation. The most effective performance test pulls the entire workshop together and asks students to engage in an activity that reflects their real-world situation (some readers may recognize this as a criterion test). This would challenge the learners to integrate their new skills and knowledge and apply them in their actual situation. For example, the last part of a workshop on conducting company research in preparation for a job interview might be devoted to a final assignment during which each learner is given a company’s name and time to research that company (individually, in pairs, or in teams) and to complete a short worksheet that reflects the key areas covered in the workshop. The library instructor would circulate, assess the learning, and coach the learners as needed.
This would be ideal. But often there is simply not enough time either during the workshop or later for the instructor to give such an extensive performance test. Instead of asking learners to complete the entire research process as taught, the learners may be asked to complete just a piece of it. You’ve already designed the basic skeleton for this approach during the last step of the design process: creating learning objectives. The learning objective merely needs to be reworked into a performance question that can stand on its own when used with the learners. For example:
If the objective reads: “Given a research topic and access to the library’s home page, find three relevant articles that the library owns,” the performance question may read: “Using the library’s home page, find three articles on the topic of affirmative action in higher education and identify which library these articles are in.”

If the objective reads: “Using an article citation, search the catalog to find if the library owns a certain journal,” the performance question may read: “Search the catalog for the citations listed below and identify which library you would find the item in, or indicate if the library does not own the item.”
Once the performance questions have been identified, creating the worksheet or testing sheet can be done very quickly. In many circumstances, these worksheets may be integrated into step 13 and a traditional end-of-class skills test may be eliminated. (See the next chapter for discussion of this strategy.)
Knowledge Test
For many instructors, administering and analyzing a performance test as outlined above is simply an unrealistic time investment. Given this, a knowledge test may be the better option. Knowledge testing is the kind of assessment that librarians and students are most familiar with. Although rather difficult to craft, the questions, once finished, can be used and reused in a variety of workshops. Quantifiable test questions are very easy to grade and analyze, and qualitative (open-ended) questions need not create a great deal of work.
Each question on the test should be mapped directly to a particular module of the workshop. Teaching points and objectives can provide the basis of each question, as shown in figure 9-2.
The results can be extremely helpful in assessing the effectiveness of both the instructor and the lesson plan. If many learners get a particular answer incorrect, the designer knows there is a problem with the workshop. The designer and the instructors would then need to diagnose the root cause of the incorrect answer.
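Where workshop responses are collected electronically, part of this diagnosis can be automated. Below is a minimal sketch in Python; the answer key, the learner responses, and the 50 percent flagging threshold are all invented for illustration and are not drawn from this chapter. It grades each multiple-choice response against the key and flags any question that at least half of the learners missed.

# Minimal sketch (hypothetical data): grade multiple-choice responses
# against an answer key and flag questions that many learners miss.

answer_key = {1: "b", 2: "b"}  # question number -> correct choice (invented)

# One dictionary of responses per learner, keyed by question number (invented data).
responses = [
    {1: "b", 2: "b"},
    {1: "a", 2: "b"},
    {1: "a", 2: "a"},
    {1: "b", 2: "b"},
]

for question, correct in answer_key.items():
    wrong = sum(1 for r in responses if r.get(question) != correct)
    miss_rate = wrong / len(responses)
    flag = "  <-- diagnose this module" if miss_rate >= 0.5 else ""
    print(f"Question {question}: {miss_rate:.0%} answered incorrectly{flag}")

The threshold itself is a judgment call; the point is simply to surface which questions, and therefore which modules of the workshop, deserve a closer look.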
FIGURE 9-2 Sample knowledge test
Objective: Using an article citation, search the catalog to find if the library owns a journal.
Teaching Point: Need to search the catalog for the name of the magazine, journal, or newspaper rather than the article title. Why? Because articles are not included in the catalog.
Test Question: In the library catalog, I need to search by the __________ to find out if the library has the article I want.
a. title of the article
b. name of the journal/magazine/newspaper
c. author of the article
d. any of these

Objective: Given a list of citations, identify three factors that distinguish a popular article from a scholarly article.
Teaching Point: Examine the intended audience of a journal or magazine to help distinguish between scholarly and popular sources. Popular sources are written for generalists; scholarly sources are written for experts and academics in a particular field.
Test Question: __________ articles are normally written for experts in a particular field.
a. Popular
b. Scholarly
c. Citations of
d. Journal or magazine
Incorrect Answer Diagnosis
What happens when the learners consistently get a question wrong? Look for the following three problems.
Test Problem
Is the test question the problem? Is the wording of the question or the possible answers confusing? Are any of the possible answers misleading? Could more than one of the possible answers be correct? Writing effective test questions is a challenge, even for full-time teachers who must do so all the time. Library instructional designers of knowledge tests may need to test the questions ahead of time with representative target learners or consult with educational specialists.
Delivery Problem
Is the instructor digressing from the lesson plan in a way that is not meeting the particular learning objective? Can the instructor pinpoint the problem and eliminate that behavior in future workshops? If so, did that address the problem? On the flip side, there may be an instructor who consistently gets stellar results on her tests; if her deviations from the lesson plan are what produce those results, those “deviations” could be standardized for future instructors.
Instructional Design Problem
Is the problem due to some aspect of the lesson plan that the question corresponds to? Is there adequate skills practice in the module in question? Are learners getting the feedback they need to adjust their skills and knowledge during this module? Are the teaching points not getting through to the learners? See steps 15 and 16 for further discussion of various aspects of the lesson plan that might be useful in additional diagnosis.