
REVIEW OF EXISTING INSTRUMENTS

In the document Information Literacy Assessment (Pages 167-170)

To what extent are [UMBC] students familiar with the concept of copyright?

Do [UMBC] students who self-report a high level of confidence with information literacy skills perform well when responding to questions that represent those skills?

Do [UMBC] students who self-report a high level of confidence with computers perform well when responding to questions that represent information literacy skills?

To what extent do [UMBC] faculty model good library use behavior?

To what extent do [UMBC] faculty encourage students to use the library?

and authoring in the surveys we reviewed. Many were not dated, were out of date, or did not have contact information. If you decide to post survey instruments on the Web, remember to include identifying information (including dates) on all web pages, word-processed documents, and PDF (Portable Document Format) documents in order to ensure appropriate credit and recognition for your hard work.

We were able to sort the instruments we reviewed into the following categories, based on their purpose:

To test students’ knowledge, attitudes, or competencies with the local library web page, online catalog, or other locally owned or subscription resources (databases)

To test students’ knowledge of resources in a particular subject, major, or academic discipline

To gather attitudinal or self-report data from students

To determine students’ technological competencies

To gather data on incoming freshmen’s skills

To compare pre- and posttest results for short-term intervention purposes

Many of the instruments we reviewed contained similar or identical queries, making it difficult to determine which instrument originated the query. Few of the instruments identified were designed with the ACRL Standards in mind, which is understandable given that the standards were approved in January 2000 and published in March of that same year. Still, many of the instruments contained elements of the standards, which reflects how reference and instruction librarians have long been concerned about students’ ability to identify appropriate resources for their information needs, develop successful search strategies, evaluate information, and employ critical thinking skills.

The UMBC Survey is one of the few instruments currently available that was developed solely with the ACRL Standards in mind. A group of researchers at Kent State University (Ohio) is developing a survey based on the standards, but its approach is a little different. The purpose of Kent State University’s project for the Standardized Assessment of Information Literacy Skills (SAILS) “is to develop an instrument for programmatic-level assessment of information literacy skills.” The researchers are using a systems approach to develop queries and “an item response theory for data analysis” to develop an instrument that can be used across institutions. Once the instrument is validated, it will be used to assess the skills of incoming college-level students and to determine long-term improvements or other changes in students’ skill levels over time.4

Other survey instruments based on the ACRL Standards include the Portland State University Library’s Information Literacy Inventory, the Bay Area Community Colleges Assessment Project, and a dissertation based solely on Standard 2.

The Portland instrument, though not completed or updated since July 2001, is clearly divided into sections based on the ACRL Standards. The purpose of the Bay Area Community Colleges Assessment Project was to “develop and field test an information competency exam for California Community College students.” The exam was developed based on the ACRL outcomes from standards 1, 2, 3, and 5.5 Connie E. Constantino’s 2003 dissertation included a “questionnaire and interview” addressing Standard 2.6 In addition, the Minneapolis Community and Technical College has developed information literacy midterm and final exams for its Information Literacy and Research Skills course, INFS 1000, that satisfy the college’s information literacy requirement. These exams are mapped to the ACRL objectives.7 None of the other seventy instruments we reviewed specifically mentioned or referred to the standards.

In November 2004 the Educational Testing Service (ETS) announced a partnership with seven colleges and universities to develop the Information and Communication Technology (ICT) Literacy Assessment, which will build on the ACRL Standards, among others.8 The project defines information and communication technology literacy as

the ability to use digital technology, communications tools and/or networks appropriately to solve information problems in order to function in an information society. This includes the ability to use technology as a tool to research, organize, evaluate and communicate information, and the possession of a fundamental understanding of the ethical/legal issues surrounding the access and use of information.9

The ICT Literacy Assessment is different from most information literacy and library research assessment tools in that it is “simulation-based,” assessing multiple aspects of ICT competencies by “requiring test takers to use basic technology as a tool to arrive at solutions,” instead of posing multiple-choice queries. Examples of some of the sixteen tasks that students tackle in a two-hour session include building a spreadsheet and composing e-mail messages that summarize research findings.10

On March 3, 2005, the ETS announced that by the end of that month, approximately 8,000 students nationwide would have taken the assessment.

Aggregate results will be released only to participating institutions during the first year or so after launch. Although an ETS presentation at the American Library Association’s Annual Conference in June 2005 reported that students involved in beta testing of the assessment gave overall positive feedback, responses from the library community were less encouraging, noting there was “not enough emphasis on print media” and “too much emphasis on technology” in the assessment.11
