Document Type
Conference Proceeding
Publication Date
1-2009
Publication Title
Proceedings of the 2008 Library Assessment Conference
Publisher
Association of Research Libraries
First Page Number
529
Last Page Number
533
Abstract
Clickers have grown in popularity as a library instruction tool because instructors view the technology as a way to foster interactivity in instruction sessions and increase overall student engagement. A newly emerging area of interest, however, is the use of Clickers for library instruction assessment. This paper presents the viewpoints of various instructors who use Clickers, including those of library instructors. Its central question is whether Clickers are an effective and feasible tool for library instruction assessment, and it weighs the value of Clicker systems against that of traditional paper-based assessment methods. A substantial library instruction assessment initiative at the University of Nevada, Las Vegas Libraries is presented as a case study of the current feasibility of Clicker systems for this purpose. Additionally, differing Clicker system configurations are outlined, along with currently available alternatives to Clickers, in the interest of presenting scalable options for library instructors.
Keywords
Library orientation; Library orientation – Evaluation; Student response systems
Disciplines
Communication Technology and New Media | Curriculum and Instruction | Library and Information Science
Language
English
Repository Citation
Griffis, P. (2009). Assessment tool or edutainment toy. Proceedings of the 2008 Library Assessment Conference, 529-533. Association of Research Libraries. https://digitalscholarship.unlv.edu/lib_articles/393
Included in
Communication Technology and New Media Commons, Curriculum and Instruction Commons, Library and Information Science Commons
Comments
Paper presented at the 2008 Association of Research Libraries biennial Library Assessment Conference