Toward the development of a model to estimate the readability of credentialing-examination materials
Award Date
5-2010
Degree Type
Dissertation
Degree Name
Doctor of Philosophy in Educational Psychology
Department
Educational Psychology
First Committee Member
Alice J. Corkill, Chair
Second Committee Member
Gregory Schraw
Third Committee Member
CarolAnne Kardash
Graduate Faculty Representative
Mark Ashcraft
Number of Pages
448
Abstract
The purpose of this study was to develop a set of procedures, including an equation, for establishing readability in a way that accommodates the multiple-choice item format and the occupation-specific language of credentialing examinations. The procedures and equation should be appropriate for learning materials, examination materials, and occupational materials. To this end, the variance in readability estimates accounted for by combinations of semantic and syntactic variables was explored, a method was devised to accommodate occupation-specific vocabulary, and new-model readability formulas were created and calibrated. Existing readability formulas were then recalibrated with the same materials used to calibrate the new-model formulas. Finally, the new-model and recalibrated formulas were applied to sample items extracted from a professional licensing examination, and the results were compared.
Keywords
Certification testing; Construct-irrelevant variance; Credential testing; Licensure testing; Readability
Disciplines
Educational Psychology
File Format
Degree Grantor
University of Nevada, Las Vegas
Language
English
Repository Citation
Badgett, Barbara Anne, "Toward the development of a model to estimate the readability of credentialing-examination materials" (2010). UNLV Theses, Dissertations, Professional Papers, and Capstones. 185.
http://dx.doi.org/10.34917/1436770
Rights
IN COPYRIGHT. For more information about this rights statement, please visit http://rightsstatements.org/vocab/InC/1.0/