Bibliographic Record - Detail View

 
Author: Rao, Vasanthi
Title: Impact of Psychometric Decisions on Assessment Outcomes in an Alternate Assessment
Source: (2012), (210 pages)
Full text: PDF
Ph.D. Dissertation, University of South Carolina
Language: English
Document type: print; online; monograph
ISBN: 978-1-2678-4736-2
Keywords: University thesis; Dissertation; Alternative Assessment; Psychometrics; Item Response Theory; Models; Disabilities; Test Construction; Test Theory; Scoring; Test Items; Difficulty Level; Ability; Correlation
Abstract: In 1997, based on the amendments to the Individuals with Disabilities Education Act (IDEA), all states faced a statutory requirement to develop and implement alternate assessments for students with disabilities who were unable to participate in the statewide large-scale assessment. States were given the challenge of creating, implementing, and executing these tests within a short period of three years. In most cases, innovative assessments were developed by tweaking, combining, and modifying traditional assessments already in place for non-disabled students. When such changes are made to a traditional assessment program, the psychometric properties associated with the test and its scores are likely to differ because of the unique characteristics of the student population and testing program. However, few changes had been made to the traditional analyses of scores of students with disabilities. This study examines the impact of changes to three psychometric elements--design, examinee population, and analysis issues--on ability and item estimates computed from the assessment using the Rasch measurement model. Using data from a statewide selected-response alternate assessment designed for students with disabilities in the southern United States, the impact of test design and analysis decisions related to partial credit scoring, analyses within homogeneous groups, and the treatment of lead questions was examined to determine the effect on test outcomes. We examined estimates of item difficulty and ability, as well as a select sample of items and students' scores, and compared them to estimates from the "original" analyses. The results indicated considerable variability in student scores and differential impacts on item measures depending on the choice, or combination of choices, made during the analyses. Clear patterns of change in scores when the lead questions were treated as correct suggest a systemic change in item measures. High correlations between student scores from the homogeneous and heterogeneous group analyses indicated little difference between the two. The findings of this study provide evidence that, in building validity evidence for an assessment, decisions made in computing psychometric measures need to be explicitly based on theory and aligned with the use and interpretation of scores. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone (1-800-521-0600) or via the Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.] (As Provided).
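
Note on the models named in the abstract (a standard textbook formulation, not taken from the dissertation itself): the dichotomous Rasch measurement model expresses the probability that person n answers item i correctly in terms of a person ability parameter \theta_n and an item difficulty parameter b_i,

P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}.

The partial credit scoring mentioned in the abstract generalizes this by replacing the single difficulty b_i with step difficulties b_{ik} for each score category k (Masters' partial credit model),

P(X_{ni} = x) = \frac{\exp\left(\sum_{k=0}^{x} (\theta_n - b_{ik})\right)}{\sum_{m=0}^{M_i} \exp\left(\sum_{k=0}^{m} (\theta_n - b_{ik})\right)}, \qquad \text{with } \sum_{k=0}^{0} (\theta_n - b_{ik}) \equiv 0,

where M_i is the maximum score on item i.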
Notes: ProQuest LLC, 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Last update: 2020/1/01