Interpreting Results from the Scantron Exam Scanning Service

Do you want to help shape the future of bubble sheet scanning at UConn?

ITS is piloting bubble sheet scanning through Gradescope, which may replace Scantron's bubble sheet scanning functionality in the future. Gradescope provides a more convenient scanning solution that does not require expensive custom scan sheets and can be managed entirely through HuskyCT.

This article provides information about how to interpret the various exam reports from test scoring.

Professors are commonly concerned about the accuracy and fairness of their tests. Establishing the validity of a test is a fairly complicated process, but a simple and practical criterion comes from the test scores themselves. If a test's total score is used as an anchor, then each item may be judged against this anchor. For example, did students who scored well on the whole test tend to get item 14 right? And did students who fared poorly on the whole exam tend to get item 14 wrong? What is the correlation between responses on item 14 and total test scores? Of course, if the whole test is flawed, these questions will not give sensible answers. Statistical answers produced by the test scoring process cannot substitute for careful test preparation, but if the test has good coverage of content and if items are carefully written, then the total test score serves as a sturdy anchor.
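The item-total correlation described above can be computed directly. The sketch below, using made-up scores for illustration, correlates right/wrong responses on a single item against total test scores; a clearly positive correlation means the students who did well overall also tended to get that item right.

```python
# Sketch: item-total correlation for one question (made-up data).
# item14[i] is 1 if student i answered item 14 correctly; totals[i]
# is that student's total test score.
from statistics import mean
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

item14 = [1, 1, 0, 1, 0, 0, 1, 0]           # right/wrong on item 14
totals = [92, 88, 55, 79, 60, 48, 85, 52]   # total test scores

r = pearson_r(item14, totals)
print(f"Item-total correlation: {r:.2f}")   # positive: strong students got it right
```

A correlation near zero (or negative) flags an item worth reviewing: students who mastered the rest of the test did no better on it than students who struggled.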

ExamGrades.csv

The CSV file is a comma-separated values file (it opens in Excel) that displays each student's answer to each of up to 200 questions. It displays the student's information, including Name, StudentID (PeopleSoft number), and Exam Version. The student's grade, percent score, and total score (the number of correct answers) appear at the far right, in columns HB, HC, and HD.
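A file in this layout can be processed with any CSV-aware tool. The minimal sketch below reads an ExamGrades-style file with Python's standard `csv` module; the column names and sample rows are assumptions for illustration, so check the header row of your actual export before relying on them.

```python
# Sketch: summarizing an ExamGrades-style CSV (assumed column names).
import csv
import io

# Stand-in for opening the real file: open("ExamGrades.csv", newline="")
sample = """Name,StudentID,ExamVersion,Grade,PercentScore,TotalScore
Jonathan Husky,1234567,A,B+,88.0,44
Jane Doe,7654321,B,A,94.0,47
"""

reader = csv.DictReader(io.StringIO(sample))
rows = list(reader)
scores = [float(row["PercentScore"]) for row in rows]

class_mean = sum(scores) / len(scores)
print("Students:", len(rows))
print("Mean percent score:", class_mean)
```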

Report 100 – Test Analysis Report

The Test Analysis Report provides summary information about the test. It is useful for a quick view of overall performance along with any anomalies of which you should be aware. You can use the other reports to drill deeper into the results.

Report 101 – Student Statistics Report

The Student Statistics Report provides the scores for all students in the class. Students are listed down the left side of the report, along with the grade, the number of correct answers out of the total number of questions, and a percent score; a bar graph appears to the right of the table containing the statistics. The bar chart displays scores at or above the mean in green, scores below the mean in red, and the mean itself in blue.

Report 204 – Condensed Item Analysis Report

The Condensed Item Analysis Report displays the correct answer from the answer key (highlighted with an asterisk), the frequency (the number of times a particular answer choice was chosen by students), and the corresponding percentage of the total, along with a bar graph to the right of the table containing the statistics. The bar chart displays the correct response(s) in green and the incorrect response(s) in red. If an incorrect response is chosen more than the correct response(s), its bar chart is yellow.
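The tally behind this report is straightforward to reproduce. The sketch below counts answer-choice frequencies for a single item and marks the keyed answer with an asterisk, mirroring the report's layout; the responses and answer key are made-up illustration data.

```python
# Sketch of the Condensed Item Analysis tally for one item (made-up data).
from collections import Counter

responses = ["A", "B", "B", "C", "B", "D", "B", "A"]  # all answers to one item
key = "B"                                             # keyed correct answer

counts = Counter(responses)
total = len(responses)
for choice in sorted(counts):
    mark = "*" if choice == key else " "
    pct = 100 * counts[choice] / total
    print(f"{mark}{choice}: {counts[choice]:2d} ({pct:.0f}%)")
```

If an incorrect choice out-polls the keyed answer here (the condition the report colors yellow), the item is a strong candidate for review: the key may be wrong or the question ambiguous.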

Report 310 – Test Statistics Report

The Test Statistics Report displays basic statistics on the test as a whole and is useful for gaining a quick look at performance and basic statistical analysis. The Test Statistics Report displays:

  • Score Data: possible number of test points, number of graded items, and high and low scores.

  • Statistics: mean, benchmark, range, standard deviation, and variance.

  • Percentiles: 25th and 75th percentile, plus interquartile range, and median score.

  • Confidence Intervals: 1%, 5%, 95%, and 99% intervals.

  • Test Reliability: Kuder-Richardson and Cronbach Alpha statistics.
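Most of these figures can be reproduced from the raw right/wrong data. The sketch below computes the descriptive statistics and one common form of the Kuder-Richardson reliability (KR-20) from a small made-up score matrix; it uses the population variance of total scores, which is one convention among several, so expect small differences from the report's exact values.

```python
# Sketch: Report 310-style statistics from a made-up score matrix
# (rows = students, columns = items, 1 = correct).
from statistics import mean, median, pstdev, pvariance

matrix = [
    [1, 1, 1, 1, 1],
    [1, 1, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
totals = [sum(row) for row in matrix]
n = len(matrix)        # number of students
k = len(matrix[0])     # number of graded items

print("mean:", mean(totals), "sd:", round(pstdev(totals), 3),
      "median:", median(totals))

# KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / variance(totals)),
# where p_i is the proportion answering item i correctly and q_i = 1 - p_i.
pq = 0.0
for col in zip(*matrix):
    p = sum(col) / n
    pq += p * (1 - p)

kr20 = (k / (k - 1)) * (1 - pq / pvariance(totals))
print("KR-20:", round(kr20, 3))
```

Reliability coefficients like KR-20 and Cronbach's Alpha range up to 1.0; higher values indicate that the items hang together consistently, which is what makes the total score a trustworthy anchor for the item-level judgments described earlier.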

ScanResults.rbbx

The RBBX file is an XML-based file used to upload grades to HuskyCT; it contains only student IDs (PeopleSoft numbers) and grades. See Uploading Grades to HuskyCT (BlackBoard) for more information.

ScanResults.rmx

The RMX file is a proprietary file type used to upload exams into the grading software.

When two or more scores tie at the 27th or the 73rd percentile, all of the tied scores are placed into the upper or lower group. As a result, the upper and lower groups contain exactly 27% of the scores only when a single score falls at each of those percentile ranks.
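The tie-handling rule above can be seen in a small sketch. With made-up scores, the example forms the nominal top-27% and bottom-27% groups and then pulls every score tied at the cut point into the group, so the lower group ends up larger than 27% of the class; this is an illustration of the stated rule, not the scoring service's exact algorithm.

```python
# Sketch: upper/lower 27% groups with ties pulled in (made-up data).
scores = [50, 55, 60, 60, 60, 70, 75, 80, 85, 90]
n = len(scores)
cut = max(1, round(0.27 * n))   # nominal size of each group

desc = sorted(scores, reverse=True)
upper_cut = desc[cut - 1]
upper = [s for s in scores if s >= upper_cut]   # ties at the cut join the group

asc = sorted(scores)
lower_cut = asc[cut - 1]
lower = [s for s in scores if s <= lower_cut]   # three tied 60s inflate this group

print(f"nominal size: {cut}, upper group: {len(upper)}, lower group: {len(lower)}")
```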

The EASINESS index was formerly called the “difficulty” index and is the inverse of the number presently used.
