Abi Winegarden scans exams

Since rolling out a new exam-scanning system last semester, ITS has cut in half both the average wait time for instructors and the staff time ITS spends providing the service.

Launched on campus in Fall 2017, InstructorTools by DigitalDesk has also drastically reduced the number of printed reports, which lowers printing costs.

In addition, “with this system, there are more checks and balances,” said Gina Bradford, Classroom Hotline Manager with ITS Teaching & Learning. Feedback from instructors has been positive, she added. They appreciate having direct access to the new system’s many features.

InstructorTools replaced a system that Carolina created and had used for some 35 years.

“What used to be a three-step process with the old system is now combined into one step with the new system,” said Patrick Lesane, Classroom Consultant, Tier 1, with Teaching & Learning. “During final exams, the average wait time was between 15 and 30 minutes with the new system compared to 45 minutes to one hour with the old system. Instructors are pleased with the faster turnaround because it enables them to get the information to their departments and to their students in a timely manner.”

New system integrates with ConnectCarolina

Because the new system pulls instructor and student information from ConnectCarolina, there’s less manual entry to set up reports and more validation, Bradford said. The old system did not integrate with ConnectCarolina and would, for example, scan whatever was fed into it. The new system rejects forms that are not matched with students enrolled in the class and alerts faculty to these mismatches.
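The enrollment check described above amounts to comparing each scanned sheet against the class roster. The sketch below illustrates the idea only; the field names, PID values and data shapes are invented for the example and are not DigitalDesk’s actual data model.

```python
# Hypothetical sketch of an enrollment validation step: split scanned
# bubble sheets into those matched to enrolled students and those flagged
# for faculty review. Field names ("pid", "answers") are assumptions.

def validate_scans(scanned_forms, roster):
    """Return (matched, mismatched) lists of scanned forms.

    scanned_forms: list of dicts, each with a student 'pid' read from the sheet
    roster: set of PIDs enrolled in the class (e.g., exported from the SIS)
    """
    matched, mismatched = [], []
    for form in scanned_forms:
        (matched if form["pid"] in roster else mismatched).append(form)
    return matched, mismatched

roster = {"730000001", "730000002"}
forms = [
    {"pid": "730000001", "answers": "ABCD"},
    {"pid": "730999999", "answers": "ABDD"},  # not on the roster
]
ok, alerts = validate_scans(forms, roster)
```

In this sketch, anything in `alerts` would be surfaced to the instructor rather than silently scored, mirroring the behavior the article describes.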

Developed in the mid-1980s, the old exam-scanning tool was powered by Perl and SAS and served up out of the AFS distributed file system. The exam-scanning system was extremely reliable, but it required much more staff effort. File-based reports were available for faculty and web-based reports were available for students, but some aggregations of data were available only in print. During 2016-2017, more than 100,000 pages were printed and distributed to faculty.

Instructors can customize reports

InstructorTools provides all reports to instructors online, which has all but eliminated printed reports. The web interface enables instructors to make corrections, update answer keys, customize reports and release grades to students without requiring ITS staff assistance, Bradford said.

InstructorTools enables instructors to generate and analyze a multitude of custom reports by exporting raw data into Sakai or Excel spreadsheets. Seeing how many students selected each answer to a question can tell instructors whether that question is effective. If no students answered correctly, perhaps the instructor needs to review the wording of that question. If an instructor sees that scores on an exam were low overall, InstructorTools also enables the instructor to add bonus points and rescore.
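Once the raw responses are exported to a spreadsheet, the item analysis and bonus rescoring described above are simple to reproduce. The following is an illustrative sketch under assumed data layouts, not the actual export format:

```python
from collections import Counter

# Illustrative sketch: count how many students chose each option on a
# question (item analysis), and apply a flat bonus to rescore an exam.
# The response layout below is invented for the example.

def item_analysis(responses, question):
    """Tally the option each student selected for one question."""
    return Counter(r[question] for r in responses)

def rescore_with_bonus(scores, bonus):
    """Add bonus points to every score, capped at 100."""
    return [min(s + bonus, 100) for s in scores]

responses = [{"Q1": "A"}, {"Q1": "A"}, {"Q1": "C"}]
counts = item_analysis(responses, "Q1")
curved = rescore_with_bonus([58, 71, 64], 5)
```

If `counts` shows no students picking the keyed answer, that is the cue, as the article notes, to revisit the question’s wording or adjust the key.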

“Placing the data at the fingertips of the instructors provides for a more individually designed user experience,” said Abi Winegarden, Technology Support Analyst with Teaching & Learning. “The instructor can look over results, make answer key adjustments and gain feedback on the quality of their exams, all with a few clicks of a button and at their own convenience.”

During the Fall 2017 semester, the first and only entire semester since InstructorTools launched, the exam-scanning tool scanned 67,506 bubble sheets and was used for 621 exams by 234 faculty members in 41 schools or departments, said Suzanne Cadwell, Director of Teaching & Learning. The top five units by number of faculty using the service were Psychology (38), Exercise and Sport Science (28), Pharmacy (20), Biology (16) and Political Science (14).

Louise Fleming

Nursing school switched to ITS’ service

The UNC School of Nursing used to scan its own exams. It switched over to the ITS service when the new exam-scanning tool launched. Louise Fleming, Assistant Professor at the UNC School of Nursing, spearheaded that move after testing InstructorTools and arranging for training.

“The service has been very easy to use,” Fleming said. “I love the fact that once the test results have been entered into the online portal, I can go in myself and make changes to the answer key after viewing the item analysis on the test. I am able to print out individual student reports as well, which makes the exam review very easy to do with my students.”

She added: “The turnaround time has been very quick and the process of getting the Scantron answer sheets to their office to be processed was simple and convenient. I found Gina and Patrick in the centralized scanning department to be extremely helpful and patient as they oriented me, and later the School of Nursing, to Digital Desk.”

Other Carolina instructors have said that their students enjoy receiving part of their grade immediately after taking the exam. Instructors also appreciate that most of the work of grading multiple-choice questions is automated, saving them time and reducing grading errors. They also like that the InstructorTools software lets them input multiple answer keys and still see aggregate scores when they use multiple versions of an exam.
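Grading with one key per exam version, then aggregating across versions, can be sketched in a few lines. The version codes and answer keys below are made up for illustration and do not reflect InstructorTools’ internals:

```python
# Rough sketch of multi-version grading: each sheet is scored against the
# key for its own exam version, and scores aggregate across versions.
# Version letters and key strings are hypothetical.

KEYS = {"A": "BDCA", "B": "CADB"}  # one answer key per exam version

def grade(sheet):
    """Percentage score for one sheet, using its version's key."""
    key = KEYS[sheet["version"]]
    correct = sum(a == k for a, k in zip(sheet["answers"], key))
    return 100 * correct / len(key)

sheets = [
    {"version": "A", "answers": "BDCA"},  # all four correct
    {"version": "B", "answers": "CADA"},  # misses the last item
]
scores = [grade(s) for s in sheets]
average = sum(scores) / len(scores)  # aggregate across both versions
```

The point of the design is that instructors see one combined score distribution even though students sat different versions of the exam.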

Prepared for one year

ITS groups spent more than a year prepping for the implementation of the new exam-scanning tool. The system consists of a web application front-end for instructors, an independent student grade portal, a database back-end and a couple of client stations used by Classroom Hotline for scanning.

“We worked with several different ITS groups for preparation of the new service and regularly worked with the vendor to get the application to the point we were comfortable with moving forward,” said J Bazemore, Technical Services Administrator with Teaching & Learning.

The ITS Systems group provisioned the application servers. ITS Middleware hosts the database, and CloudApps hosts the student portal. The application requires regular reports of the data from ConnectCarolina. ITS Enterprise Applications runs those regular exports of the information.

ITS also worked with the Office of the Registrar to obtain the necessary access and rights for information to feed into the new system.

“The data is imported into the application database and used to match scanned exams to the respective courses, instructors and students,” Bazemore said. “We worked with a couple of instructors testing the new system with existing exams and collected their feedback. The new system provides more capabilities for the instructors in managing and processing exams, but we wanted to be sure it also provided the level of functionality and reporting they needed and received from the old system.”

“We were able to take the suggestions instructors made during testing back to the vendor and work to have them incorporated into our application instance ahead of the launch,” Bazemore added. “All throughout the process, we regularly tested the scanning and reporting against the previous scanning solution to verify consistent results. We experienced a couple of minor data-matching issues after launch and worked through them by updating the import process or correcting the source information.”

Remembering Carolina’s exam scanning past

Todd Lewis holds a bubblesheet

Todd Lewis, Solutions Engineer with ITS Infrastructure & Operations, witnessed the birth of Carolina’s previous exam-scanning system as well as its retirement.

Back then, he was a student employee of the UNC Computation Center (UNCCC), which would later merge with Administrative Data Processing to become what is now ITS. The Comp Center “had a huge Scantron machine that did all sorts of bubble sheet processing,” Lewis recalled. “An unfortunate failure of the Scantron service caused a large number of errors on one year’s North Carolina Pharmacy Boards.”

As a result, Lewis said, the Comp Center decided to write its own scanning analysis system in SAS.

“That SAS code was modified occasionally over the next decade by various UNCCC staff, and probably around 1994 it was given to me for more modifications,” he said. “That turned into an almost complete rewrite. Soon thereafter we migrated it from IBM’s MVS system to UNIX and added support for weighted questions, Roger’s style true/false scoring, multiple correct answers, ‘gimme’ questions, scoring based on other than 100-point scales, and probably some other things I’ve forgotten. Biology Professor Jean DeSaix’s urging resulted in students being able to see their scores through the web.”

“After ITS was created, the SAS code and supporting scripts followed me to Middleware, where I merely kept it working,” he said. “That temporary arrangement lasted for much longer than anyone expected. Fortunately, it did keep working until it was finally replaced. Personally, I’m happy for it. It’s put in its 35 years, and I’m glad for it to have retired on good terms.”
