UCAT: a road to testing

By Natela Cutter, Mission Public Affairs

One would think that students in general hate taking tests. Not so at the Defense Language Institute Foreign Language Center, where well over 2,500 students are currently engaged in online learning due to the COVID-19 pandemic.

“Students were actually freaking out because they were not sure how they were doing in the course,” said Chief Military Language Instructor Sgt. 1st Class Rebecca Babcock, about the service members learning Persian Farsi. “We obviously turned to online instruction very quickly, but we did not have a method for testing students from afar,” Babcock explained.

While both instructors and students rose to the challenge of transitioning to a virtual classroom by using commercial software that allows for online collaboration through meetings, chat and document sharing, unit testing outside the schoolhouse walls remained an issue. “We quickly tested several different systems to see what would work,” explained Babcock.

The technical solutions team faced a daunting challenge. How does one find a tool that supports more than 10 different language scripts and provides listening, transcription, text selection and ordering features? The system also needed to support more than 200 students simultaneously taking a test with real-time interactivity – and be secure.

An unlikely candidate emerged: the Universal Curriculum and Assessment Tool, called UCAT, initially designed around 2010 as an in-house curriculum development support tool.

As the product gained popularity, by 2016 it became evident that the lack of an assessment capability was a major gap. Two years later, a testing feature was added and operationalized, but it was not broadly used.

“We looked at different systems to see what would work to meet our needs, but commercial options would not handle many of the foreign language scripts or DLIFLC’s listening assessment requirements,” explained Babcock, a lead member of the technical solution recommendation team.

“I made the decision to go with UCAT because we needed a tool that could build assessments quickly, be extremely agile, offer adequate security, adequately assess a student’s performance, and be cost effective,” said DLIFLC Commandant Col. Gary Hausman, with the knowledge that more work needed to be done.

In fact, Hausman told the UCAT development team, “I want you to break it, so that we know what we have to fix to make it effective.”

For Umer Farooq, lead of the UCAT development team, the message was clear – make this work. “We were excited that UCAT was finally getting the attention it needed to make it a viable tool but we were also afraid of not having enough time to make it work,” he said, adding that the front end was ready, but that there were still questions about how the system would handle the assessment load.

“Everyone banded together, the MLIs, testing department and the teachers [to input exams] … we jumped on the problem and tackled it … to keep the GPAs reflective and give our students good assessments,” explained Babcock, adding that providing this measurement was key for students to track their progression and prepare for the final Defense Language Proficiency Test.

“By early May, a lot of the problems had been overcome,” said Joey Holguin, director of Information Technology support. “Once we saw that this was the solution the Institute was going with, we stepped in to help and make sure the program was up to industry standards in terms of securing data.” A single sign-on solution was also achieved, easing access for users.

“Since May, we have conducted 2,674 assessments, or GPA influencing events,” said Farooq proudly, illustrating data on slides that showed more than 150 tests conducted per week for about 2,500 students, currently attending 155 classes running simultaneously in 16 different languages.

“UCAT is a great tool,” chimed in Babcock. “You can assign the student all of their military studies materials and they can go through them at their own pace. It gathers all the student data so you can go in and look at the minor assessments that the students have done … and it saves all of their answers of their daily work – as well as quizzes.”

As for the students, Spc. Mark Meixell said he didn’t feel he was short-changed by the rapid switch to online learning or the initial struggle with assessments. “I thought the teachers did a good job of transferring the classroom to remote teaching…. UCAT definitely had some bumps, like using the right browser and going through a virtual desktop, but it was fine for evaluating our progress.”

Though he said he often experienced distractions while studying in his room six to seven hours per day, Meixell received a 3/3 on the final DLPT, a score deemed quite exceptional, not least during a pandemic.



Posted Date: 30 October 2020