Data-Driven Student Success

UNLV mines its own grade books to ensure students aren’t weeded out of STEM classes.

BY BRIAN SODOMA • Read this article at the UNLV News Center

A high school student explores a project at the 2017 Beal Bank USA Southern Nevada Regional Science and Engineering Fair co-sponsored by the UNLV College of Sciences. (R. Marsh Starks/UNLV Creative Services)

For the past decade, high schoolers across the United States have heard the STEM (Science, Technology, Engineering, and Math) career pitch. It’s an enticing one: Get a STEM degree. Get a higher-paying job. You’ll outearn other grads by $15,500 on average, according to a 2014 Department of Education report.

In Southern Nevada, officials are making that pitch in the hopes that developing the STEM workforce will spur economic development. Las Vegas is ranked 97th (among 100 metropolitan areas evaluated) in terms of employees in STEM-related fields, with 3.6 percent of the workforce compared with an 8.7 percent national average.

Once they enter college, though, students face a harsh reality check. Training for STEM fields is academically rigorous and demands study skills that many students did not acquire in high school. As a result, roughly 40 percent of those entering college as a STEM major end up either switching to a non-STEM field or not completing a degree at all. This reality is further compounded for first-generation college students, who have less exposure to mentors who can help them navigate college. Unsurprisingly, they switch majors at even greater rates.

In 2014, Matthew Bernacki, assistant professor of educational psychology and higher education, felt UNLV was in a position to change this trend.

“Prior research shows that students often abandon STEM majors after poor performance in early coursework,” he said. “That experience leaves students feeling they lack ability in the STEM disciplines, when in fact they may only lack some specific learning skills.”

Bernacki believed the key was to identify struggling students long before their grades revealed they were having trouble and to provide learning support. He wanted to dive into the student activity data in UNLV’s WebCampus learning management system for the answers.

In 2014, the National Science Foundation awarded him a three-year grant to test his strategies. Today, as the project nears completion, his results are encouraging: more of the STEM majors his project targeted are moving forward in their degree programs, achievement has improved in critical math and science courses, and work is underway to make these efforts permanent.

A Three-Step Approach

The project involved partnerships with instructors of four entry-level courses: human anatomy and physiology, college algebra, calculus, and an introductory engineering course. These large lecture courses have often been called “weed-out classes.”

“That trial-by-fire mentality was all wrong,” said Carl Reiber, senior vice provost. “We’re here to teach students, not weed them out of their futures. It’s an approach that’s been proven harmful to first-generation students and underrepresented minorities in particular.”

About 40 percent of students entering college as a STEM major end up either switching fields or not completing a degree at all. But a new algorithm from the UNLV College of Education is predicting which students will struggle and then getting them on the path to success. Life sciences professor Jenifer Utz saw immediate benefits in her anatomy classes. (R. Marsh Starks/UNLV Creative Services)

Jenifer Utz, a professor in the School of Life Sciences, teaches freshman anatomy and physiology courses. “Some students do not achieve at their potential simply because they’re not equipped with appropriate study skills and strategies,” she said. “They’re completely capable of being successful in the course — and ultimately in the field — if they can just get past those early hurdles.”

Bernacki’s project was conducted in three phases. In the first year, he met with each instructor before the start of the semester to review the learning objectives for each class and the resources provided to students on the WebCampus course site.

Civil engineering professor Donald Hayes hadn’t thought about his introduction to engineering course from this perspective. “This opened my eyes to educational research and how it can be really helpful to us. I’ve learned a lot from him about how to organize a class,” Hayes said.

Utz adjusted her materials too, adding practice quizzes for each individual chapter. The students who regularly used the quizzes to study scored 12 percent higher on the final exam than non-users. Differences were even larger among students who entered the course with minimal prior biology knowledge. “To simply say ‘Here are some practice tests to see if you’re on track’ and then see someone’s grade suddenly jump 10 percent was impressive,” she said.

Bernacki also used that first year to observe students’ behavior in the WebCampus environments for each course. He collected data about which resources students accessed — or ignored — and when. Then he observed how students’ behaviors correlated with their grades. This would be critical information for the third year of the study.

Learning to Learn

In the second year, Bernacki studied whether a program called “The Science of Learning to Learn” could be delivered in WebCampus to improve performance. It first acknowledges the challenges students face when transitioning to college coursework. It then introduces important learning principles — like “retrieval practice” to strengthen recall of factual information and “self-explanation strategies” to break down complex concepts — and helps students select the most appropriate strategy for their course. Finally, it trains them on behavioral strategies for managing a college lifestyle.

“It’s one thing to study, but it’s another to be aware of the correct objectives to study and to keep track that the knowledge is being learned and retained,” said one engineering student who provided anonymous feedback. “Sounds simple, but it’s something I never thought about.”

First-generation and underrepresented minority students who used the “Science of Learning to Learn” tools in entry-level biology scored markedly higher on exams than peers who did not.

In the initial study with anatomy and physiology students, those who completed “Learning to Learn” modules after their first unit exam outperformed a control group on the next two exams. The pattern of results was replicated the next semester when students completed the modules in the first weeks of the course. A third run of the study showed similar results in math courses. College algebra students who completed the training outperformed peers (who spent equivalent time solving algebra problems) on the next two course exams.

Power of Prediction

After gathering data about students’ online behavior for the first two years, Bernacki created an algorithm to predict performance. “We’re at a point where after four weeks into the biology, three weeks into the calculus, or five weeks into the engineering course, we can identify which students are going to earn that poor outcome about 80 percent of the time,” he said.
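The article doesn’t publish the model itself, so as a rough illustration of how an early-warning predictor of this kind can work, here is a minimal sketch: a logistic model that turns a few weeks of LMS activity counts into an estimated risk of a poor grade. The feature names, weights, and threshold below are assumptions for illustration, not UNLV’s actual model.

```python
import math

# Illustrative sketch only — the weights, feature names, and threshold
# here are assumptions, not the coefficients of UNLV's actual model.
WEIGHTS = {
    "quiz_attempts": -0.8,   # heavy practice-quiz use lowers predicted risk
    "resource_views": -0.3,  # viewing course resources lowers predicted risk
    "days_inactive": 0.5,    # long gaps in LMS activity raise predicted risk
}
BIAS = 1.0
THRESHOLD = 0.5  # flag students whose predicted risk of a poor grade exceeds 50%

def risk_score(features):
    """Logistic model: estimated probability of a poor course outcome."""
    z = BIAS + sum(w * features.get(name, 0) for name, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

def flag_students(cohort):
    """Return IDs of students predicted to struggle, for early outreach."""
    return [sid for sid, feats in cohort.items() if risk_score(feats) > THRESHOLD]

# A toy cohort of early-semester activity counts
cohort = {
    "s1": {"quiz_attempts": 6, "resource_views": 10, "days_inactive": 1},
    "s2": {"quiz_attempts": 0, "resource_views": 1, "days_inactive": 9},
}
```

In this sketch, `flag_students(cohort)` would flag only the inactive student, `"s2"` — the point being that the model can fire weeks before the first exam grade exists.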

Students typically need to pass with a B or a C in these initial courses to advance in their STEM coursework. Without prediction modeling, students may not know if they are on track to make that grade until after their first exam.

“That’s a problem because when a poor first exam grade arrives — often as late as mid-semester — it means the time available to adjust learning methods is short,” Bernacki said. “What’s worse, the first chance to perform well in the course has now been missed, which raises the stakes of the remaining exams.”

Utz added, “It can become mathematically impossible to recover a passing grade.”

Bernacki and his team created an early alert system so students know they need to change study habits. A week before their first exam, a message from their instructor reminded students about the upcoming test and proposed they use some powerful learning methods — things that had worked for students in past semesters.

They were directed to an advice page authored by real UNLV students and faculty and hosted on WebCampus. Students also were encouraged to use the “Learning to Learn” modules.

In spring 2016, more than 300 anatomy and physiology students identified as likely to struggle received the message; more than one-third beat projections and earned A’s or B’s in their course. Follow-up studies showed similar improvements in the biology course. And when applied to calculus, messaged students outperformed others predicted to struggle by nine to 15 points on all five exams.

“So far, those who get the message and take us up on the offer [of learning support] ultimately outperform those who don’t get a message,” Bernacki added. “I was pleasantly surprised that when students did what we’d hoped they would do they were as successful as we thought they could be.”

The Clicks of Academic Success

Cam Johnson can get roped into a lot of interesting projects, but the operations manager in UNLV’s office of information technology didn’t see this one coming. In 2014, backed by a National Science Foundation (NSF) grant, Matthew Bernacki approached Johnson. The education professor wanted to collect student data from UNLV’s WebCampus course management system to create an early warning system for student success.

“We do work with faculty on occasion, maybe just to mentor a group or speak to a class, but we’ve never done something like this with hands-on research,” Johnson said.

His team had been working with Splunk, a system used by Fortune 100 companies to collect data and effectively organize it for searches.

“It allows us to collect data from disparate sources and make it searchable,” Johnson said — a powerful tool for organizations where systems tend to be developed over time for specific functions. These systems don’t always “talk” to each other. “First we had to answer the questions, ‘Do we have the data he needs?’ and ‘Can we curate it in a way to support his research?’” he said.

The project’s novel use of the data — it was the first time Splunk was applied to faculty research — garnered a 2016 Splunk Public Service Innovation Award. Johnson’s team developed a data modeling solution that recorded clicks within the WebCampus system. Their solution allowed Bernacki to see exactly what students were — or were not — clicking on in a course and what kinds of learning resources they accessed. They then created a model that aggregates and mines these data in order to predict each student’s success.
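The curation step Johnson describes — rolling raw, disparate click events up into a searchable per-student table — can be sketched in a few lines. The event shape and resource names below are hypothetical, standing in for whatever schema the team’s Splunk pipeline actually used.

```python
from collections import defaultdict

# Hypothetical raw LMS click events, in the shape a Splunk-style pipeline
# might emit: (student_id, course_week, resource_clicked). The schema and
# names are illustrative, not UNLV's actual data.
events = [
    ("s1", 1, "practice_quiz"),
    ("s1", 1, "practice_quiz"),
    ("s1", 1, "lecture_notes"),
    ("s2", 1, "syllabus"),
    ("s1", 2, "practice_quiz"),
]

def aggregate(events):
    """Roll raw clicks up into per-student, per-week counts by resource —
    the curated feature table a predictive model can consume."""
    table = defaultdict(int)
    for student, week, resource in events:
        table[(student, week, resource)] += 1
    return dict(table)
```

From a table like this, a researcher can ask exactly the questions Bernacki did: which resources a given student used, ignored, and when.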

“One of the things we’re trying to deal with now is how to scale this out, help students graduate, and do things we know are predictive of them doing well at the university,” Johnson said. Bernacki is working to apply what he has learned to other STEM classes on campus and he hopes other institutions will learn from UNLV’s success.

“[Bernacki] saw data and found a way to interact with it,” Johnson noted. “We built a solution I wouldn’t describe as an enterprise solution, but it was a challenge and it was fascinating.”

Most important: It worked.