This Complete Research paper analyzes the impact of offering mock exams on student performance in an introductory computing course. In the summer of 2019, collaborators from our university attended the Annual ASEE Conference and, in particular, a session presentation on the use of practice exams in introductory math and science courses at the University of Kansas (Shew et al., 2019). Because student retention and four-year graduation rates in engineering are of continued interest and concern at our university, a collaboration between the School of Engineering and the campus Learning Center continues to develop academic supports and interventions that promote student learning and positive grade outcomes in first-year engineering courses. The required introductory courses for Electrical and Computer Engineering students at the university report high percentages of D's, F's, Q's (drops), and W's (withdrawals).
The impressive results reported in that presentation inspired our university collaborators to implement the mock exam session protocol prior to each midterm exam of the introductory computing course, with a few changes in our implementation. The course chosen is well suited for an initial pilot of the practice exam for several reasons. Enrollment is capped at under 100 students, and given limited collaborative learning space, the Learning Center's drop-in tutoring center accommodates the expected number of attendees. Thus, the exam review does not require registration, only accurate capture of attendees through the attendance collection system.
The course is also taught by one of the collaborators, so creating the exam, communicating and promoting the review to students, and aligning the review with the goals of the project is fluid. The faculty member maintains that students cannot take photos of the mock exam; instead, after the review session, the exam is posted online along with answers (not solutions), so that students who are unable to attend the practice exam may still test themselves. The engineering department has also provided undergraduate and graduate support, so the collaborators are able to use undergraduate and graduate TAs to proctor the mock exam session and to conduct the third portion of the exam review, in which they present solutions and lead discussions of the individual problems and related course content.
This report investigates the effects of the mock exam on student grade outcomes. The study uses a mixed-methods approach, combining quantitative data on grades and attendance with qualitative data on student perceptions of the exam review's impact on their performance on the actual exam, as well as changes to their study approach. The collaborators collected multiple types of data, including students' exam review attendance and academic performance in the course, students' demographic data, and the rates of D's, F's, W's, and Q drops (QDFW rates) for attendees and non-attendees. Qualitative data were collected through surveys administered to attendees after each exam.
Shew et al. were able to compare students with similar ACT scores and found a more pronounced positive effect on average end-of-semester course GPA for those who attended the exam reviews than for those who did not. Because the collaborators aim to improve grade outcomes in first-year engineering courses and, in turn, retain more students in the engineering program, especially those most at risk (low SAT or ACT scores), we also plan to compare the grade outcomes of attendees versus non-attendees using SAT/ACT scores for a more accurate reflection of the effects of the exam review.