The Department of Electrical and Computer Engineering (ECE) at a large Midwestern university is seeking to enhance undergraduate engineering education through a combination of programmatic efforts to create departmental change. Three distinct programs aim to transform ECE education through collaborative course design, enhancements to the department climate, and expanded opportunities for underrepresented undergraduate engineering students. Because the programs' goals are integrative and overlapping, it was vital to develop a unified evaluation in line with the program evaluation standards (Yarbrough, Shulha, Hopson, & Caruthers, 2011). Further, the interaction of multiple programs necessitated evaluating goal attainment at both the programmatic and departmental levels, to determine not only the effects of individual programs but also the broader effect of multiple ongoing programmatic efforts interacting to enhance engineering education.
To facilitate this process, program team members developed comprehensive lists of the ongoing activities within each program designed to create change in the department. Evaluators worked with the program teams to theme and cluster the activities into similar groups. To understand how each cluster of activities was positioned to create departmental change and revolutionize engineering education, the evaluators and team members then identified how each cluster functioned as a change strategy within the model by Henderson, Beach, and Finkelstein (2011). Through this process, evaluators identified more than twenty distinct clusters of change activities working as change strategies within the four pillars of the change model: curriculum and pedagogy, reflective teachers, policy, and shared vision. Positioning activities within this model allowed the evaluators and team members to 1) better understand the broad scope of departmental activities and change strategies, 2) identify strengths and challenges associated with current efforts to transform engineering education within the department, and 3) develop and integrate ongoing evaluation efforts to further understand both the programmatic and interactive effects of having multiple programs designed to facilitate departmental change and enhance engineering education.
We will present the model for understanding departmental change and the approaches within that model that are being used to transform ECE education. We will further explain how the change model facilitated evaluating each program and the interactive effects of the combined programmatic efforts within the program evaluation standards of utility, feasibility, propriety, and accuracy (Yarbrough et al., 2011). Specific programmatic and interactive evaluation approaches will be discussed.
References
Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984.
Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage.