Strong Design, Strong Outcomes: Instructional Design Support Makes a Difference

October 21, 2025

This post was contributed by Amy Hofer, Open Oregon Educational Resources; and Chandra Lewis and Benjamin Skillman, RMC Research. Thank you to Nicholas Colvard, University of Georgia, for reviewing our data and providing helpful comments.

The Fund for the Improvement of Postsecondary Education Open Textbooks Pilot Program call for proposals in Fall 2020 included a requirement that proposed projects should demonstrate impact on student savings and student outcomes. Open Oregon Educational Resources shared our approach to this grant requirement in a post that is archived at Improvements in Student Achievements Resulting from Equity Lens.

After four years, we are finally ready to share the results of our work!

  • This post describes the student outcomes impact of our federal grant project, which we conducted in collaboration with RMC Research. Our key finding regarding student outcomes is that students benefit from affordable, high-quality course materials implemented with the support of an instructional designer.
  • We share our student savings impact in the post More Bang for No Bucks: Students Save Big with Federal Grant Courses. Our key finding regarding student savings is that we almost doubled the projected student savings estimate, far exceeding our target.
  • Links to the textbooks and ancillary materials that we created through the grant are available via Open Curriculum Projects Soft Launch.

Top-Level Findings

Access to affordable course materials is inarguably important for students. We found that student success also depends on the quality of course materials. We defined high-quality course materials as relevant, aligned with course outcomes/workforce standards, accessible, and designed with an equity lens. Further, our results affirm the importance of thoughtful instructional design, implemented in collaboration with a professional instructional designer.

These conclusions are supported by findings from our research:

  • Students in our open textbook pilot courses achieved higher average grades than those in our comparison group. Additionally, students in our pilot courses were more likely to earn an A grade than students in our comparison group.
  • Students in courses where the instructor implemented instructional design frameworks introduced by an instructional designer exceeded our targets for increased course grades and decreased letter D grade, fail, and withdraw (DFW) rates.
  • We did not find that historically underserved students saw the greatest benefit from piloting our open textbooks. Rather, our textbooks effectively supported students regardless of Pell status, race/ethnicity, and part-time/full-time status.

Student Outcomes

The federal Open Textbooks Pilot Program requires evidence of improved student outcomes as well as savings. Per the call for proposals, improved student outcomes had to be measured in two ways: through improved course grades and lowered failure/withdrawal rates.

When we designed our grant proposal, Open Oregon Educational Resources did not expect that adopting open textbooks, per se, would lead to improved student outcomes. It stands to reason that a one-to-one swap of an all-rights-reserved copyrighted textbook with an openly licensed textbook might not improve student outcomes, given that the copyright/licensing status is unrelated to the quality or effectiveness of the content (for more on this line of thinking, see “But is it sustainable?” and On Quality and OER). Successive meta-analyses have found that the use of openly licensed course materials, in and of itself, is unlikely to have a measurable impact on academic outcomes (for example, studies by Clinton and Khan and Mullens and Hoffman).

Rather, we predicted that our project design would lead to improved outcomes for all students, with a potential for greater benefit for historically underserved groups (as found by Colvard et al.), because students would have access to higher quality materials than those available commercially, implemented in well-designed courses with support from instructional designers.

We requested historical student outcomes data beginning in 2016 so that we could compare outcomes before and after open textbook adoption. Because the pandemic had an overall negative effect on learning outcomes (The impact of Covid-19 on student achievement: Evidence from a recent meta-analysis), we conducted two analyses. One analysis compared outcomes before the open textbook pilot (i.e., prior to 2022), and a second analysis compared outcomes for sections of courses taught during the grant period (2022-2024) that either piloted our open textbooks (our treatment group) or used other course materials (our comparison group, where the course materials were unknown). Student grade data was provided by the Higher Education Coordinating Commission Office of Research and Data.

Grades and DFW Rates

We found higher average grades overall for students in courses that piloted our open textbooks than for students in our comparison group. Students in open textbook pilot courses did not generally have lower DFW rates than those in our comparison group. However, we found that students in open textbook pilot courses were more likely to earn an A than students in our comparison group.

Our pilot open textbooks helped Pell students, students of color, and part-time students, but this study found no differential benefit based on demographics; all students in our pilots saw a similar benefit.

Implementation with an Instructional Designer

Our project design connected instructors with instructional designers to implement best practices during pilot course development. This collaboration introduced course pilot instructors to four frameworks for course design: Universal Design for Learning, Culturally Responsive Teaching, Transparency in Learning and Teaching, and Open Educational Practices, as described by Open Education Instructional Designer Veronica Vold in Equity-minded Open Course Design. Our study broke out student outcomes by level of implementation. Instructors who reported making at least four of the six changes listed in our survey instrument were considered high implementers.

Our study’s key finding is that pilot instructors who implemented more of these practices achieved better student outcomes than our overall group of pilot instructors. This finding was true for all of our measures: end-of-course grades, DFW rates, and A grade rates. In contrast to our findings for the pilot courses overall, students in open textbook pilot courses with highly implementing instructors had significantly lower DFW rates than students in similar courses in our comparison group. As with our overall findings, we found no differential benefit based on demographics in high-implementing courses; all students saw a similar benefit.

Put another way, the instructors who collaborated with an instructional designer to implement instructional design best practices were able to get the most leverage out of adopting affordable and high-quality open educational resources. This intervention enabled students to benefit from our open textbooks in the way that we hoped to see. This finding makes a contribution to the open education field by affirming that students benefit from affordable, high-quality course materials implemented with the support of an instructional designer.

Limitations

We have two caveats to offer about these student outcome results.

First, our outcomes are not generalizable to other openly licensed course materials. They apply only to the textbooks and courses developed through Open Oregon Educational Resources’ Targeted Pathways program because that was the sole focus of our research. Further, because our research was conducted during course pilots, students were using prelaunch versions of textbook manuscripts that were subsequently revised.

Second, our analysis aggregates information to a statewide level. Further research could examine effects by institution type, size, student population, etc. For instance, in an exploratory analysis, the main effect for students in pilot courses was observed in 4-year institutions but not in 2-year institutions; however, we don’t have a way to explain this discrepancy because disaggregating by institution type wasn’t part of our study design.

Student Experience and Instructor Practices

We developed additional targets as measures for grant activities to assess students’ affective experiences in our open textbook pilot courses and instructors’ changes to teaching practices as a result of their participation in the project. Self-report data from students and instructors was collected via survey by RMC Research. For the student survey data, we used a sample of students from each course to weight the responses equally, because some courses were much larger than others.
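The per-course weighting described above can be illustrated with a minimal sketch. This is not RMC Research's actual procedure, which the post does not specify; it shows one common approach, drawing an equal-size random sample of responses from each course so that large courses do not dominate the pooled results. The course names, responses, and sample size are all hypothetical.

```python
import random

random.seed(0)  # for reproducibility of this illustration

# Hypothetical survey responses; one course is much larger than the other.
responses_by_course = {
    "Sociology 101": ["positive"] * 80 + ["negative"] * 20,   # large course
    "Criminology 200": ["positive"] * 12 + ["negative"] * 3,  # small course
}

SAMPLE_SIZE = 10  # hypothetical per-course sample size

# Draw an equal-size sample from each course so each course
# contributes the same weight to the pooled analysis.
sampled = {
    course: random.sample(responses, min(SAMPLE_SIZE, len(responses)))
    for course, responses in responses_by_course.items()
}

for course, sample in sampled.items():
    print(course, len(sample))  # each course contributes 10 responses
```

With equal-size samples, a simple pooled percentage treats each course equally rather than in proportion to its enrollment.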

We expected to demonstrate a positive effect on students’ affective experience of the curriculum because of the project’s Equity, Diversity, and Inclusion approach. A synthesis of approximately 40 peer-reviewed studies on culturally relevant education by Aronson and Laughter finds that a culturally responsive approach has a measurable effect on both academic achievement and on affective student outcomes, including increased motivation, increased interest in content, increased engagement, and increased academic confidence. Our data did show a positive impact on students’ affective experience of the curriculum; in fact, we more than doubled our target for student self-reporting of positive impact.

In survey responses, students shared how they felt about the open textbook pilot courses:

“I really liked how the course had a free text book. It was nice to not worry about an additional class cost. I also appreciated how the instructor gave us two extension dates over the term. Life happens & it felt nice to be acknowledged like that.”

“What worked the best for me was videos in tandem with the textbook for assignments. I like having videos to reflect or write on as well as the textbook because it makes me more confident in my understanding of the textbook.”

“I genuinely loved how down-to-earth the textbook was. I loved that other students wrote it and I could relate it back to myself.”

“Overall, I loved the course – I wasn’t expecting to, but the way it was structured was very eye opening and I learned a lot.”

We also expected that most instructors would report changing at least one of their teaching practices as a result of working with our project, and we met this target. This finding aligns with previous research that Open Oregon Educational Resources conducted with RMC Research to determine the effectiveness of the Equity and Open Education Faculty Cohort Model developed by Jen Klaudinyi, Faculty Librarian at Portland Community College. In that study, nearly all participants reported that the training helped them make high-impact changes in their teaching practices (Spoiler Alert: Equity and Open Education Training Helps Faculty Make High-Impact Changes).

In survey responses, instructors shared how they changed their practices as a result of working with an instructional designer:

“I loved the tools for course design and have applied the concepts to designing other courses.”

“Based on my work with Veronica, I changed much of what I was doing in other courses, and it is definitely changing (improving, I hope!) how I access and create materials to share with students.”

“Working with Veronica is the bomb!!”

Replicating Colvard et al.

When we wrote our project proposal, we set targets that matched the findings from an influential 2018 article by Colvard, Watson, and Park, The Impact of Open Educational Resources on Various Student Success Metrics. Their study assessed the impact of OER adoption on student academic performance, specifically analyzing end-of-course grades and DFW rates. Colvard et al. informed our hypothesis that our project design would lead to improved outcomes for all students, with a potential for greater benefit for students identified with historically underserved groups. Our study did not replicate many of Colvard et al.’s results, except, as shown below, in courses with high-implementing instructors, where we exceeded them.

Because of the prominence of the Colvard et al. study in the open education field, we want to explore where our results fell short of our targets. That said, setting the targets during the planning phase involved some guesswork on our part, so we do not overemphasize these results in our findings. On the whole, we consider our student outcome results encouraging: our study found higher average grades for students in our open textbook pilot courses and more A grades overall, as well as lower DFW rates when students had instructors who implemented best practices during pilot course development. The section below is intended to offer analysis and questions that we hope will advance the open education field’s research agenda.

Targets and results are summarized in the table below. Had we met these targets, we would have exactly replicated the findings in Colvard et al.

| Project Target | Project Result |
| --- | --- |
| End-of-course grades will increase by 5% overall, with increases of 10% for Pell recipients, 13% for students of color, and 28% for part-time students. | Partially met: all end-of-course grades increased, but increases did not reach the targets. However, this target was exceeded for high-implementing instructors. |
| Letter D, fail, and withdraw (DFW) rates will decrease by 2% overall, with decreases of 4% for Pell recipients, 5% for students of color, and 10% for part-time students. | Partially met: DFW rates increased overall. However, this target was met for high-implementing instructors. |

End-of-course Grades

We found significantly higher end-of-course grades in our open textbook pilot courses than in the comparison courses, but our results did not reach the target. When we broke our findings out among demographic groups, we found that this was true regardless of Pell status, race/ethnicity, and part-time status.

| Student Group | Treatment Group Grade | Comparison Group Grade | Percent Difference | Project Target |
| --- | --- | --- | --- | --- |
| Overall | 3.20 | 3.13 | 2.24% | 5% |
| Pell Grantee | 3.12 | 3.02 | 3.31% | 10% |
| Students of Color | 3.07 | 2.98 | 3.02% | 13% |
| Part-Time | 3.08 | 3.04 | 1.32% | 28% |

Note. F = 0.0; D- = 0.7; D = 1.0; D+ = 1.3; C = 1.7–2.3; B = 2.7–3.3; A = 3.7–4.3. W grades are not included in the calculation of average grades. Students of Color includes students who are American Indian, Alaska Native, Black/African American, Native Hawaiian, Pacific Islander, Hispanic/Latine, or Multiracial/Multiethnic. Following the Colvard et al. study, Asian students are not included.
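For readers who want to check the table, the "Percent Difference" column can be reproduced as the relative difference between the treatment and comparison group mean grades. A minimal sketch, using only the figures reported above:

```python
# Reproduce the "Percent Difference" column from the grade table as the
# relative difference between treatment and comparison group means.
# All figures are taken directly from the table above.
grades = {
    "Overall": (3.20, 3.13),
    "Pell Grantee": (3.12, 3.02),
    "Students of Color": (3.07, 2.98),
    "Part-Time": (3.08, 3.04),
}

for group, (treatment, comparison) in grades.items():
    pct_diff = (treatment - comparison) / comparison * 100
    print(f"{group}: {pct_diff:.2f}%")
# Prints 2.24%, 3.31%, 3.02%, and 1.32% for the four groups,
# matching the table's Percent Difference column.
```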

DFW Rates

This target was not met. For the most part, DFW rates increased in our study. Pell students, students of color, and part-time students all had higher DFW rates than these same students in the comparison group. However, this difference was only statistically significant for students of color.

| Student Group | Treatment Group DFW Rate | Comparison Group DFW Rate | Treatment Group Grade A Rate | Comparison Group Grade A Rate |
| --- | --- | --- | --- | --- |
| Overall | 18% | 16% | 57% | 53% |
| Pell Grantee | 19% | 18% | 52% | 49% |
| Students of Color | 21% | 19% | 52% | 47% |
| Part-Time | 21% | 21% | 53% | 50% |

While DFW rates were slightly higher in the open textbook pilot courses than in the comparison courses, students in the open textbook courses were also more likely to receive an A grade.

Results for High-Implementing Instructors

As we stated above, our key finding was the positive impact of working with an instructional designer to implement best practices for course design while adopting an open textbook. Our study broke out student outcomes by level of implementation of the instructional design frameworks. When looking only at this subgroup of courses, we exceeded our targets for end-of-course grades and met our targets for DFW rates.

| Student Group | Treatment Group | Comparison Group | Percent Difference | Project Target |
| --- | --- | --- | --- | --- |
| High-implementing instructors end-of-course grade | 3.31 | 3.13 | 5.75% | 5% |
| High-implementing instructors DFW rate | 14% | 16% | -2% | -2% |
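The two rows above use different measures, which is worth making explicit. The grade comparison is a relative percent difference, while we read the DFW comparison as an absolute difference in percentage points (14% minus 16%); that reading is our assumption, though it is consistent with the "-2%" target. A minimal sketch of both calculations:

```python
# Grade row: relative percent difference between group means.
treatment_grade, comparison_grade = 3.31, 3.13
grade_pct_diff = (treatment_grade - comparison_grade) / comparison_grade * 100
print(f"Grade percent difference: {grade_pct_diff:.2f}%")  # 5.75%

# DFW row: absolute difference in percentage points (assumed reading).
treatment_dfw, comparison_dfw = 14, 16  # rates, in percent
dfw_point_diff = treatment_dfw - comparison_dfw
print(f"DFW difference: {dfw_point_diff} percentage points")  # -2
```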

However, as with the main analysis, we found no differential benefit based on demographics; all students saw a similar benefit in high-implementing courses.

Results in Context

While we are not overemphasizing these numeric results, it’s still a bit disappointing to miss our targets. Further, we did not replicate Colvard et al.’s signature finding that historically underserved students saw the greatest benefit from the use of open textbooks.

Our research design had some differences from the Colvard et al. study that may help to explain why we only partially replicated their results:

  • The Colvard et al. study included 21,822 students at a single university, with roughly equal numbers of students in the treatment and comparison groups. In contrast, our study includes a larger number of students at 7 universities and 17 community colleges, with a much larger number of students in the comparison group than the treatment group (approximately 3,254 students in the treatment group using open textbooks developed through the grant and 56,286 students in the comparison group during grant years).
  • The Colvard et al. study focused on adoption of OpenStax textbooks in Biology, History, Psychology, and Sociology courses. Our study looked at adoption of open textbook manuscripts in progress for courses in Criminology, Human Services, and Sociology.
  • Our research extends Colvard et al.’s work by analyzing instructors’ implementation levels (high vs. low), and the percentage of students earning an A.

Other research studies, such as those by Smith et al., Delgado et al., and Dempsey, have had mixed success in replicating the Colvard et al. findings. In the article The Impact of Free and Open Educational Resource Adoption on Community College Student Achievement, Megan Dempsey points out that the relationship between student withdrawal and open textbook adoption is correlative, and that qualitative follow-up research would be needed to fully understand why students withdraw from courses and whether there is a causal relationship with course material selection.

Open educators can consider whether the findings of the study shared here should inform the field’s research agenda. We can better understand how to use Colvard et al.’s findings to assess student outcomes. We can follow Dempsey’s recommendation to pursue mixed-methods studies. We can also determine whether differential benefit for historically underserved students is a goal in our field, and if so, how to consistently achieve it.

Funding

Our grants drew from Governor’s Emergency Education Relief funding and the Fund for the Improvement of Postsecondary Education (FIPSE) in the U.S. Department of Education (eighty percent of the total cost of the program is funded by FIPSE, with the remaining twenty percent representing in-kind personnel costs funded by Open Oregon Educational Resources).

The contents of this post were developed under a grant from the Fund for the Improvement of Postsecondary Education (FIPSE), U.S. Department of Education. However, those contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government.
