Collecting local data to advocate for Oregon State students

December 1, 2022

This post was contributed by Stefanie Buck, Director of Open Educational Resources, Oregon State University. 

The recent Florida Student Textbook and Course Materials Survey and U.S. PIRG's Fixing the Broken Textbook Market reports tell us that anywhere from 64-66% of students don't purchase their textbooks because of cost. These numbers are cause for concern, and we often rely on these national and regional figures when talking to our faculty and administrators about textbook affordability. However, Florida is not Oregon, and while the numbers are useful, it was not clear how Oregon State University measures up to these figures. Are our numbers as high? Do we have equal cause for concern?

To find out, I received permission to run a modified version of the Florida survey here at Oregon State University during March 2022, which coincided with Open Education Week. I say "modified" because, unlike the Florida survey, I added questions gathering data on respondents' ethnicity, Pell eligibility, and first-generation status. Researchers in the field have called upon all of us as OER leaders to conduct research where we can disaggregate the results to see how the high cost of textbooks affects these specific populations (two examples of open education research with disaggregated results are The Impact of Open Educational Resources on Various Student Success Metrics by Nicholas B. Colvard, C. Edward Watson, and Hyojin Park; and Efficacy of Open Textbook Adoption on Learning Performance and Course Withdrawal Rates: A Meta-Analysis by Virginia Clinton and Shafiq Khan).

Read the full report: 2022 Oregon State University Student Textbooks and Course Materials Survey: Results and Findings.

Running the survey

Running the Florida survey here at Oregon State required IRB approval, which was granted in April of 2021. I chose to run the survey around the time of Open Education Week (2022) because we already had activities planned and communications going out.

One of my big concerns in running this survey was getting student participation. We know that student participation in surveys is down, and there is a lot of competition for our students' attention. Oregon State is a large institution, but it was not possible for us to email all 25,000 undergraduates and ask them to take the survey (the email would have been flagged as spam and my email account would have been turned off). That made reaching out a little more difficult. Instead, we requested a random sample of 7,000 undergraduates from the Institutional Analytics and Reporting unit and asked the Associated Students of Oregon State University to help advertise the survey on their social media sites.

In the end, we received nearly 500 usable responses. Although the response rate was lower than we had hoped, it was still a large enough sample to provide important descriptive information, and, working with the Ecampus Research Unit (ECRU), we were able to glean some useful insights about our sample.

What did the survey tell us?

61% of the respondents do not purchase their textbooks because of cost

This is lower than what the Florida survey reports, but not by much. It means that in any given class, more than half of the students may not have the textbook. However, it should also be pointed out that many students said they did not purchase the textbook because they were unsure whether they would actually need it; according to the open-ended responses, many waited until the second or third week to purchase the textbook. This warrants further investigation and some clarification in the next version of the survey. Are students not purchasing textbooks because they literally cannot afford them, or are they, based on experience, waiting to see if they really need them in the first place? In either case, they don't have their course materials on the first day of class, but the problem may have to be addressed in different ways. If students are waiting to see if they really need the textbook, the problem may be one of pedagogy rather than finances.

Students are good economists

At Oregon State University, strategies students use to lower costs include sharing the textbook with a friend, using the library reserves instead, or downloading an illegal copy (students were very forthright on this topic). Some very candidly told us they just didn't purchase the textbook and "hoped for the best." Additionally, we found that 38% don't register for a course, 44.5% take fewer classes, 18% drop a course, and 14% withdraw from a course because of textbook costs. Again, these numbers are lower than what both the Florida and U.S. PIRG surveys report, but let's think about what that means for students, departments, and institutions. Any attrition means a loss of revenue, so colleges and departments will be hurt financially if students drop out or withdraw. For the institution, any attrition is a financial loss that could have far-reaching consequences for the university's budget allocation. For students, it means a longer time to graduation and a possible increase in their debt.

The cost of textbooks impacts our BIPOC students more

Across the three groupings of students (ethnicity, Pell eligibility, and first-generation status), we found that historically underserved students were more likely to choose "frequently" when selecting "took fewer courses" and "dropped a course." They were also more likely to choose "frequently" or "sometimes" when selecting "withdrew from a course" and "sometimes" when selecting "failed a course because I could not afford to buy the textbook." We did not find any such differences with our first-generation or Pell Grant-eligible students.
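For anyone planning a similar analysis, here is a minimal sketch of one way to disaggregate Likert-style responses by student group once the survey results are exported. The file name and column names ("ethnicity_group", "dropped_course_freq") are hypothetical stand-ins rather than the actual structure of our Qualtrics export, and your statistical partner should advise on whether a chi-square test (or something else) is appropriate for your sample size.

```python
# A minimal sketch of disaggregating Likert-style survey responses by student group.
# The CSV file and column names below are hypothetical; map your own export to this shape first.
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.read_csv("survey_responses.csv")

# Cross-tabulate how often each group reports dropping a course because of textbook cost
table = pd.crosstab(responses["ethnicity_group"], responses["dropped_course_freq"])
print(table)

# Chi-square test of independence as a first check for differences between groups
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.4f}, dof={dof}")
```

The same cross-tabulation can be repeated for Pell eligibility and first-generation status, which is essentially how the three groupings above were compared.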

Why run the survey locally?

Running the survey at my institution was not easy: going through the IRB, building the survey in Qualtrics, and getting it out to the students were all time-intensive activities, but it was worth the effort. Now, I can go to my administration and faculty and tell them with some confidence what the picture is here at Oregon State.

I hope that in the future, more Oregon institutions will consider running the survey. There are many good reasons to undertake this work at the local level.

  1. We need reliable local data to tell our story. Telling our story is what gets us attention; just telling the national story is not quite as potent.
  2. Local initiatives often start with local data. This is true of many industries and higher education is no exception.
  3. We need local data in order to make data-driven decisions based on our specific needs. For example, at Oregon State University, the drop and withdrawal rates are lower than what we see in the Florida survey and therefore may be less of a concern for us. The share of students choosing to take fewer classes is higher than in the Florida survey, meaning this could be more of an issue for us. This will help shape our interventions.
  4. Having local data is essential to testing the success of our interventions. If we implement something new, how will we know if it was successful or not if we’re not gathering the local data?
  5. Local data helps us select appropriate metrics and allocate resources to the areas of greatest priority, which may be different from what the national data tell us.

Also, it seems that local data resonates more with our faculty and administrators. Whether that is true remains to be seen, but it certainly felt more relevant when I presented my findings to various departments and administrators.

Reflections

It’s not easy to come by good data. In fact, it is downright frustrating on occasion. There is a lot of data that we OER leaders must track, and running an institution-wide survey is a daunting task. Here are a few things I learned along the way that made this project more manageable:

  1. Find yourself a partner, someone who can do the statistical analysis. Being able to dig into the data is where the real benefit of having local data lies.
  2. Engage your student government. They can help get the word out about the survey.
  3. Work with your registrar to get a random sample of student emails, if your institution allows. Direct emails always work better.

I plan to run this survey every two years (every year seemed like a bit much) and, hopefully, we will be able to get more responses in the future. This first survey has helped us set a baseline, and subsequent surveys will give us something to compare against. I want to continue to share the results of my work because more local data will benefit everyone in Oregon, and it offers a potential model for how to disaggregate the results of this kind of research.

The data that was collected in this survey was priceless because now I feel I can go to my faculty and administrators with 100% relevant, local data. I have been sharing the results of the survey with faculty and administrators in the hopes that this will resonate more with them than the national data. So far, my presentations have been well received; I even got to present to the new President of the University! Raising awareness is an important part of our jobs and going local is one way of doing it.
