Support for a Local Approach to Statewide OER Data Collection

July 10, 2019

Thank you to Bob Schroeder for his research assistance and feedback on a draft of this post. 

There isn’t an agreed-upon method for calculating student savings resulting from the use of no-cost or low-cost course materials. For example, this spring I participated in a webinar with three other OER leaders titled Calculating Cost Savings Associated with OER Implementation, and each of us presented a different – and valid – method, backed by research and practice.

This issue is relevant to me because I want to be able to answer the research question: What is the estimated student savings represented by the statewide no-cost/low-cost schedule designation mandated by Oregon’s HB 2871?

In Spring 2018, I piloted a method of answering this question by having each campus estimate student savings using whatever method best fit its local context (shared in the post Estimated 2017-18 Student Savings in No-Cost/Low-Cost Courses). The resulting statewide estimate aggregates the reported data irrespective of method, and institutional information clarifying which method each participating college or university used is included alongside it. This means that the people who created the data make the case for how it should be used. Each institution shows its work by sharing its method, so the aggregated estimate can be understood as a sum of differentiated components.

This year, I continued to ask each college and university to report savings data using the method that works best for its own local campus environment. Across 19 institutions, courses carrying the no-cost or low-cost schedule designation are estimated to have saved over 375,000 students (by headcount), across 21,000 course sections, approximately $34 million over two academic years. The post Estimated 2017-19 Student Savings in No-Cost/Low-Cost Courses provides more detail on the savings data, while this post shares my thoughts on why an aggregate of mixed methods is appropriate for a statewide student savings calculation.
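To give a rough sense of scale, here is a back-of-envelope sketch using only the rounded statewide figures above; it is illustrative arithmetic, not new data, and the per-institution numbers behind the aggregate were produced with different local methods.

```python
# Back-of-envelope scale check using the rounded statewide totals above.
# Illustrative only: the underlying per-institution estimates were
# calculated with a mix of local methods.

total_savings = 34_000_000     # approx. dollars saved, 2017-19
student_headcount = 375_000    # students (by headcount) in designated courses
course_sections = 21_000       # designated course sections

print(f"~${total_savings / student_headcount:,.0f} saved per student enrollment")
print(f"~${total_savings / course_sections:,.0f} saved per designated course section")
```

Run as written, this works out to very roughly $90 per student enrollment and about $1,600 per designated course section, which is the kind of order-of-magnitude framing the rest of this post argues for.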

Emphasis on estimate

I believe that the most important factor to consider in estimating student savings numbers is that any result will be just that – an estimate. We’ll never know for certain the exact dollar amount saved, for two reasons. First, it is difficult to capture complete information about course material adoptions because faculty do not always report their adoptions accurately or on time. Second, it is difficult to pinpoint the amount saved because student behavior is so varied: some buy used, rent, unbundle, share, sell back, order overseas, pirate, or skip a purchase altogether, and pricing for every option can change from one day to the next.

It’s best practice to be transparent in communicating that savings reports are estimates. If the discussion shifts to the accuracy of the exact number presented, it has gotten off track. It’s most effective to stay focused on the big picture: saving money has a tremendous impact on students, regardless of the method used to calculate the dollar amount.

From a statewide perspective, since all conclusions would be partial anyway, imposing a top-down research methodology wouldn’t necessarily result in more accurate data. The number representing student savings would still be an estimate, but one arrived at through one research method rather than many.

Emphasis on local

This being the case, it makes sense to prioritize local decisions about data. In Oregon, OER initiatives that have been running for years have developed practices for tracking and reporting savings. It would be problematic to impose a change on these institutions because their past and future data would no longer be comparable. Meanwhile, institutions that are just beginning to track student savings are entering discussions about what method will be most meaningful to their own campus stakeholders and will match the resources available for the project.

Foregrounding local needs and expertise is a widely accepted research approach that has gained traction through use in overlapping theoretical frameworks: feminist, postcolonial, queer, and critical race theories, and so on. Each of these interpretivist methodologies affirms the validity of multiple perspectives on the truth, rather than accepting the positivist view that objectivity exists.[1]

These approaches sit well with participatory research methods, which invite the researcher and research subjects to collaborate. In this case, I’ve asked each campus to determine a method of calculating savings represented by the no-cost/low-cost schedule designation that makes the most sense locally. Stakeholders at each campus understand the choices that they made and can explain them to others if needed (for example, if a student sees a banner celebrating the savings at a college and asks a librarian where the numbers come from). I’ve shared templates and examples, without prescribing a statewide method.

Aggregating local estimates

If everyone did their savings calculations the same way, we could compare apples to apples. In this case, I’ve let each campus decide what kind of fruit to send me: apples, oranges, grapes, and so on. Can I make a convincing case that we should accept a fruit basket when we were expecting a bushel of apples?

My opinion is yes. Everyone is sending me their best estimate, on their own terms, of the student savings represented by the no-cost/low-cost schedule designation. Adding the estimates together results, again, in an estimate. The aggregated estimate is valuable not for its precision, but because it enables us to understand the scope of the impact. What order of magnitude are the savings we’re talking about? Thousands, hundreds of thousands, millions, or more? How do they compare to other big numbers that may be related, such as the total cost of attendance or the additional cost of tuition hikes?
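As a minimal sketch of what this fruit-basket aggregation looks like in practice, the snippet below adds up locally produced estimates while keeping each institution’s method label attached to its number. The institutions, methods, and dollar amounts are hypothetical placeholders, not the actual reported data; the point is only that the statewide total remains a sum of differentiated, documented components.

```python
# Hypothetical example: aggregate local estimates without discarding
# the method each institution used to produce its number.

reports = [
    {"institution": "College A", "method": "per-section average cost avoided", "savings": 250_000},
    {"institution": "University B", "method": "replaced-textbook list price", "savings": 1_100_000},
    {"institution": "College C", "method": "bookstore price comparison", "savings": 480_000},
]

statewide_estimate = sum(r["savings"] for r in reports)

print(f"Statewide estimate: ~${statewide_estimate:,}")
for r in reports:
    print(f'  {r["institution"]}: ${r["savings"]:,} (method: {r["method"]})')
```

Keeping the method alongside each figure is what lets each campus show its work, and lets anyone reading the aggregate see exactly which kinds of fruit went into the basket.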

As one example, SPARC estimates that use of OER has saved students one billion dollars in five years. A billion is an order of magnitude that differentiates this bold claim from other research. It is also a very round number. We can’t be more specific than our methodology lets us be, and in this case the researchers were aggregating many data points collected with an even wider range of methodologies than the range used among Oregon’s 24 public higher ed institutions.

As with the SPARC claim, my claim about student savings helps us understand the order of magnitude of the savings represented by the no-cost/low-cost schedule designation, and therefore helps us describe and understand the benefit to students. The schedule designation’s power is that it takes cost information held by the institution and shares it with students at the point of registration, enabling more informed choices and better advance knowledge of the total cost of attendance. Estimating the savings represented by the no-cost/low-cost designation shows the impact of faculty choices and helps build momentum for this work.

[1] For example:

Many feminist scholars and others influenced by postmodernism, poststructuralism, and postcolonialism argue that identity, knowledge, and meaning are constantly shifting in light of continued interaction with others, and hence, there is a continued reframing of meaning. They also highlight the myth of objectivity and discuss how positionality and multiple subjectivities shape the research process. This argument is not to suggest, however, that the research is then totally subjective and hence not dependable. Rather, the point in these forms of feminist research is to be clear and upfront about the nature of one’s subjectivity by addressing issues such as one’s theoretical perspective in conducting the research, the degree of participation of the researcher in the interview or observation, the role participants had in responding to the write-up, and the ways that the positionality of participant and researchers shaped the interactions and thus the research and knowledge production processes. This questioning enhances the dependability of the research in that it makes the process and assumptions clear. Paradoxically, by making the subjectivity clear, the research becomes more objective.

Given, L. M. (2008). Feminist epistemology. In The SAGE encyclopedia of qualitative research methods. Thousand Oaks, CA: SAGE Publications, Inc. doi: 10.4135/9781412963909
