Students chatting at a table while one looks at a tablet. Brooke Cagle/Unsplash

5 problems with the Student Experience Survey’s attempt to understand what’s going on in higher education post-COVID

Each year tens of thousands of higher education students complete the Student Experience Survey. It’s seen as a litmus test of student engagement, satisfaction and educational quality. But do the ways in which institutions and governments try to understand student experiences still add up?

The pandemic has transformed enrolment patterns and the ways in which students interact with their institutions and the courses they offer. We suggest the data from the 2021 survey released today no longer adequately capture students’ experience of study. The current version of the survey was designed for a time when modes of study were more clearly defined than they have become since COVID-19 emerged.

The student survey is part of the Australian Quality Indicators for Learning and Teaching (QILT) suite of measures for higher education. The 2021 report shows ratings from younger and internal (classroom-based) students are more positive than in 2020. According to the report, this “can likely be attributed to some return to on-campus learning and also a change in the expectations and experience of students”.


Read more: COVID has changed students' needs and expectations. How do universities respond?


But how are “internal” students engaging in their studies? Does learning look the same today as it did in 2019, and should it?

New forms of flexibility in students’ modes of study have to be matched with new forms of support to enable them to make smart choices. The mode of study categorised as internal for the survey now includes so much variation that it no longer serves a useful function for reporting and analysis purposes.


Read more: Digital learning is real-world learning. That's why blended on-campus and online study is best


Why QILT results matter

Individual higher education providers might use results to:

  • set key performance indicators – for example, “by 2030, we will be in the top 3 universities for learner engagement”

  • market themselves – “we are the top Australian university for teaching quality”

  • undertake evidence-informed planning – “develop a sense-of-belonging roadmap to increase scores”.

Student survey data are also used in research that informs policymakers. Drawing on many years of survey results, social scientists analyse datasets to answer big, high-level questions.

It’s more than a matter of comparing universities and providers. Questions of equity and access are investigated. For example, how are rural and regional students engaging in higher education?

These data are also used in research alongside other national datasets. For example, reports from the National Centre for Student Equity in Higher Education at Curtin University demonstrate the importance of such data.

COVID has changed how we study

The pandemic shone a light on issues of student equity as mode of study shifted (as a recent review showed). Mode of attendance is defined as:

  • internal: classroom-based

  • external: online, correspondence, and electronic-based (the language used for data-collection purposes shows how outdated it is)

  • multimodal: mix of internal and external.

In 2019, about 75% of Australian higher education students were enrolled as internal students. Multimodal studies accounted for roughly 14%.

Even at that time, it could have been argued that the lines between internal (classroom-based) and external (online) were already becoming blurred. Lecture recordings, learning management systems, flipped classrooms, endless debates about the “lecture”, and growth in digital technologies not only broadened access to knowledge but also enabled a mix of online and in-class interaction.

The use of existing technologies was a key reason the higher education sector could pivot online within a week when the pandemic hit in early 2020. Imagine if the pandemic had happened in 2005 instead of 2020. Without these technologies, higher education institutions would simply have had to shut down.

Now we have had two years’ experience of online learning and new modes of study. Examples include attending via Zoom rooms, live online classes, hyflex delivery (making class meetings and materials available so students can take part online or in person), swapping from on-campus to online due to lockdowns, and students moving between internal and external study on a week-by-week basis. Does the either-or categorisation of modes of attendance – internal or external – still make sense?

The old hard-and-fast divisions between learning online and learning in a physical class are no longer appropriate – technology means students can be involved in both at the same time. Shutterstock

Read more: Beyond Zoom, Teams and video lectures — what do university students really want from online learning?


5 problems with categorising attendance this way

We have identified at least five problems with the current survey categorisation of modes of attendance:

1. categorising attendance as purely one or other mode, rather than a combination of modes, stifles research and analysis of important national datasets

2. the existing categorisations stifle innovation, preventing institutions from creating distinctive blends of modes of teaching and learning

3. this categorisation perpetuates an outdated, either/or mindset that permeates discussion in the sector

4. it masks important implications of differences between new and established modes of attendance, including:

  • hidden workloads for staff, leading to questions of burnout and mental health

  • unclear expectations for students, which hinder decision-making and effective study approaches

  • hidden costs and unclear planning processes for differing modes of study

  • lack of clarity about which blurred modes of study are being offered, which can restrict access to higher education and create obstacles to success for equity students.

5. the sector is missing opportunities to gather relevant mass-scale data on modes of attendance to guide policy and practice.

Sector needs to agree on a new model

The crude categorisation of modes of study is hindering evidence-based decision-making. Across the sector, institutions are scrambling to sort out how best to maintain the flexibility many students now demand while ensuring students meet expected learning outcomes. And institutions need to do so in ways that are sustainable and healthy for staff.

As the chaos of the pandemic hopefully subsides, the higher education sector would benefit from a sector-wide process of developing an agreed way of describing the full range of modes of attendance. A framework is needed that builds a shared understanding of all these modes. This would enable institutions to better plan, resource, innovate and engage students and staff.

Such a framework could then inform ongoing national data collection, such as QILT, so social scientists and educational researchers can, in turn, better guide policy and practice.
