A teacher walks into a classroom and begins a lesson. As she speaks, the audio is translated in real time into a variety of languages that students have pre-selected, so each can hear the lecturer’s voice in their own language. The audio can even be delivered directly into a student’s ear canal so that it doesn’t disturb anyone else. The lecturer’s voice is also transcribed in real time, appearing in an augmented reality display that overlays digital content on the students’ visual field.
As the lesson progresses, students identify concepts they feel need further clarification. They submit highly individual queries to search engines that use artificial intelligence algorithms to filter and synthesise results from a variety of sources. This information is presented in their augmented reality system, along with the sources used, and additional detail in the form of images and animations.
All of the additional information gathered by students is collated into a single set of notes for the lesson, along with video and audio recordings of the interactions. It’s then published to the class server.
This isn’t science fiction. All of the technology described here currently exists. Over time it will become more automated, economical and accurate.
What does a scenario like this mean for lecturers who think that “teaching” means selecting and packaging information for students? There are many excellent theoretical reasons why simply covering the content or “getting through the syllabus” has no place in higher education. But for the purposes of this article I’ll focus on the powerful practical reasons why lecturers who merely cover the content are on a guaranteed path to redundancy.
The future isn’t coming - it’s here
The technology described above may sound outlandish and seem totally out of most students’ reach. But consider the humble - and ubiquitous - smartphone. A decade ago, the iPhone didn’t exist. Five years ago most students in my classes at a South African university didn’t have smartphones. Today, most do. Research shows that this growth is mirrored across Africa. The first cellphones were prohibitively expensive, but now smartphones and tablets are handed out to people opening a bank account. The technology on these phones is also becoming increasingly powerful, and will continue to advance so that what is cutting edge today will be mainstream in about five years’ time.
This educational technology can change the way that university students learn. But ultimately, machines can’t replace teachers. Unless, that is, teachers are just selecting and packaging content with a view to “getting through the syllabus”. As demonstrated above, computers and algorithms are becoming increasingly adept at the filtering and synthesis of specialised information. Teachers who focus on the real role of universities - teaching students how to think deeply and critically - and who have an open mind, needn’t fear this technology.
Crucial role of universities
In a society where machines are taking over more and more of our decision-making, we must acknowledge that the value of a university does not lie in academics who see their role as controlling access to specialised knowledge.
Rather, it lies in the fact that higher education institutions are spaces that encourage in-depth investigation into the nature of the world. The best university teachers don’t focus narrowly on content, because doing so would reduce their role to that of an information filter: someone who simply decides which content is important to cover.
Digital tools are quickly reaching the point where algorithms will outperform experts, not only in filtering content but also in synthesising it. Teachers should embrace this technology by encouraging their students to build knowledge through digital networks both within and beyond the academy. That way they will never become redundant. And they’ll ensure that their graduates are critical thinkers, not just technological gurus.