
Using university language tests for migration and professional registration is problematic

Purpose-built English language tests should be applied only to the proficiency they were built to test. Shutterstock


The International English Language Testing System (IELTS) was designed to test whether international students are ready, in English terms, to begin study or training. Using it to test anything else is problematic. The same can be said of tests like the TOEFL iBT, which were built for academic purposes and should only be used in that way.

Applying these tests to other contexts, such as migration or professional registration, reduces their validity: they may not target the right proficiency for those purposes.

Take driver’s licences, for example. Car, bus and motorcycle licences all share the same road rules, but passing a car driver’s test doesn’t automatically qualify you to handle a motorcycle or drive a bus. Tests of English are similar: they often share basic commonalities, such as the road rules of grammar and basic vocabulary, but their focus and purpose vary.

IELTS scores

The test was created in 1989 in response to Australia opening its tertiary sector to international students. It was first used in 1999 for skilled migration. In 2001, it was used for general Australian migration. Language testing to gain professional registration was already established by this time.


Read more: English test for international students isn’t new, just more standardised


IELTS is a standardised test with four sub-tests: speaking, listening, reading, and writing. Scores range from 0 to 9, rising in 0.5-band increments.

Improvement is usually much easier to achieve at lower scores than at higher ones: moving from a band of one to two takes far less effort than moving from six to seven.

Institutions use the overall average score and/or the sub-test scores. To score seven on the reading and listening sub-tests, a candidate can still get about 25% of the questions wrong. There are big, audible differences in speaking ability between candidates at bands five, six, and seven. [The original article includes audio recordings of candidates at each of these bands.]
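The overall band mentioned above is the average of the four sub-test bands, rounded to the nearest half band. A minimal sketch of that arithmetic, assuming the rounding convention IELTS publishes (a mean ending in .25 rounds up to the next half band, and one ending in .75 rounds up to the next whole band):

```python
import math

def overall_band(listening, reading, writing, speaking):
    """Overall IELTS band: the mean of the four sub-test bands,
    rounded to the nearest half band, with ties rounding upward
    (so a mean of 6.25 becomes 6.5, and 6.75 becomes 7.0)."""
    mean = (listening + reading + writing + speaking) / 4
    # Doubling, rounding half-up, then halving snaps to 0.5 steps.
    return math.floor(mean * 2 + 0.5) / 2

# A candidate with sub-test bands 6.5, 6.5, 6.0, 6.0 (mean 6.25)
# receives an overall band of 6.5 under this convention.
print(overall_band(6.5, 6.5, 6.0, 6.0))  # 6.5
```

This illustrates why sub-test minimums matter separately from the overall band: a weak writing score can be masked by strong scores elsewhere.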

IELTS Academic and IELTS General Training differ in the writing and reading sub-tests; the listening and speaking sub-tests are the same. The academic reading sub-test is based on three long, complex texts, and the academic writing sub-test involves a formal essay and a description of information in a chart or diagram. The general training reading sub-test is based on everyday written materials (such as newspapers, brochures and advertisements), and its writing sub-test involves a letter and an essay written in a personal style.

Misuse and misapplications

Complaints about IELTS are many. Essentially, these arise from the misuse and misapplication of the test.

For example, currently in the UK, overseas nursing recruitment has halted because native English-speaking nurses are failing IELTS. The first problem is that IELTS was not meant to test health care communication. It focuses on topics that have nothing to do with nursing, such as bee communication or pagoda construction.

Unsurprisingly, the more appropriate Occupational English Test is now being considered. It is the only purpose-built test for the health care professions, so it’s surprising it hasn’t already become the sole test used for health care registration, with poorer-fitting tests such as IELTS and TOEFL iBT removed as alternative accrediting options.

The second problem is that IELTS was not designed for native English speakers, whose linguistic skill sets differ markedly from those of non-native speakers. Non-native speakers acquire English differently, often more through reading and writing, and typically have extensive experience of being tested in English. Native speakers, by contrast, have a lifetime of experience in English (roughly five years of speaking and listening before they learn to read and write), but far less experience of being tested on it.

IELTS doesn’t test knowledge of the slang, idioms, and phrasal verbs a patient will use regularly. Jargon and culturally specific material are edited out before each IELTS test is released, yet a native speaker would easily ace a test containing those elements, while a non-native speaker would struggle.

Currently, the Australian government accepts the results of a number of different independent standardised English tests to establish functional, vocational, proficient, and superior categories of English language skills for migrants.

Now the government is proposing to tighten the English language requirement to screen certain types of migrants, such as refugees, for citizenship. They would need an IELTS score of six, but it’s unclear whether the test will be IELTS Academic or IELTS General Training, which is much easier to pass.


Read more: English language bar for citizenship likely to further disadvantage refugees


The IELTS organisation has not officially disapproved of the use of the test beyond its original purpose. It publishes recommended test scores for study, but is silent on the test’s use for migration or work purposes. At least one of the original IELTS designers, however, has openly objected to such use.

We should find alternatives that better test proficiency for each specific purpose. These could include successful completion of an English course, alternative tests that focus on general proficiency (with no academic component), or a purpose-built new test.
