Five thousand people on Newstart or Youth Allowance may be targeted for a drug test trial. AAP Image/Dan Peled

Drug testing welfare recipients raises questions about data profiling and discrimination

The Australian government’s proposed random drug test trial for welfare recipients is not so random.

Announced as part of the 2017 federal budget, Treasurer Scott Morrison wants 5,000 people on Newstart or Youth Allowance in three locations to undergo random drug testing from January next year.

Traces of drugs including ecstasy, marijuana and ice will be sought in saliva, hair follicle and urine samples. If drugs are detected, the recipient could have their welfare payments quarantined.

But rather than doing people “a big favour”, as Prime Minister Malcolm Turnbull put it on ABC Radio Wednesday, such data-based programs often disproportionately target those of low socio-economic status.

Concerns are already being raised that the trial undermines the needs-based focus of Australia’s welfare system. The use of data tools to profile people seeking help only adds to the problem.

How job seekers will be profiled

The characterisation of the testing as “random” is questionable.

The government says the testing will be “based on a data-driven profiling tool developed for the trial to identify relevant characteristics that indicate a higher risk of substance abuse issues”.

At a press conference on Thursday, Minister for Social Services Christian Porter said a “combination of data” developed with Data61 and the CSIRO would be used, as well as internal information from the Department of Human Services and the Department of Social Services.

“We’ll put all of that together and identify a broad group of people and then randomly select inside that broad group inside each of the three trial sites,” he said.

In an interview with BuzzFeed on Thursday, Scott Morrison also suggested the three test areas may be chosen using the results of a national program that analyses drug traces in wastewater.

While we may think profiles built from such data sets are rational and without prejudice, computational models are not necessarily free from discrimination.

Like any model, they are not “pure”: they rest on human-generated assumptions.

Stereotyping on steroids

The use of data to profile consumers is nothing new.

Insurance companies, for example, use it to assess customer risk based on factors such as age, profession and type of car. This modelling is used to identify those more “at risk” of having an accident, with insurance premiums priced accordingly.

Even the most careful under-25 driver will feel the impact of falling into a high risk age-based profile, whether or not they’re a bad driver.
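To make those mechanics concrete, here is a toy sketch of profile-based pricing in Python. The loadings, rates and function name are invented for illustration and do not reflect any real insurer's model; the point is simply that the premium is set by the group a customer falls into, not by how they actually drive.

```python
# A toy illustration (hypothetical rates, not any insurer's real model) of how
# profile-based pricing works: the quote is driven by the group you fall into,
# not by your individual record.
BASE_PREMIUM = 800.0

def quote(age: int, high_risk_occupation: bool, sports_car: bool,
          at_fault_claims: int) -> float:
    premium = BASE_PREMIUM
    if age < 25:                  # everyone under 25 wears the age loading...
        premium *= 1.6
    if high_risk_occupation:
        premium *= 1.2
    if sports_car:
        premium *= 1.3
    premium *= 1.0 + 0.25 * at_fault_claims
    return round(premium, 2)

# ...so a careful 22-year-old with a clean record still pays more than a
# 40-year-old, purely because of the profile they share with riskier drivers.
print(quote(age=22, high_risk_occupation=False, sports_car=False, at_fault_claims=0))
print(quote(age=40, high_risk_occupation=False, sports_car=False, at_fault_claims=0))
```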

But data profiling can become stereotyping on steroids, with human assumptions magnified by computational power.

Data-driven profiling often looks for a target attribute – or class attribute – that the profiler is most interested in predicting. When selecting groups to participate in a drug test trial, for instance, the class attribute could be a sensitive grouping such as race, gender, socio-economic background or education level.

This could lead to discriminatory practices where an entire category of people is considered suspect and therefore is more heavily scrutinised.
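A minimal sketch, assuming a simple decision-tree profiler and entirely made-up training data, shows how this can happen in practice: whatever features correlate with the “higher risk” label end up doing the selecting, even when those features are little more than proxies for socio-economic status. None of this reflects the government's actual tool, whose details have not been published.

```python
# A minimal, illustrative sketch (not the government's actual profiling tool)
# of how a data-driven profiler predicts a "class attribute". The features,
# records and risk labels below are entirely hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical records: [age, years_of_education, neighbourhood_disadvantage_index]
X = np.array([
    [22, 10, 8],
    [45, 16, 2],
    [19, 11, 9],
    [38, 15, 3],
    [24, 12, 7],
    [52, 17, 1],
])
# The class attribute the profiler wants to predict: 1 = "higher risk", 0 = "lower risk"
y = np.array([1, 0, 1, 0, 1, 0])

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The model builds its prediction out of whatever correlates with the label --
# here, education and neighbourhood disadvantage -- which is how socio-economic
# proxies end up deciding who gets scrutinised.
print(dict(zip(["age", "education", "disadvantage_index"], model.feature_importances_)))
print(model.predict([[21, 10, 8]]))  # a new person scored purely on their profile
```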

A 2012 analysis of American privacy laws, for example, found lower socio-economic groups were more heavily affected by invasive surveillance, such as mandatory drug testing.

I suggest that to be representative and not discriminatory, the sites selected for the federal government’s drug testing trial would need to have the same proportion of drug users as the general population, with the same distribution of ages, gender split and mix of high- and low-skilled labour.
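One rough way to check this, sketched below with made-up figures, is to compare a trial site's demographic mix against the national mix and flag any group that is noticeably over-represented. The age bands, proportions and threshold are arbitrary choices for illustration only.

```python
# A rough sketch of the representativeness check suggested above, with
# hypothetical numbers: flag any demographic group whose share at a trial site
# exceeds its share of the general population by more than a tolerance.
POPULATION = {"under_25": 0.30, "25_to_44": 0.40, "45_plus": 0.30}

def over_represented(site_mix: dict, population: dict, tolerance: float = 0.05):
    """Return groups over-represented at the site by more than `tolerance`
    (an arbitrary threshold chosen for this illustration)."""
    return {
        group: round(site_mix.get(group, 0.0) - population[group], 2)
        for group in population
        if site_mix.get(group, 0.0) - population[group] > tolerance
    }

# A hypothetical trial site that skews young:
site = {"under_25": 0.48, "25_to_44": 0.37, "45_plus": 0.15}
print(over_represented(site, POPULATION))  # flags the under-25 group
```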

Instead, as previously mentioned, the trial site selection may be informed in part by analysis of drug trace levels found in sewage.

Porter said “astonishingly high” levels of ice usage found in some regions helped prompt the trial: “We want to drive behavioural change in some of those areas at that critical point where people are job searching,” he added.

Too many unanswered questions

The data-driven profiling of welfare recipients raises a number of ethical questions the government should answer.

Among them:

  • Can the security of the data be adequately protected?

  • Will the information be used solely for its original purpose?

  • What procedures will there be to challenge being selected for a drug test?

Not to mention, if the welfare recipient is open to scrutiny, to what extent is the “contracted third party provider” running the testing also required to be transparent? Already, the cost of the measure has been deemed “commercial-in-confidence”.

A government spokesperson declined to comment, saying it would make further announcements about the trial at an appropriate time.

Many of our actions are now observable, searchable and traceable, and surveillance is more intrusive and extensive than ever. But the impact of this can fall more heavily on disadvantaged communities.

So-called “random” drug testing is just another example of this worrisome trend.
