
Facebook crackdown just the beginning as regulators look to protect social media consumers

Facebook, which listed on the Nasdaq in May, is the latest target of Australian regulators. AAP

The Australian Competition and Consumer Commission’s (ACCC’s) focus on Facebook comment moderation is part of a broader but potentially futile campaign to protect consumers using social media, experts say.

Yesterday the Australian Competition and Consumer Commission said large companies should act to remove offensive material from their Facebook pages within 24 hours, or risk being fined.

The ACCC crackdown comes after both the Advertising Standards Bureau and advertising industry body the Communications Council deemed third-party comments on Facebook pages to be advertisements, making them subject to the advertising industry’s codes of ethics.

The ACCC has taken a very strategic approach towards targeting large technology companies, said Matthew Rimmer, ARC Future Fellow and Associate Professor in Intellectual Property at the Australian National University.

“The important context in this is the ACCC have made it a priority in the last year to think about defending and protecting Australian consumer law in cyberspace.”

Before throwing its weight behind the Advertising Standards Bureau over Facebook comment moderation, the regulator was involved in major court actions with both Apple and Google.

“It’s quite an interesting approach in terms of taking on the biggest players and really sending a more general message about how they want to deal with misleading and deceptive conduct in the new digital economy,” Dr Rimmer said.

It also comes at a time, Dr Rimmer said, when companies are relying upon social interactions and testimonials to boost their reputation and get consumers to do their advertising for them.

It all goes towards holding companies more responsible for their marketing claims, said Sean Rintel, lecturer in strategic communication at the University of Queensland.

“In some ways you’d hope that consumers might be a little more wary of advertising, but it gets harder and harder to tell what advertising is, especially as organisations like Facebook, Google and Bing are trying to build the recommendations of your friends into search results,” Mr Rintel said.

“They’re obviously trying to get as much information about you and your friends so they can sell advertising and sell you to advertisers, and in a sense they’re also trying to get you what you want by showing that your friends like a service,” Mr Rintel said. “But if no one’s being held to account for this, then what counts as a real search result, one that includes recommendations from friends or people who are supposed to know, becomes much more dubious.”

Mr Rintel said astroturfing, in which an organisation, corporation or political group with a particular agenda plants comments on a site or Facebook page that appear to come from a grassroots organisation or citizen group, is becoming a bigger problem as the use of social media sites grows.

“The ease of astroturfing campaigns, especially digital astroturfing campaigns, is huge. The bar for that is so low that I do think we should be protecting consumers against it,” Mr Rintel said.

But David Glance, director of the Centre for Software Practice at the University of Western Australia, said the problem with social media regulation is that it is impossible to enforce.

“Facebook pulled the Aboriginal memes page, and within half a day it had been put back up again,” Associate Professor Glance said.

Associate Professor Glance added that when asked to adjudicate, regulators tend to err on the side of companies. “Even though they (the ACCC) have made this blanket statement, which hasn’t been challenged in court, it’s hard to see how they will enforce it and what it means in practical terms.”

Mr Rintel said many businesses haven’t had to spend much money on social media in the past. “This is a redistribution of the cost of doing business,” Mr Rintel said.

“They will be able to claim this in the same way they claim other sorts of business expenses. The cost of either hiring moderators or developing technologies to help moderation is a cost of doing business. Arguably there’s also a new kind of business which might pop up which is a business aimed at helping other businesses moderate this sort of stuff.”

Associate Professor Glance agreed, and argued that software and algorithms designed to flag spam and objectionable comments would emerge over time.

“The problem is companies don’t budget for the appropriate level of staff in social media anyway. This takes it out of the admin person’s spare time job to something which people have to do and take seriously.”
