Australia's media regulator to investigate how tech platforms handle misinformation


The Australian Communications and Media Authority is to be granted additional powers to review tech platforms' misinformation processes and improve a self-regulatory code being drafted by the industry.

The Australian Communications and Media Authority (ACMA) is to be given additional powers to oversee the activities of big tech platforms, including the ability to request information and data on how they have handled misinformation and disinformation, it said Monday (March 21).

It is the latest move by the Australian government to crack down on tech platforms like Facebook and Google, and follows the publication of a report by ACMA which raised concerns about the platforms' processes to report on and tackle misinformation and disinformation.

The report found that four in five (82%) Australian adults had seen misinformation about Covid and three quarters (76%) thought tech platforms should do more to reduce the amount of false and misleading content shared online.

Facebook, WeChat and Twitter have the highest levels of reported Covid misinformation, the report found.

However, it also found that stricter content moderation on large platforms like Facebook is driving some conspiracy groups to move to alternative social networks such as Telegram, Gab, Parler and Rumble.

If those platforms rejected industry-set content guidelines, "they may present a higher risk to the Australian community", ACMA said.

The regulator raised concerns about a single code of practice being developed by Digi, an Australian industry association representing Facebook, Google, Twitter and TikTok. The draft code requires platforms to sign up to the objective of "providing safeguards against harms that may arise from disinformation and misinformation". The platforms may opt-in to other code objectives, such as disrupting advertising incentives and supporting strategic research. The code provides signatories flexibility to implement measures to counter disinformation and misinformation in proportion to the risk of potential harm, and stresses the need to balance interventions with the need to protect users’ freedom of expression, privacy, and other rights.

The ACMA said the code's scope is limited, that it does not require platforms to have robust internal complaints processes, and that it should operate on an opt-out rather than an opt-in basis.

Digi said it supported the recommendations and noted it had already set up a system to process complaints about misinformation.

The ACMA’s report also raised concerns regarding the quality of platforms' annual transparency reports. "The initial set of transparency reports was inconsistent and, in general, lacked the level of detail necessary to benchmark individual platform performance or assess the effectiveness of measures," the ACMA said.

To address these concerns and strengthen the self-regulatory code, the government will grant the ACMA new regulatory powers.

The ACMA will also be able to enforce an internet industry code on uncooperative platforms.

The ACMA will continue to monitor platforms’ measures and the implementation of code arrangements to inform additional advice to the government by the end of the 2022-2023 financial year.

