Legislation aimed at preventing misinformation and disinformation online could backfire and “undermine its own objectives”, according to critics.

Despite the latest revisions, digital communications expert Dr Michael Davis said the bill could “fail to adequately protect and promote user rights”.

The newly revised legislation would empower the Australian Communications and Media Authority (ACMA) to enforce consistency and transparency in content moderation, but would not give it the power to remove specific content directly.

Dr Davis, a research fellow at the Centre for Media Transition at the University of Technology Sydney, said that while the revised bill set better limits on the ACMA’s power as regulator, “the problem still remains that the scope of both ACMA power and platform accountability is set by the definitions of misinformation and disinformation”.

These definitions are narrow, which means only very specific scenarios fall within their scope.

“Unfortunately, because so many platform content moderation decisions will fall outside the scope of ACMA’s powers, the bill will fail to adequately protect and promote user rights,” Dr Davis added.

“If it doesn’t meet the definition or high threshold of harm set by the bill, then we have no way of making platforms accountable for those decisions.

“The scope problem is the biggest weakness of the bill and means that the bill effectively undermines its own objectives.”

There have been numerous recent high-profile examples of misinformation causing harm, including the misidentification of a university student as the perpetrator of the Bondi Junction stabbing attack in April.


Ben Cohen was accused by the Seven Network of being the attacker who stabbed six people to death after social media users incorrectly named him on platforms including X and TikTok. He later filed and settled a defamation case against Seven.

Cohen also called for the individuals who harassed him online to be identified and criminally charged.

In a statement issued through his lawyers, Cohen said: “People online who target individuals or communities should be held accountable for the consequences of their actions, and platforms should be more accountable for the content they host.”

Dr Rob Nicholls, an expert industry consultant at the University of Sydney and an experienced policy and regulatory specialist, said the revised bill had significantly narrowed the definition of “serious harm”.

“Something has to have severe and far-reaching consequences for the Australian community, a segment of the community or an individual,” the associate professor said, adding that the high threshold would give the ACMA the power to accept codes of conduct from digital platforms.

The Digital Industry Group Inc (DIGI) represents Meta, Google, Twitch, Apple and a variety of other digital platforms, all of which commit, under the DIGI Disinformation Code, to safeguards to protect Australians against harm from online disinformation and misinformation.

“Their code in the past has been pretty enforceable, but only if you want it to be enforced,” Dr Nicholls said.

Misinformation and disinformation are very challenging to define, and there is no universally accepted definition for either term, according to Dr Davis.

The government has chosen very narrow definitions to guide the operation of the legislation without imposing heavily on freedom of expression, he added, but this creates its own problem: the narrower the scope, the less accountable platforms become.

“They become only accountable for the decisions that they make above that threshold,” Dr Davis said. “That will not stop them from moderating posts that don’t meet their terms of service.”


According to Dr Davis, the government’s effort to balance freedom of expression with accountability creates challenges, as narrow definitions may allow platforms to avoid responsibility.

“There needs to be a broad scope of platform accountability, and someone else needs to set the parameters for what is considered acceptable or unacceptable online speech,” he said.

Dr Davis said this could be done by establishing something like the Oversight Board, which provides independent checks on Meta’s content moderation.

Changes such as making a board’s recommendations binding, and sharing a council or board across several platforms, would improve effectiveness and consistency.

Following the live-streamed stabbing of a bishop in Western Sydney in April, both Meta and X were put on notice by the eSafety Commissioner not to allow the content to be shown. Both companies removed the content in Australia; however, X refused a global takedown order and was taken to court by the government.

The matter was later thrown out of court, with a judge ruling the Australian government had no legal power to order the removal of content outside Australia.

Dr Nicholls said: “The problem with X, from a regulator’s perspective, is that X had been signatory to those agreed codes.

“In November 2023, DIGI, as the code enforcer, told X they were in breach and needed to do something about it. X did nothing and therefore got kicked out of the scheme.

“The ACMA will be able to enforce schemes that the industry has put together, and all the players, with the exception of X, are pretty comfortable with ACMA enforcing the rules that they wrote themselves.”

Main image by Mike MacKenzie/Flickr.