Meta Expands Fact-Checking Efforts in Australia to Combat Misinformation and Deepfakes Ahead of Election
Meta Platforms, the parent company of Facebook and Instagram, announced on Tuesday that it is strengthening its independent fact-checking program in Australia to identify and reduce false content and deepfakes ahead of the national election due by May.
In a blog post, Meta stated that any content that could incite violence, cause physical harm, or interfere with voting would be removed. Additionally, misleading content flagged by fact-checkers will have warning labels attached and will be downranked in Feed and Explore, making it less visible to users.
Meta’s fact-checking partners in Australia include Agence France-Presse (AFP) and the Australian Associated Press (AAP), which will review questionable content, according to Cheryl Seeto, Meta’s Head of Policy in Australia.
Beyond misinformation, Meta is also addressing the growing threat of deepfakes—highly realistic AI-generated videos, images, or audio that can be misrepresented as real content.
Meta said that deepfake content violating its policies will be removed, while AI-generated content that falls short of a violation will be labelled as "altered" and ranked lower in feeds to limit its reach. Users will also be required to disclose when they post AI-generated content.
"For content that doesn't violate our policies, we still believe it's important for people to know when photorealistic content they're seeing has been created using AI," Seeto said.
Meta’s Australian fact-checking initiative aligns with its broader strategy to combat misinformation during recent elections in India, Britain, and the United States.
However, in January, Meta shut down its fact-checking programs in the United States and eased restrictions on discussions of contentious topics such as immigration and gender identity, following pressure from conservative groups to revamp its political content policies.
Meta faces increasing regulatory challenges in Australia. The government is considering a levy on big tech firms to compensate for the advertising revenue they generate from sharing local news content.
Additionally, social media platforms will be required to enforce a ban on users under 16 by the end of this year. Tech companies are currently in discussions with the government on how best to implement these restrictions.
With a tight race expected between the opposition Liberal-National coalition and the ruling Labor party, Meta's policies in Australia will play a key role in limiting the spread of misinformation and AI-generated content in the lead-up to the vote.