Online information integrity is facing an escalating crisis defined by AI-driven propaganda, coordinated disinformation campaigns, and aggressive online attacks on people and institutions. That is precisely the landscape the latest Senate hearings on climate and energy information integrity have laid bare: one where truth is increasingly contested and action on climate change is impeded. The Senate select committee on information integrity regarding climate change and energy has just wrapped up two days of public hearings in Canberra, hearing from major platforms including Meta (the parent company of Facebook, Instagram, and WhatsApp) and TikTok, from industry voices including Coal Australia and the Minerals Council, and from academics and community groups.
The inquiry examines how bots, trolls, and disinformation campaigns—plus tactics such as astroturfing, or fake grassroots campaigns—slow global progress toward renewable energy and climate policy. It also probes the ties among Australian groups and international think tanks and influence networks that may shape public opinion and policy in ways that benefit fossil fuel interests. In response, policymakers face three intertwined tasks:
- Understanding the contemporary challenge posed by AI-driven misinformation, toxic algorithms, and pervasive data surveillance.
- Keeping pace with evolving lobbying strategies and campaign techniques in politics.
- Recognizing the existence of a global network of think tanks funded by fossil fuel interests over decades, aimed at delaying climate action and shaping public discourse.
Australia’s Human Rights Commissioner, Lorraine Finlay, warned senators that it will be extraordinarily difficult to clean up online information channels without curtailing free speech and expression. She underscored that engagement-driven algorithms elevate extreme or sensational content, complicating democratic participation and eroding public trust. She cautioned that overly broad or vague regulation could chill genuine public debate on critical issues facing the nation. The core issue, she explained, is the balance between safeguarding rights and curbing bad-faith actors who claim freedom of expression while spreading misinformation that erodes social cohesion. But how should “bad faith” be proven, and who decides?
The discussion also centered on the role of algorithms, viral media, and information quality. Lawmakers are being urged to warn Australians that information quality online may deteriorate further in coming years, and that a whole-society effort will be required to combat misinformation beyond climate-specific debates. Senators heard that technology is evolving at breakneck speed, and the internet has become a primary battlefield for power over our collective future. The fusion of AI, social platforms, and data surveillance has heightened the risk that online channels are polluted by propaganda and foreign influence.
What does this mean for Australia? Meta representatives stated that the company has removed hundreds of millions of bots and tackles coordinated inauthentic behavior, but argued that it does not censor politicians’ speech, given the scrutiny politicians already face from traditional media. They acknowledged awareness of powerful political figures leveraging viral algorithms to disseminate harmful or false messages, and noted Brandolini’s law, the principle that disproving misinformation takes far more effort than creating it. Yet they maintained that policing politicians’ speech falls outside their remit unless it incites violence. Labor Senator Michelle Ananda-Rajah pressed the representatives on this stance, arguing that it sidesteps a key tension: how to safeguard public discourse while curbing manipulation.
Astroturfing and opaque political funding were also in focus. The committee heard concerns about lobby groups increasingly funding third-party entities to run campaigns, raising questions about who ultimately bears responsibility for political messaging and how transparent such funding is. Coal Australia defended channeling nearly $4 million to the third-party group Australians for Prosperity in the previous financial year, arguing it simply delegates campaign logistics to specialized groups rather than engaging in astroturfing. Critics question whether this approach obscures the true origin of campaign influence and confuses voters about which ideas or policies are being promoted. Australians for Prosperity counts Coal Australia among its major backers, and its leadership has connections to the Liberal Party, prompting broader questions about political influence and transparency in Australian elections.
The Atlas Network’s global footprint also drew attention. University researchers described the long-standing links between Australian think tanks and Atlas Network affiliates, illuminating how a network of free-market think tanks has been used by fossil fuel interests to stall climate action for decades. Researchers noted that Australians may have limited awareness of Atlas’s reach, which sparked discussion about how such networks shape policy and public understanding when their funding sources and strategic aims are not widely disclosed. With Angus Taylor’s ascent to party leadership, several Atlas-linked think-tank alumni have gained influential roles in the opposition, which some attendees view as a signal of how think tanks shape political trajectories.
The public hearing also featured science communicator Dr Karl Kruszelnicki’s exchange with One Nation senator Malcolm Roberts, illustrating how the science debate can devolve into partisan standoffs. Dr Karl described efforts to develop an AI chatbot trained on rigorous, peer-reviewed climate science to counter misinformation more efficiently than could be done in hours of one-on-one outreach. He stressed that the AI would be trained exclusively on credible scientific papers and decades of climate data, including his personal collection of 40,000 vetted sources accumulated over 40 years. The aim is to equip scientists with scalable tools to push back against climate denial while protecting accuracy.
The Senate committee is slated to deliver a final report on March 24, outlining findings and recommendations. This moment in Australia’s policy conversation highlights the complexity of balancing free expression with accurate information, the rapid evolution of online campaigning, and the enduring influence of global think tanks in shaping climate discourse. As audiences react, a pressing question remains: will reforms protect democratic participation while curbing disinformation, or will they risk stifling legitimate debate? Share your view in the comments below.