Facebook owner Meta Platforms (FB.O) will help train Australian political candidates in cybersecurity and coach influencers on how to stop the spread of misinformation in a bid to bolster the integrity of an upcoming election, it said on Tuesday.
Australia has not yet set a date for its next general election, which is due by May. Authorities are already on high alert about election interference, having earlier highlighted foreign interference attempts targeting all levels of government and both sides of politics.
“We will remain vigilant to emerging threats and will take additional steps as necessary to prevent abuse on our platform, while empowering the people of Australia to use their voice through voting,” said Josh Machin, the company’s head of public policy for Australia, in a statement due to be posted online.
The social media giant added that it had enlisted a university to help with fact-checking in Australia and would require disclosure of the names of those paying for election-related advertising, describing the measures as its most comprehensive electoral strategy.
The steps show how social media companies are trying to combat online distortion and misuse of information in the run-up to an election, a period when such activity is typically at its most intense.
The Facebook Protect security program for high-profile individuals launched in Australia in December, with the company pledging to work with election officials and political parties to train candidates on its policies and tools and on ways to protect their accounts.
To prevent hacking, candidates are encouraged to enable two-factor authentication. The company said it will also coach influencers, those who generate advertising revenue from online content, on how to spot fake news.
People who want to post election-related ads must provide government-issued identification and disclose their funding sources, it said.
Ads from unauthorized parties or lacking funding disclosure would be removed, and election ads would be kept in a public archive for seven years, the company added.
RMIT University, which joined Meta’s third-party fact-checking efforts, said it will review posts the company has identified as potential misinformation and attempt to verify them through interviews with primary sources and reviews of public data.
“An ongoing focus of our work is identifying the super-spreaders of misinformation and the ecosystems in which they operate,” said Russell Skelton, director of the RMIT FactLab, in a statement. “Misinformation disrupts evidence-based public policy and debate, and so it is critical that we better understand what drives this.”