Facebook (Meta) and Twitter would like to see Australian authorities’ enthusiasm for more and more online censorship curbed by reviewing the effectiveness of some of the current and upcoming legislation and practices in this field.
Both companies stand to lose money if fined for failing to censor content under Australian laws such as the Online Safety Act, which was passed last year and recently came into effect, as well as two upcoming bills said to be designed to combat trolls and protect online privacy.
Facebook and Twitter made submissions to the Select Committee on Social Media and Online Safety, which was set up in late 2021 to look into social media companies’ practices and assess how they affect Australians’ mental health.
Twitter argues that the Committee’s three-month inquiry is too short and should be extended, both to allow more time to implement the Online Safety Act and to enable “meaningful consultation with the community.” Twitter would also like a review of online safety to be carried out a year after the committee’s first report, set to be released in February.
Both Twitter and Facebook, along with Google and TikTok, are members of Digital Industry Group Inc (DiGi), which filed its own submission arguing that some of the legislative measures now being pushed, such as mandatory age verification on social media, have come without any legislative notice. DiGi wants “wider consultations” before these measures are implemented.
Both this industry group and Facebook in its individual submission warned that the sheer number of different laws either adopted or currently being considered in Australia with the goal of regulating the internet could produce “overlapping, duplicative, or inconsistent rules.”
DiGi would also like greater clarity, with the various proposals, such as the Anti-Trolling Bill, consolidated into a single law.
The Online Safety Act 2021 contains such “gems” as mandating that online services using encryption should “take reasonable steps to develop and implement processes to detect and address material or activity on the service that is or may be unlawful or harmful.”
The law also gives the eSafety commissioner the power to order deletion of links and removal of apps within 24 hours.
Failure to comply carries a fine of about $404,000 for platforms and $81,000 for individuals.