An Australian regulator has fined Elon Musk’s social media platform X A$610,500 ($386,000) for failing to cooperate with a probe into anti-child-abuse practices, a blow to a company that has struggled to keep advertisers amid complaints it is going soft on moderating content.
The e-Safety Commission fined X, the platform Musk rebranded from Twitter, saying the company failed to respond to questions including how long it took to respond to reports of child abuse material on the platform and the methods it used to detect such material.
Hamas-related disinformation
Though small compared to the $44 billion Musk paid for the website in October 2022, the fine is a reputational hit for a company that has seen a continuous revenue decline as advertisers cut spending on a platform that has stopped most content moderation and reinstated thousands of banned accounts.
Most recently the EU said it was investigating X for potential violation of its new tech rules after the platform was accused of failing to rein in disinformation in relation to Hamas’s attack on Israel.
“If you’ve got answers to questions, if you’re actually putting people, processes and technology in place to tackle illegal content at scale, and globally, and if it’s your stated priority, it’s pretty easy to say,” Commissioner Julie Inman Grant said in an interview.
“The only reason I can see to fail to answer important questions about illegal content and conduct happening on platforms would be if you don’t have answers,” added Inman Grant, who was a public policy director at Twitter until 2016.
X closed its Australian office after Musk’s buyout, so there was no local representative to respond to Reuters.
A request for comment sent to the San Francisco-based company’s media email address was not immediately answered.
Under Australian laws that took effect in 2021, the regulator can compel internet companies to give information about their online safety practices or face a fine.
If X refuses to pay the fine, the regulator can pursue the company in court, Inman Grant said.
After taking the company private, Musk said in a post that “removing child exploitation is priority #1”.
But the Australian regulator said that when it asked X how it prevented child grooming on the platform, X responded that it was “not a service used by large numbers of young people”.
X told the regulator that available anti-grooming technology was “not of sufficient capability or accuracy to be deployed on Twitter”.
Inman Grant said the commission also issued a warning to Alphabet’s (GOOGL.O) Google for noncompliance with its request for information about handling of child abuse content, calling the search engine giant’s responses to some questions “generic”.
Google said it had cooperated with the regulator and was disappointed by the warning.
“We remain committed to these efforts and collaborating constructively and in good faith with the e-Safety Commissioner, government and industry on the shared goal of keeping Australians safer online,” said Google’s director of government affairs and public policy for Australia, Lucinda Longcroft.
X’s noncompliance was more serious, the regulator said, including its failure to answer questions about how long it took to respond to reports of child abuse, the steps it took to detect child abuse in livestreams, and the number of content moderation, safety and public policy staff it employs.
The company confirmed to the regulator that it had cut 80% of its workforce globally and had no public policy staff in Australia, down from two before Musk’s takeover.
X told the regulator its proactive detection of child abuse material in public posts dropped after Musk took the company private.
The company told the regulator it did not use tools to detect the material in private messages because “the technology is still in development”, the regulator said.