Facebook to remove posts that 'could lead to physical violence'


Facebook will now purge any posts that 'could lead to physical violence' in its latest step to stop hate speech and false information spreading on the social network.
The new policy comes after inter-religious violence in Sri Lanka was sparked in part by false information shared on the social network.
Hate speech and explicit threats were already classified as violations of Facebook rules and were automatically removed by the company.
However, the new policy goes a step further, eliminating content that may not be explicitly violent, but which seems likely to encourage such behaviour.
The social network will start removing inaccurate or misleading content created or shared to stir up volatile situations, it has confirmed.
'There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down,' a Facebook spokesman revealed after a briefing on the policy at the company's campus in Menlo Park, California.
'We will begin implementing the policy during the coming months.'
The social network said it is partnering with local organisations and authorities adept at identifying when posts are false and likely to incite violence.
Misinformation removed in Sri Lanka under the new policy included content falsely contending Muslims were poisoning food given or sold to Buddhists.
People and groups in Myanmar also experienced real-world physical violence as a consequence of similar rumours spread on Facebook.
The social network has been lambasted for allowing rumours or blatantly false information to circulate that may have contributed to violence.
'There were instances of misinformation that didn't violate our distinct community standards but that did contribute to physical violence in countries around the world,' said Tessa Lyons, a product manager on Facebook's news feed, according to the Wall Street Journal.
'This is a new policy created because of that feedback and those conversations', she added.
Ms Lyons said she did not have any more details to publicly share about the policy at this stage.
The company has implemented a series of changes aimed at fighting the spread of malicious or false information on Facebook, from fabrications that incite violence to untruths capable of swaying elections.
However, it is currently not clear how Facebook will determine whether content could lead to violence. 
The news comes as co-founder and CEO Mark Zuckerberg hit back at the idea that Facebook had now become too big and should be broken into smaller companies.
Zuckerberg recently faced congressional hearings on the data practices of Facebook – particularly in the wake of the Cambridge Analytica data scandal – and the lack of any true competitor.
He told Congress in April that Facebook had 'a lot' of competitors, which overlap with its service in 'different ways'.
The 34-year-old spent two days testifying about his company and answering questions.
In an interview with technology blog Recode, Zuckerberg said an overseas company the size of Facebook would be less inclined to show such transparency.
'I think you can bet that, if the government here is worried about... whether it's election interference or terrorism... I don't think Chinese companies are going to want to cooperate as much and aid the national interest there,' he said.
Zuckerberg also spoke out in the interview against the US constraining its own companies, arguing it would only give way to overseas - specifically Chinese - competitors.
'If we adopt a stance which is that, we're going to, as a country, decide to clip the wings of these [American] companies and make it so it's harder for them to operate in different places or they have to be smaller, then there are plenty of other companies out there willing and able to take the place of the work that we're doing,' he said.
'And they do not share the values that we have.'
Zuckerberg admitted that he, as the creator of Facebook, was to blame for data being stolen and allegedly used to interfere in the 2016 presidential election.
In his defence, he claimed he had been alerting the FBI to attacks on RNC and DNC members and other accounts since 2015, but admitted Facebook was 'too slow' to respond to accounts spreading disinformation.
In the near-90-minute interview, released as a podcast, Zuckerberg also said he did not see it as his job to remove posts purely because the information was inaccurate, explaining he believed users should not be penalised for unintentionally sharing misleading information.