A new report claims that Facebook has failed to delete certain terrorist posts, which the platform itself labelled "insightful" and "engaging"

Facebook fails to tackle extremist content on its network

REUTERS/JOHANNA GERON - Facebook logo displayed on a mobile phone

The Institute for Strategic Dialogue has published a research report showing that Facebook failed to delete terrorist content circulating on the social network. Extremists have long used online platforms as weapons to spread hatred and to draw in users, whom they then recruit into their ranks. Rather than removing this material, Facebook appears to have left it on the network: photos of beheadings and hate speech issued by Daesh and the Taliban, posts the platform itself labelled as "insightful" and "engaging".

Moustafa Ayad, executive director of the Institute for Strategic Dialogue, led the production of the document. He argues that such posts are easy for anyone to find online, that this content incites further extremism, and that Facebook should take note and stop it from happening. "It's essentially trolling: it annoys the group members and likewise makes someone in moderation take note, but the groups often don't get deleted. That's what happens when there is a lack of content moderation," the Institute's director added. Ayad also said that these groups have been appearing for the past 18 months and accused the platform of doing nothing to eradicate them.


Facebook, now rebranded as 'Meta', had access to the report and acted quickly to limit the criticism. Once it identified the groups promoting Islamist extremist content on the network, it removed them completely. "We have removed groups that have come to our attention. We don't allow terrorists on our platform and we remove content that praises, depicts or supports them whenever we find it," a spokesperson for the social network said.

Beyond taking action to prevent such problems from recurring, Facebook has defended itself against the accusations, arguing that the posts slipped through because of human error by content reviewers, mistakes anyone could make given the volume of images those reviewers see each day. "We know our app isn't always perfect, so we continue to invest in people and technology to eliminate this type of activity faster, and work with experts in terrorism, violent extremism and cyber intelligence to disrupt the misuse of our platform," concludes the statement, in which the social network apologises for what happened.


Frances Haugen, a former employee of the US giant who left the company over how it operated, disclosed several internal reports exposing the social network's secrets and the way it works. She claims that, beyond the problem of human content reviewers, Facebook's automated systems for identifying hate speech and pro-terrorism content are staffed and built largely around English, so Arabic-language posts can easily slip through because the systems cannot understand what they say.

According to Haugen, the company is severely short of Arabic-speaking experts, above all those who speak the more difficult dialects used in the violent regions where terrorism takes hold, and so it cannot grasp the cultural context a post may carry. In her view, the company Mark Zuckerberg built does not have enough staff who speak and understand Arabic, which makes reviewing published content a challenge.


Facebook clearly continues to face problems on several fronts. In the documents she made public, Haugen also claims that Meta held internal reports warning of Instagram's dangers for young people, especially teenage girls, as online content can lead them into eating disorders, mental health problems or even suicide, and that the company ignored these warnings entirely. She has also alleged that Facebook incited the groups that stormed the Capitol in Washington D.C. and has encouraged political extremism.
