Anti-vaxxer groups use carrot emojis to evade Facebook moderation, BBC finds

LONDON: Anti-vaxxer groups are using carrot emojis to evade the automated moderation tools that social networks use to detect content violating their policies, the BBC reported on Friday.

An investigation revealed several Facebook groups in which the word “vaccine” was replaced with a carrot emoji. Because Facebook’s algorithms normally focus on words rather than emojis, members were able to circumvent the platform’s automatic content moderation mechanisms.
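
The loophole can be seen in miniature with a toy word-list filter of the kind the report describes (a hedged Python sketch; the watchlist and function names are invented for illustration, and this is not Facebook’s actual system). Swapping a banned word for an emoji leaves the filter nothing to match:

```python
import re

# Hypothetical watchlist of banned terms (illustrative only).
BANNED_TERMS = {"vaccine", "covid", "booster"}

def flags_post(text: str) -> bool:
    """Return True if any banned term appears as a whole word in the post."""
    # Extract alphabetic word tokens; emoji characters are never matched.
    words = re.findall(r"[a-z]+", text.lower())
    return any(word in BANNED_TERMS for word in words)

print(flags_post("My aunt got the vaccine last week"))  # True: caught
print(flags_post("My aunt got the 🥕 last week"))        # False: evades the filter
```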

According to the report, a Facebook group using this tactic had more than 250,000 members.

The groups, which could only be joined by invitation, had clear guidelines urging members to “use code words for everything” and “Never use the c-word, the v-word or the b-word”, referring to “COVID”, “vaccine” and “booster”.

The investigation also found that groups using the carrot emoji were promoting unverified claims that people were harmed or killed by vaccines.

Marc Owen Jones, a disinformation researcher at Hamad bin Khalifa University in Qatar, noticed the trend after he was invited to join one of the groups and took to Twitter to share his findings.

“These were people telling stories of loved ones who died shortly after receiving the COVID-19 vaccine,” he said. “But instead of using the words ‘COVID-19’ or ‘vaccine’, they were using carrot emojis.

“At first, I was a little puzzled. And then it clicked — that it was being used as a way to evade, or seemingly evade, Facebook’s fake news detection algorithms.”

After the BBC reported the findings to Meta, the groups were removed, although some reappeared soon after.

“We have removed this group for breaching our harmful misinformation policies and will review any other similar content in accordance with this policy. We continue to work closely with public health experts and the UK government to further tackle misinformation about COVID vaccines,” Meta said in a statement.

Meta, along with other social media platforms, has come under scrutiny over the past two years for failing to remove misinformation about COVID-19 and vaccines.

Facebook said last year it had removed more than 20 million pieces of content with misinformation about COVID-19 or vaccines since the pandemic began.

Emojis are harder for algorithms to detect since the AI is trained on text and words, which may explain how these groups have managed to go unnoticed for so long.
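
A toy scorer makes the point concrete (the vocabulary, weights and names below are invented for illustration, not any real model): a model whose vocabulary was learned from words alone has no entry for an emoji, so the emoji contributes no signal at all.

```python
# Made-up word weights standing in for a model trained only on text.
VOCAB_WEIGHTS = {"vaccine": 0.9, "covid": 0.8, "died": 0.4}

def misinfo_score(text: str) -> float:
    # Tokens outside the learned vocabulary (such as an emoji) default to 0.0.
    return sum(VOCAB_WEIGHTS.get(tok, 0.0) for tok in text.lower().split())

print(misinfo_score("she died after the vaccine"))  # 1.3: strong signal
print(misinfo_score("she died after the 🥕"))        # 0.4: mostly invisible
```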

With emoji-based hate posing a growing challenge to automated detection, a team of University of Oxford researchers created a tool called HatemojiCheck, a suite of tests that exposes weaknesses in existing hate detection models and identifies hateful language expressed through emojis.
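
A sketch in the spirit of such a test suite might look like the following (hypothetical code, not HatemojiCheck’s actual API; it reuses the article’s vaccine code words rather than hate-speech examples). Each case pairs a plain phrasing with an emoji-substituted variant and reports when a classifier catches the first but misses the second:

```python
# Hypothetical emoji-substitution test harness (illustrative only).
def run_emoji_substitution_tests(classify) -> None:
    # Each pair: (plain-text phrasing, emoji-substituted phrasing).
    cases = [
        ("the vaccine harmed her", "the 🥕 harmed her"),
        ("get your covid booster", "get your 🦠 shot"),
    ]
    for plain, variant in cases:
        caught_plain = classify(plain)      # a useful model should catch this
        caught_variant = classify(variant)  # evasion succeeds if this is False
        status = "PASS" if caught_variant or not caught_plain else "FAIL"
        print(f"{status}: {variant!r} (plain caught: {caught_plain})")

def keyword_classifier(text: str) -> bool:
    # Stand-in for a text-only detection model (hypothetical).
    return any(word in text.lower() for word in ("vaccine", "covid", "booster"))

run_emoji_substitution_tests(keyword_classifier)  # both cases FAIL: emojis evade it
```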

