TikTok Deleted Over 380,000 Hate Speech Videos in the US in 2020

TikTok, the app known for short videos of people lip-syncing and dancing to their favorite music, is grappling with the problem of hate speech, like other social networks.

(Image: Angela Lang/CNET)

On August 21, TikTok revealed that in 2020 it had removed more than 380,000 videos in the United States for violating its rules prohibiting hate speech. It also banned more than 1,300 accounts for hateful content or behavior and removed more than 64,000 hateful comments.

TikTok describes hate speech as content that "attacks, threatens, incites violence against, or dehumanizes individuals or groups of individuals on the basis of protected attributes such as race, religion, gender, gender identity, or nationality."

A report published in August by the Anti-Defamation League (ADL) said that TikTok, which has 100 million users in the US, is rife with content promoting white supremacist groups and anti-Semitism. Human rights activists have been calling on social networks such as Facebook to step up their efforts to remove hateful content from their sites.

Eric Han, who oversees TikTok's safety operations in the United States, said the company wants to improve its hate speech policy and strengthen its measures against the problem. He outlined five steps TikTok is taking against hate speech: in addition to seeking expert advice to improve its policies, the company is addressing this kind of content by banning accounts and by keeping offensive content out of search results.

This article is based on a report from CBS Interactive overseas, edited by Asahi Interactive for Japan.

