According to U.K.-based think tank Demos, about 10,000.
They came to this figure by using a filter to analyze 126,975 English-language tweets and then estimated that one in every 15,000 tweets contained a “racist or ethnic slur.” But it goes deeper…
Based on this formula, they found that anywhere between 47.5 percent and 70 percent of those tweets were “non-derogatory” or used to “express in-group solidarity.” (Cutting out the term “white boy” cut the total number of racially insensitive tweets in half.)
Of course, this still raises the tricky question of how many tweets most people would consider racist. A program can’t parse the racist subtext of a tweet, and human analysts have different ideas of what counts as racism.
“Even though racist, religious and ethnic slurs tend to be used in a non-derogatory way on Twitter, this does not mean that hate speech is not being used on this platform,” the report’s authors said. “Language does not require the use of slurs in order to be hateful.”
Ultimately, the study found that one in 55,000 tweets (around 0.0018 percent) was indicative of racial prejudice. That includes up to 10 percent of the tweets that were considered “casual” racial slurs — meaning they weren’t explicitly racist, but would probably be considered offensive by some people — and the estimated 100 tweets a day that threatened violence.
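As a quick sanity check on the rates cited above (a sketch of the arithmetic, not part of the Demos report itself):

```python
# Rates cited in the study, expressed as fractions of all tweets.
slur_rate = 1 / 15_000       # tweets containing a "racist or ethnic slur"
prejudice_rate = 1 / 55_000  # tweets indicative of racial prejudice

# Converting one-in-55,000 to a percentage: multiply by 100.
prejudice_pct = prejudice_rate * 100

print(f"{prejudice_pct:.4f}%")  # → 0.0018%
```

One in 55,000 works out to roughly 0.0018 percent of tweets, a reminder that headline counts like “10,000 a day” sit on top of a very large total volume.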
Thanks to NBC News for this article. Read more here.