Facebook has launched an effort to combat racist comments posted on the platform, but its efforts have run into problems, according to a new study.
According to the report from the Brennan Center for Justice at New York University, users of the social network posted nearly 3,000 such comments on the first day of September, the day the report was released.
That's a huge number, and the study found that Facebook's moderation efforts are not nearly enough to eliminate racist comments.
Instead, the study found that Facebook users were "more likely to engage in racist behavior," with about 30 percent of the sampled comments containing racist language.
Those comments appeared in the company's News Feed and were not automatically flagged as racist by moderators.
That's because Facebook relies on a system it calls "objective analysis" to determine whether a comment is racist, and the company has already flagged some posts using it.
However, the report found that the company isn't doing enough to stop people from posting racist comments on the platform.
The Brennan Center analyzed nearly 1,400 comments posted by Facebook users during the report's first 72 hours of data collection and found that only "a very small number" of comments were flagged as racist.
The group also found that some users were posting racist content in their News Feeds that was never flagged as offensive.
The group also said it was difficult to identify racist comments from the text alone, and that there was no way to see who posted the comments or whether they had been flagged.
"This data is important because it tells us that some racist comments are likely to stay on Facebook longer than others," said Jessica Rosenworcel, the report's co-author and a law professor at the NYU School of Law.
"It's not as if we know the racist language being used or how often these comments are made."
Facebook is already under scrutiny over content posted on its platform, which has become a hotbed for hate speech and racism.
The company recently launched an initiative called “Operation Safe Spaces,” which encourages users to report posts of racism and other hate speech.
In addition to banning hate speech, the company is also rolling out new "community guidelines" for its users, including ones that encourage responsible use of their accounts.
Facebook says that this will ensure that “we keep our community safe, welcoming and inclusive.”