Facebook Makes It Easier to Report Bullies

By Jennings Brown

Facebook is taking steps to curb harassment on its platform with an array of new anti-bullying tools.

At the beginning of this year, Facebook tweaked the News Feed so users saw more posts from family and friends and less content from publishers and companies. Mark Zuckerberg said at the time he wanted to help users have “meaningful social interactions.” Now Facebook is taking more steps to try to facilitate less toxic social interactions.

The company’s head of safety, Antigone Davis, announced the new initiative in a blog post. One new feature lets users delete several comments at once, so someone facing a flood of mean or harassing comments can remove them en masse without having to read each one. Users can also report a post on behalf of someone they believe is being harassed or bullied, via a menu on the post. Reported comments are then flagged for Facebook’s moderators, who decide whether the content violates the company’s community standards.

On the flip side, if a user believes their post was unfairly reported as harassment, they can appeal the decision.

Davis wrote that Facebook is testing ways to let users block specific words from appearing in comments on their posts. Facebook says it is also ramping up efforts to protect public figures. The company will still allow critical discussion of celebrities and politicians, but says it will remove “severe attacks that directly engage a public figure.” Facebook has had a similar policy in place for young public figures for a few months, and is now expanding it to cover public figures of all ages.

The announcement comes one week after Twitter said it is cracking down on “dehumanising speech” on its platform. Twitter and Facebook-owned Instagram started letting users block certain words back in 2016.