In a recent article, The Wall Street Journal reported that one of the most frequent Facebook and YouTube comments was: “If you don’t feel comfortable saying something, don’t say anything.”

This comment was followed by: “Just don’t make me laugh. It’s a waste of your time.”

A second comment read: “This isn’t about you. This is about the people who like you. They’re going to get to you, but they can’t make you laugh.”

A third comment was: “@tara_curl you’ve been banned for being a bad person. We’re gonna find you. If you do, you’re going away forever.”

In the fourth comment, the user said: “@rachelmack a lot of people are upset, but I think this is a really smart way to start the conversation. I hope this works.”

In total, there were more than 7,300 comments on Facebook and 5,800 on YouTube.

Of those comments, nearly 70 percent were negative.

The majority of negative comments came from male users.

This was also the case for Twitter, where, on average, there was one negative comment per 1,000 tweets.

Facebook’s content policy prohibits bullying, hate speech and harassment.

However, when the company decides to take action, its enforcement usually goes far beyond what that policy requires.

For instance, in November 2017, Facebook announced a crackdown on hate speech on its platform, making “hate speech” an explicit enforcement category.

The company also expanded its enforcement against abuse and bullying.

In April 2018, Facebook took down a post that encouraged users to harass a woman and a friend of hers by posting photos of their faces in front of a house and saying, “This woman is a disgusting cunt.”

The post had been flagged as “toxic.”

The user who made the post, however, remains anonymous.

In addition, Facebook has come under fire from civil rights organizations and human rights groups over its policy banning hate speech on the platform.

It also took the unusual step of removing the word “white” from its platform.