Facebook Safety Guide
CONTENTS

Facebook’s Policies

  1. What type of content is not allowed on Facebook?
  2. What do Facebook’s policies say about online harassment?
  3. How does Facebook take action against harassment?
  4. How is Facebook responding to criticism for not adequately limiting the spread of hate speech and abusive behavior?

Report Harassment

  1. What should I do if someone is harassing me on Facebook?
  2. How do I report an abusive comment?
  3. How do I report an abusive post?
  4. How do I report a Profile?
  5. How to report a photo or a video?
  6. How to report posts in your Timeline?
  7. How do I report a Facebook group that is abusive?
  8. What happens after I report abusive content?
  9. Are my reports kept confidential?
  10. Someone is harassing me across various functions on Facebook (photos, comments, private messages, etc.). Do I still need to report each instance separately?
  11. Will Facebook notify me when they have dealt with my report?
  12. What can I do if my complaint is rejected? Do I have any recourse?
  13. Who can report harassment on Facebook? Can I report abuse if I don’t have a Facebook account?
  14. What should I do if someone is asking me to share a nude or sexual picture of myself on Facebook, or is threatening to share a photo that I already sent?

Block on Facebook

  1. How do I block someone on Facebook?
  2. What happens when I block someone?
  3. After I block someone, can I see anything about that person?
  4. What other options do I have to stop someone from contacting me?

Control Your Messages

  1. How do I report an abusive message?
  2. How do I block messages from someone on Messenger?
  3. What happens when I block messages from someone in Messenger?

Additional Resources

  1. What can I do to increase the privacy and security of my account?
  2. What resources does Facebook offer for victims of harassment?

Facebook's Policies

1. What type of content is not allowed on Facebook?

Facebook has developed a set of Community Standards that outline what types of content are not allowed and can be reported for removal. These are the categories included in the guidelines:

  • Violence and criminal behavior
    • Violence and Incitement
    • Dangerous individuals and organizations
    • Coordinating harm and publicizing crime
    • Regulated goods
    • Fraud and deception
  • Safety
    • Suicide and Self-Injury
    • Child nudity and sexual exploitation of children
    • Sexual exploitation of adults
    • Bullying and harassment
    • Human exploitation
    • Privacy violations and image privacy rights
  • Objectionable Content
    • Hate Speech
    • Violent and graphic content
    • Adult nudity and sexual activity
    • Sexual solicitation
    • Cruel and insensitive
  • Integrity and Authenticity
    • Misrepresentation
    • Spam
    • Cybersecurity
    • Inauthentic behavior
    • False News
    • Manipulated Media
    • Memorialization
  • Respecting Intellectual property

Learn more about Facebook’s Community Standards.

2. What do Facebook’s policies say about online harassment?

Facebook says it does not tolerate any form of bullying or harassment, including threats to release personally identifiable information, unwanted malicious contact, targeted cursing, and claims about romantic involvement, sexual orientation, or gender identity.

When evaluating abusive behavior, Facebook distinguishes between public figures and private individuals, explaining that it allows “critical commentary” of people who have a large public audience. On Facebook, “public figures can include celebrities, athletes, bands, politicians, journalists and creators”. A blue badge on a Page or profile means that the Page or profile is authentic, but keep in mind that not all public figures have blue badges.

“For public figures, we remove attacks that are severe as well as certain attacks where the public figure is directly tagged in the post or comment. For private individuals, our protection goes further: we remove content that’s meant to degrade or shame, including, for example, claims about someone’s sexual activity”, says Facebook.

Sharing and re-sharing posts of certain abusive content may be allowed if it is clear that the purpose is “to condemn or draw attention to bullying and harassment.”

(Do you want to know how other social media platforms like Instagram or Twitter are responding to online harassment? Check out our Social Media Safety Guides.)

 

3. How does Facebook take action against harassment?

After receiving a report, moderators evaluate each case to determine if it violates Facebook’s Community Standards. If so, the platform removes the content and warns the person who posted it. Facebook may also temporarily block the person from using some features on the platform (e.g., sending messages, tagging things, uploading photos). If the abuse continues, Facebook could increase the amount of time they’re blocked from using features or, in some cases, remove their accounts altogether.

Facebook uses automated tools to identify abusive behavior, but in its biannual Community Standards Enforcement Report, published in May 2020, they highlight that “using technology to proactively detect bullying and harassment can be more challenging than other violation types” so in those cases, they tend to rely more heavily on human review.

Facebook is one of the platforms with the highest incidence of harassment. According to a survey on online harassment in the U.S. conducted by the Anti-Defamation League (ADL), 77 percent of people who were harassed online reported that at least some of their harassment occurred on this platform. Facebook has also faced criticism for not addressing the vast proliferation of hate and abusive behavior on the platform.

 

4. How is Facebook responding to criticism for not adequately limiting the spread of hate speech and abusive behavior?

Facebook has been accused by civil rights groups and different sectors of failing to take sufficient steps to stop the spread of hateful and abusive content on its platform. The social media company has said it has been improving its automated technology for identifying images and text and increasing the number of moderators to combat hate speech.

In a May 2020 report, Facebook announced that between January and March 2020 it removed a record 9.6 million hate speech posts, up from 5.7 million in the previous quarter.

But criticism of Facebook intensified at the beginning of June 2020 during protests against police brutality in the U.S., especially after Facebook took no enforcement action against a series of posts by President Donald Trump following the killing of George Floyd, including one that warned “looting” would lead to “shooting”. Even though Facebook’s rules say speech that inspires or incites violence is not allowed, CEO Mark Zuckerberg said that social media companies should not be “arbiters of truth.”

“Personally, I have a visceral negative reaction to this kind of divisive and inflammatory rhetoric. … But I’m responsible for reacting not just in my personal capacity but as the leader of an institution committed to free expression”, wrote Zuckerberg. The episode led to an ad boycott joined by more than 300 advertisers.

Facebook has said it is giving people more control over how others interact with their posts. The platform has introduced new ways for people to hide or delete multiple comments at once. Facebook has also been testing ways to more easily search for and block offensive words from appearing in comments.

Report Harassment

1. What should I do if someone is harassing me on Facebook?

Remember that, in any case, it is always important to document and take screenshots of the episode of harassment, as this could be useful in any future investigation. If you feel you’re in immediate danger, contact the police or your local authorities.

If you are being harassed on Facebook, the first thing you should do is report the episode to the platform. The best way to report abusive content is by using the Report link that appears near the content itself. Here are some examples of the content you can report:

  • Profiles
  • Posts
  • Posts on Your Timeline
  • Photos and Videos
  • Messages
  • Pages
  • Groups
  • Ads
  • Events
  • Questions
  • Comments

After reporting, you should consider other options to protect yourself, including ‘unfriending’ people to remove them from your profile, or ‘blocking’ them to prevent them from starting chats and messages with you, adding you as a friend, and viewing things you share on your Timeline. Be aware that blocking can mask threats, and one concern is that this makes risk assessment more difficult. If you fear for your physical and mental safety, consider asking a trusted friend or family member to monitor your account instead. You can also control who you interact with: the Facebook Help Center explains how to control who reaches your inbox, how to block messages, and how to ignore a conversation.

Facebook also offers tools to help you control your own Profile if the abuse is happening there. You can decide who can post to your Timeline and when posts that you’re tagged in are displayed.

If you’re under 18 and someone’s putting pressure on you that’s sex-related, Facebook recommends contacting local law enforcement or the National Center for Missing & Exploited Children using the CyberTipline at cybertip.org or 1-800-843-5678. They have advisers available 24/7 to help.

If this person is a relative or someone in your household and you need help, contact local law enforcement, go to the National Sexual Assault Hotline online or call the National Sexual Assault Hotline at 1-800-656-HOPE (4673).

 

2. How do I report an abusive comment?

  1. Go to the comment you want to report.
  2. Click ••• next to the comment.
  3. Click Give feedback or report this comment.
  4. To give feedback, click the option that best describes how this comment goes against Facebook’s Community Standards. If you don’t see an option that fits, click Something Else.

 

3. How do I report an abusive post?

To report a post:

  1. Click ••• in the top right of the post.
  2. Click Find support or report post.
  3. To give feedback, click the option that best describes how this post goes against Facebook’s Community Standards. Click Next.
  4. Depending on your feedback, you may then be able to submit a report to Facebook.

 

4. How do I report a Profile?

  1. Go to the profile you want to report.
  2. Click ••• to the right and select Find Support or Report Profile.
  3. To give feedback, click the option that best describes how this profile goes against Facebook’s Community Standards, then click Next.

 

5. How to report a photo or a video?

  1. Click the photo or video to expand it. (If the profile is locked and you can’t view the full-sized photo, click Find Support or Report Photo.)
  2. Hover over the photo or video and click Options in the bottom-right corner.
  3. Click Find Support or Report Photo for photos, or Find Support or Report Video for videos.
  4. Select the option that best describes the issue and follow the on-screen instructions.

If you’re having trouble reporting something, Facebook recommends logging in from a computer and using the report links.

 

6. How to report posts in your Timeline?

  1. In the top right of the post, click •••.
  2. Click Find support or report post, then click the option that best describes how this post goes against Facebook’s Community Standards.

You can also block the person who posted the content, hide the post from your Timeline, or delete it.

 

7. How do I report a Facebook group that is abusive?

  1. From your News Feed, click Groups in the left menu or search for the name of the group you want to report.
  2. Click ••• below the cover photo and select Report Group.
  3. Select what’s wrong with the group, click Next then click Done.

 

8. What happens after I report abusive content?

A member of Facebook’s support team will review your report and determine whether it violates Facebook’s Community Standards and should be removed. You can always check the status of your report in the Support Inbox, where you will receive updates once they’ve reviewed it. In its policies, Facebook notes that reporting something doesn’t guarantee that it will be removed.

 

9. Are my reports kept confidential?

Facebook lets the person whose content has been reported know that a report has been made, but they do not let the person know who reported them. However, in some cases, when a message conversation is between just two people or a back-and-forth on a comment thread, the person who was reported may be able to guess or make assumptions about who reported them.

 

10. Someone is harassing me across various functions on Facebook (photos, comments, private messages, etc.). Do I still need to report each instance separately?

Right now, the best way to ensure that Facebook investigates the abuse is to report each instance separately. If an account has been set up to impersonate you or if an entire Page, Group, or Event has been set up to harass you, you can report the whole account, Page, Group, or Event instead of each individual piece of content.

 

11. Will Facebook notify me when they have dealt with my report?

Yes. You can always check the status of your report in the Support Inbox and Facebook notifies and updates you there once they have reviewed it.

 

12. What can I do if my complaint is rejected? Do I have any recourse?

If you feel your case wasn’t handled adequately, you can try reporting it again. If a mistake was made and Facebook reverses its decision, they will update you in the Support Inbox.

Additionally, you should encourage your friends and family to report the harassment because more reports make it more likely that Facebook will take it down.

Remember that you can always control your interaction with the person posting abusive content by unfriending, unfollowing, or blocking.

Finally, keep in mind that you are not alone in the struggle, and there is a community on Right To Be’s Storytelling platform ready to support you – visit this page to request help. If you feel you are in danger, consider finding legal support and report it to your local authorities as soon as possible.

Check out our resources to find tips on what to do if you experience online harassment.

 

13. Who can report harassment on Facebook? Can I report abuse if I don’t have a Facebook account?

Anyone can report abuse on Facebook. If you see a friend or family member being bullied or harassed, you can report the content on their behalf via the menu above the post you are concerned about.

Facebook says that there are some cases, though, like name-calling and impersonation, where having the person being targeted report the content is helpful in providing additional context.

If you don’t have a Facebook account, you can report a violation of Facebook Terms with this form. You will be asked to indicate the type of abuse, which may include:

  • My account is hacked
  • Someone is pretending to be me
  • Someone is using my email address for their Facebook account
  • Someone is using my photos or my child’s photos without my permission
  • Something on Facebook violates my rights
  • I found an underage child on Facebook
  • Someone is threatening to share things I want to keep private
  • Other abuse or harassment

Facebook also has forms in their Help Center for special report types like accounts of children under the age of 13 and suicidal content.

 

14. What should I do if someone is asking me to share a nude or sexual picture of myself on Facebook, or is threatening to share a photo that I already sent?

This is what Facebook has established in its policies:

  1. Document the post: You may need a record of the post if you decide to take further action.
  2. Report the person to Facebook using this form. Before you submit your report, go to this person’s profile and copy their Facebook URL and email; Facebook will ask for this information when you file your report.
  3. Block this person to prevent them from starting conversations with you or seeing things you post on your profile.
  4. Consider reporting to local law enforcement.

If you’re under 18, it is important to talk with a parent, teacher, school counselor, or other adults you trust.

In this guide, you can find key definitions you might need when talking to a lawyer and/or law enforcement.

Block on Facebook

1. How do I block someone on Facebook?

  1. Click the down arrow in the upper-right corner of your homepage and select Settings & Privacy.
  2. Click Settings, then select Blocking.
  3. Enter the name or email address of the person you want to block and click Block.
  4. If you entered a name, select the specific person you want to block from the list that appears.

If you can’t find someone using this method, you can go to the person’s profile and select Block from the menu on their cover photo. Keep in mind that people will not be notified when you block them.

Besides blocking, there are other ways to control who can see things you post on your Timeline. Learn more about your Privacy Settings.

 

2. What happens when I block someone?

People you block can no longer:

  • See things you post on your profile
  • Tag you
  • Invite you to events or groups
  • Start a conversation with you
  • Add you as a friend

 

3. After I block someone, can I see anything about that person?

Yes. Blocking allows you to prevent any interactions with someone on Facebook, but you may still encounter content they’ve shared. Here’s what you might see, according to Facebook’s policies:

  • Messages:
    • Your message history with someone you’ve blocked will stay in your inbox. If the blocked person is ever included in a conversation with a group of friends, you may be able to see the messages.

  • Mutual friend stories:
    • You may still see stories about mutual friends that mention the person you’ve blocked.
  • Photos:
    • You might see photos or tags of the blocked person added by other people.
  • Groups:
    • Someone you’ve blocked won’t be able to add you to a group, but you’ll be able to see groups that the blocked person created or is a member of.
  • Events:
    • Someone you’ve blocked won’t be able to invite you to an event, but you’ll be able to see events to which both you and the blocked person are invited.
  • Games and apps:
    • Since games and apps are run by outside developers, the Facebook block won’t apply to them, so you could encounter someone you blocked while playing a game (e.g., in the game’s chat room).

 

4. What other options do I have to stop someone from contacting me?

You can change your settings to prevent anyone from posting on your Timeline. You can also adjust your message filtering preferences to control what types of messages arrive in your inbox, or turn on ‘Message requests’, a tool that tells you when someone you’re not friends with on Facebook has sent you a message.

Our self-care guide will provide you with some tips that will help you to feel better.

Control Your Messages

1. How do I report an abusive message?

To report an abusive message:

  1. Open the abusive message.
  2. Click Options in the top right.
  3. Click Something’s Wrong.

 

2. How do I block messages from someone on Messenger?

To block messages from someone:

Desktop App:

  1. Open a conversation with the person you want to block and click their name at the top.
  2. Click Block on Messenger > Block.

Desktop (messenger.com):

  1. Below Chats, hover over a conversation with the person you want to block and click •••.
  2. Click Block Messages > Block Messages.

iPhone and iPad:

  1. From Chats, open a conversation with the person you want to block.
  2. Tap their name at the top of the conversation.
  3. Scroll down and tap Block.
  4. Tap Block on Messenger > Block.

Android:

  1. From Chats, open the conversation with the person you want to block.
  2. Tap their name at the top of the conversation.
  3. Scroll down and tap Block.
  4. Tap Block on Messenger > Block.

You can also ignore a conversation, turn off notifications for a conversation, or delete a conversation.

 

3. What happens when I block messages from someone in Messenger?

According to Facebook:

  • The person you block will no longer be able to contact you (example: send you messages, call you) in Messenger or in Facebook chat.
  • You’ll also no longer be able to contact them in Messenger or in Facebook chat.

When you’re added to a group conversation that includes the person you’ve blocked, you’ll be notified before you enter the conversation. If you choose to enter the group conversation, you’ll be able to see their messages and they’ll be able to see yours in that conversation.

Additional Resources

1. What can I do to increase the privacy and security of my account?

You can use your Privacy Settings to control who gets to see your posts and Timeline. To get to your privacy settings, click Account at the top of any page and select Privacy Settings from the dropdown menu that appears. From there, you can set the privacy of a specific message or post and control how much information you share.

Facebook also offers a number of security features that help keep your personal information private, including remote logout, secure browsing, log-in approvals, and more. You can find these features on your Account Settings page, in the Account Security section.

Our Technical Safety Guide also offers information on how to increase your digital security.

 

2. What resources does Facebook offer for victims of harassment?

Facebook partners with safety organizations around the world and offers resources to its users. Below are two examples:

  • The National Network to End Domestic Violence’s “Privacy and Safety on Facebook: A Guide for Survivors”
  • The Cyber Civil Rights Initiative’s Help Center article on non-consensual sharing of intimate photos.

They also include these resources in their Guidelines:

  • Facebook’s Bullying Prevention Hub provides resources and tips that help teens, parents and educators deal with bullying and its consequences.
  • MTV’s A Thin Line: This campaign empowers kids to identify, respond to and stop the spread of digital abuse in their own lives. The campaign is built on the understanding that there’s a “thin line” between what may begin as a harmless joke and something that could end up having a serious impact.
  • Child Exploitation and Online Protection Centre (CEOP) works to track and bring offenders to account either directly or in partnership with local and international forces.
  • Childnet International works in partnership with others to help make the internet a great and safe place for children.
  • Commonsense.org provides trustworthy information and education to help kids thrive in a world of media and technology.
  • ConnectSafely.org is an online forum that gives teens and parents a voice in the public discussion about youth online. It also offers many other resources, such as social-media safety tips for teens and parents.
  • Cyberbullying Research Center provides up-to-date information about cyberbullying among adolescents and serves as a center of information about the ways adolescents use and misuse technology.
  • FOSI.org works to make the online world safer for families by encouraging best practices and tools that respect free expression in the field of online safety.
  • The National Center for Missing and Exploited Children serves as the US’s resource on missing and sexually exploited children. It provides information and resources to law enforcement and other professionals, parents and children, including child victims.
  • OnguardOnline.gov is a program of the US’s Federal Trade Commission that provides practical tips to avoid internet fraud.
  • WiredSafety has tools to help young people make smart media and technology choices. Three of their popular programs are STOP cyberbullying, Teenangels and WiredCops.

You can also explore Right To Be’s tools and our Online Harassment resources and learn how other social media platforms are responding to online harassment here.