Facebook announced changes designed to keep people safe within Facebook Groups. Notably, Facebook will hold Group admins accountable for what is posted in their groups.
Facebook says one way it keeps people safe is by proactively identifying and removing posts and groups that break its rules. Facebook has been using AI and machine learning “to proactively detect bad content before anyone reports it, and sometimes before people even see it.” It also uses human moderators.
Facebook has created new Group Privacy settings:
- A group that was formerly “secret” will by default become “private” and “hidden”.
- A group that was formerly “closed” will become “private” and “visible”.
- Groups that are “public” will remain “public” and “visible”.
Here are some factors Facebook considers when deciding if a Group should come down:
- Whether the name or description of the group includes hate speech or other content Facebook doesn’t allow.
- Whether group leaders often break Facebook’s rules, or commonly approve posts from other members who break them. Those are clear strikes against the group as a whole.
- Whether individual members repeatedly violate Facebook’s standards. When one does, Facebook will require admins to review that member’s posts before anyone else can see them. If an admin then approves a post that breaks Facebook’s rules, it will count against the whole group.
It sounds like people who participate in Groups on Facebook really need to choose their admins wisely. Facebook’s emphasis that its rules apply within Groups will likely deter those who have been de-platformed from other online spaces. I guess that’s one way to help keep people safe in private Groups.