April 24 (UPI) -- Facebook on Tuesday published the internal rules its moderators use when deciding whether to remove content and announced the launch of an appeals option.
The social media giant, which has 2.1 billion users and counting, said it wants to make sure the public knows how decisions are made to remove content.
Facebook's updated Community Standards guidelines are what moderators use when deciding what is or is not acceptable content -- including bullying, gun sales, nudity and hate speech.
Moderators working for Facebook sift through millions of reports each week from users about inappropriate posts, groups or pages. Posts are also flagged by the company's automated systems.
The new guidelines give users a way to appeal a decision if they believe content was unfairly removed, or to appeal when content they reported was not removed. Appeals will be sent to a human moderator, who will issue a decision within 24 hours.
To engage with communities about what is working and what is not, Facebook is launching a series of forums beginning in Europe and coming to the United States and other countries later this year.
Last month, Facebook unveiled changes to its privacy settings -- making it easier for users to control what they share and to delete data the company has collected.
In a statement, Facebook said it heard "loud and clear" that privacy settings are difficult to find and access on the social networking site.
The new privacy settings came after it was revealed that Facebook shared the personal data of nearly 50 million users with Cambridge Analytica, a British firm accused of using the information to target political ads on behalf of President Donald Trump.