Reporting and Moderation Guide
Moderation maintains a healthy community. If you know how reporting works and what evidence helps, your reports become far more useful — and moderators can act faster to protect users.
When to report
Report behavior that violates the site’s rules: harassment, explicit content shared without consent, threats, and grooming. For immediate danger or criminal activity, contact local authorities first, then report to the platform.
What information helps
Screenshots, timestamps, and session IDs are invaluable. Describe what happened succinctly and include any message content that illustrates the violation. Avoid editorializing — stick to facts so moderators can assess the incident quickly.
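As a rough illustration, a well-organized report can be thought of as a small, factual record. The sketch below (TypeScript) shows one possible shape; the field names and categories are hypothetical and do not reflect any specific platform’s reporting form or API.

  // Hypothetical shape of a user report; field names are illustrative only.
  interface AbuseReport {
    reportedUser: string;        // username or user ID of the person being reported
    category: "harassment" | "threats" | "non_consensual_content" | "grooming" | "other";
    occurredAt: string;          // ISO-8601 timestamp of the incident
    sessionId?: string;          // session or room ID, if the platform exposes one
    description: string;         // short, factual summary of what happened
    evidence: string[];          // screenshot filenames or message excerpts
  }

  // Example: specific, factual, no editorializing.
  const report: AbuseReport = {
    reportedUser: "user_12345",
    category: "harassment",
    occurredAt: "2024-05-04T21:13:00Z",
    sessionId: "room-8842",
    description: "Sent repeated insults after being asked to stop (see excerpts).",
    evidence: ["screenshot-01.png", "screenshot-02.png"],
  };

Even when a platform only offers a free-text box, covering these same fields (who, what, when, where, and supporting evidence) makes the report easier to act on.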
How moderation processes usually work
Most platforms queue reports and triage them by severity. Urgent violations get faster review; minor issues may trigger warnings or temporary restrictions. Repeat offenders tend to face stronger penalties such as temporary or permanent bans.
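To make the triage idea concrete, here is a minimal sketch of severity-based ordering. The severity scale and the tie-breaking rule are assumptions for illustration, not a description of how any particular platform ranks its queue.

  // Minimal sketch of severity-based triage: higher-severity reports are reviewed first.
  type Severity = 1 | 2 | 3 | 4; // 4 = most urgent

  interface QueuedReport {
    id: string;
    category: string;
    severity: Severity;
    submittedAt: number; // Unix timestamp in ms
  }

  // Most severe first; within the same severity, oldest report first.
  function triage(queue: QueuedReport[]): QueuedReport[] {
    return [...queue].sort(
      (a, b) => b.severity - a.severity || a.submittedAt - b.submittedAt
    );
  }

The practical takeaway for reporters: clear evidence of a severe violation tends to move a report toward the front of the queue.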
Appeals and transparency
If you or someone you reported is sanctioned, many platforms offer an appeals process. Provide supporting information for your appeal, and keep communications factual. Transparency from platforms — clear rules and visible outcomes — builds trust.
Protecting reporters
Some services allow anonymous reporting to shield the reporter’s identity. If you’re worried about retaliation, use these features. Before submitting screenshots or logs as evidence, redact your own personal details from them.
Be a proactive community member
Block repeat offenders, avoid engagement that escalates conflict, and encourage positive behavior. Small community norms, such as not sharing personal information and staying courteous, reduce conflict and lower the moderation load.
What to expect after reporting
You may receive an automated acknowledgment followed by an outcome notice. If action isn’t taken, consider providing more evidence or escalating via the platform’s support channels.
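As a rough mental model, a report typically moves through a handful of states. The names below are hypothetical and vary by platform; they are only meant to show the kind of progression to expect.

  // Hypothetical report lifecycle; actual statuses differ by platform.
  type ReportStatus =
    | "submitted"      // report received, automated acknowledgment sent
    | "under_review"   // a moderator is assessing the evidence
    | "action_taken"   // warning, restriction, or ban applied
    | "no_action"      // insufficient evidence or no rule violation found
    | "escalated";     // revisited after more evidence or a support request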
Participants who understand reporting and moderation contribute to safer spaces. Your reports matter — they help moderators identify patterns and prioritize the most harmful behavior.