Content Moderator
Job summary
The Content Moderator (Trust & Safety Specialist) is responsible for reviewing and monitoring user-generated content to ensure compliance with platform policies, legal requirements, and community standards. This role plays a vital part in maintaining a safe, respectful, and engaging online environment by identifying and removing content that violates guidelines.
Job description & requirements
Responsibilities:
- Review and act on reported content, including text, images, and videos, ensuring it meets platform guidelines. Focus will be on high-priority queues and edge cases that require human judgment.
- Monitor daily queues to identify new patterns of abuse (e.g., new spam techniques, coordinated hate campaigns) and escalate them to the Policy team immediately.
- Provide feedback on the efficiency of moderation tools. Suggest workflow changes that can increase review speed without sacrificing accuracy.
- Maintain a high accuracy rate (95%+) on all moderation decisions. Participate in calibration sessions with the team to ensure consistency in applying policies.
- Provide constructive feedback to Policy teams when guidelines are unclear or conflict with real-world context, helping to refine the rulebook for thousands of moderators.
- Investigate cases where content was removed or accounts were suspended, making final determinations on reinstatement requests with a focus on fairness and due process.
- Serve as a designated responder during “red alert” situations, such as graphic live-streamed events or coordinated harassment campaigns.
Requirements:
- Minimum of 4 years of experience in Content Moderation, Trust & Safety Operations, or Community Management for a major tech/social media platform.
- Ability to spot subtle violations that automated systems miss (e.g., hate symbols hidden in images).
- Comfortable using moderation tools (e.g., Hive, Besedo, Salesforce) and Google Workspace.
- Experience handling spikes in content volume during global events or viral challenges.
- High level of emotional fortitude. Must be comfortable reviewing disturbing content (violence, hate speech, adult content) and have proven strategies for digital wellness.
- Deep understanding of regional nuances, cultural sensitivities, and historical contexts. Ability to distinguish between hate speech and protected political speech, or between violent extremism and documentary/news content.
- Proven track record of maintaining quality metrics while processing a high volume of content (e.g., 80-100+ pieces per hour).
- Ability to stay focused during repetitive tasks without losing attention to detail.
Remuneration: NGN 500,000 monthly