Odixcity Consulting

Content Moderator

Software & Data

2 days ago

Job summary

The Content Moderator (Trust & Safety Specialist) is responsible for reviewing and monitoring user-generated content to ensure compliance with platform policies, legal requirements, and community standards. This role plays a vital part in maintaining a safe, respectful, and engaging online environment by identifying and removing content that violates guidelines.

Min Qualification: Degree
Experience Level: Mid-level
Experience Length: 4 years

Job description & requirements

Responsibilities:

  • Review and act on reported content, including text, images, and videos, ensuring it meets platform guidelines. Focus will be on high-priority queues and edge cases that require human judgment.
  • Monitor daily queues to identify new patterns of abuse (e.g., new spam techniques, coordinated hate campaigns) and escalate them to the Policy team immediately.
  • Provide feedback on moderation tool efficiency, and suggest workflow changes that can increase review speed without sacrificing accuracy.
  • Maintain a high accuracy rate (95%+) on all moderation decisions. Participate in calibration sessions with the team to ensure consistency in applying policies.
  • Provide constructive feedback to Policy teams when guidelines are unclear or conflict with real-world context, helping to refine the rulebook for thousands of moderators.
  • Investigate cases where content was removed or accounts were suspended, making final determinations on reinstatement requests with a focus on fairness and due process.
  • Serve as a designated responder during “red alert” situations, such as graphic live-streamed events or coordinated harassment campaigns.


Requirements:

  • Minimum of 4 years of experience in Content Moderation, Trust & Safety Operations, or Community Management for a major tech/social media platform.
  • Ability to spot subtle violations that automated systems miss (e.g., hate symbols hidden in images).
  • Comfortable using moderation tools (e.g., Hive, Besedo, Salesforce) and Google Workspace.
  • Experience handling spikes in content volume during global events or viral challenges.
  • High level of emotional fortitude. Must be comfortable reviewing disturbing content (violence, hate speech, adult content) and have proven strategies for digital wellness.
  • Deep understanding of regional nuances, cultural sensitivities, and historical contexts. Ability to distinguish between hate speech and protected political speech, or between violent extremism and documentary/news content.
  • Proven track record of maintaining quality metrics while processing a high volume of content (e.g., 80-100+ pieces per hour).
  • Ability to stay focused during repetitive tasks without losing attention to detail.

Remuneration: NGN 500,000 monthly

Important safety tips

  • Do not make any payment without confirming with the Jobberman Customer Support Team.
  • If you think this advert is not genuine, please report it via the Report Job link below.
