Start Date: January 2026
Contract: 11-month fixed-term (Contingent Worker)
Location: Fully onsite in Dublin
We are seeking a skilled and detail-oriented Content Moderation and Quality Assurance Specialist to join our team on a contract basis. In this role, you will be responsible for training cutting-edge AI models for content moderation, ensuring the accuracy and consistency of moderation decisions, and maintaining high-quality standards across multiple vendor sites.
This is an excellent opportunity for someone passionate about content integrity, online safety, and quality assurance to make a real impact in a fast-evolving space.
Please note: This role involves exposure to potentially graphic or sensitive content, including but not limited to images, videos, or text related to graphic violence, child exploitation, animal abuse, self-injury, or offensive language.
Key Responsibilities
* Content Moderation Quality Assurance: Review and evaluate AI-generated moderation decisions to ensure they meet accuracy, consistency, and policy standards.
* Golden Set Creation: Develop and maintain benchmark datasets ("golden sets") to test and calibrate AI and vendor performance.
* Vendor Calibration: Partner with vendor sites to identify improvement opportunities, provide feedback, and support calibration efforts to enhance quality.
* Vendor Management: Build and maintain strong working relationships with vendors, offering guidance, metrics, and performance insights.
* Process Improvement: Identify opportunities to optimize moderation workflows, tools, and processes in collaboration with internal teams.
* Data Analysis: Analyze data to identify trends and insights that inform quality assurance and performance strategies.
* Cross-functional Communication: Collaborate with internal stakeholders, including operations, policy, and leadership teams, to align on QA initiatives.
* Training & Development: Participate in ongoing training to stay current on evolving policies, procedures, and industry best practices.
* Sensitive Content Handling: Review and manage potentially objectionable or disturbing material in accordance with established safety and wellbeing protocols.
* Societal Impact Assessment: Evaluate the potential societal effects of various forms of online content and behavior, supporting informed moderation practices.
Requirements
* 4+ years of experience in content moderation, quality assurance, or a related field.
* Strong understanding of content moderation policies and workflows.
* Excellent analytical, problem-solving, and communication skills.
* Ability to work both independently and collaboratively in a fast-paced environment.
* Exceptional attention to detail and accuracy.
* Experience managing or collaborating with vendors or external partners is preferred.
* Familiarity with data analysis and reporting tools is an advantage.
* Experience reviewing AI-generated content or conversations is an advantage.