Responsibilities
- Review and moderate user-generated content (text, images, videos, and audio) in line with company policies and community guidelines.
- Support quality assurance initiatives by identifying gaps and suggesting process improvements.
- Identify, assess, and take appropriate action on content that violates platform standards.
- Provide advisory support and recommendations on content moderation decisions and best practices.
- Escalate sensitive or high-risk cases to relevant teams in a timely and accurate manner.
- Stay up to date with changes in policies, regulations, and emerging content trends.
- Collaborate with internal teams to improve moderation workflows and policy enforcement.
Requirements
- Bachelor’s degree or relevant experience in content moderation, trust & safety, or customer support.
- English proficiency at C1/C2 level (CEFR).
- Strong analytical and decision-making skills with high attention to detail.
- Ability to handle sensitive or potentially distressing content in a professional manner.
- Excellent written and verbal communication skills.
- Ability to work both independently and as part of a team while adhering to strict guidelines and deadlines.
- Flexibility to work onsite, in a hybrid arrangement, or fully remotely.
Preferred Qualifications (Nice to Have)
- Previous experience in content moderation or advisory roles.
How to Apply
- By applying, you agree that we may create a profile for you on Simera to continue your application.