Responsibilities
Develop, maintain, and optimize global content and community policies (e.g., Community Guidelines, Advertising Policies, Creator Policies) to ensure platform safety and ecosystem health.
Translate policy principles into clear, actionable moderation standards and operational guidelines to support efficient execution by moderation teams.
Collaborate with cross-functional teams including Moderation, Legal, Product, and Engineering to ensure consistent policy understanding and smooth implementation.
Identify and assess platform risks, monitor policy enforcement outcomes and industry trends, and continuously drive policy improvement and iteration.
Qualifications
Bachelor's degree or above; majors in Law, Public Policy, International Relations, Media Studies, Sociology, or related fields are preferred.
4+ years of experience in Trust & Safety, content policy, platform governance, or related areas; international or cross-border work experience is preferred.
Fluency in English as a working language.
Strong cross-cultural communication skills, with an understanding of user behavior and cultural differences across regions.
Ability to thrive in a fast-paced, multi-time-zone collaborative environment.
Preferred Qualifications
Experience in content policy at global internet platforms such as YouTube, Meta, TikTok, or Twitter/X.
Familiarity with international platform governance and data privacy regulations (e.g., GDPR, DSA, COPPA).
Experience handling cross-border content risks or major global events.