DoubleVerify is the recognized market innovator with the technology & tools that accurately authenticate the quality of digital media & drive ad performance for the world's largest brands. DV provides media transparency & accountability to deliver the highest level of impression quality for maximum advertising performance. Since 2008, DV has helped hundreds of Fortune 500 companies get the most from their media spend by delivering best-in-class solutions across the digital ecosystem that help build a better industry. Learn more at doubleverify.com.
DoubleVerify is seeking a Product Policy Lead to join our Product Policy Team. The Product Policy Lead will be responsible for overseeing & growing a team of policy managers who research & develop our product policy. The person in this role will work across content verticals ranging from misinformation & hate speech to crime, violence & disasters. Candidates should have knowledge of or interest in internet policy, media regulation, and/or law.
Drive the team's strategic planning & effective execution of issue policies.
Guide & oversee policy positions that inform content classification.
Maintain strong working relationships with product management & operations teams to create new policies & implement them across features.
Work closely with cross-functional partners to understand & research public opinion & advertiser perspectives in order to inform policy development, including policy creation & policy adjustments.
Collaborate with linguists & data scientists to create classification models based on product policy.
5+ years of work experience in trust & safety, product policy, content policy, or brand safety & suitability roles, or in media regulatory issues, regulation of user-generated content, or media policy.
Experience in, or working with, the technology industry.
Excellent writing & analytical skills.
Familiarity with web & social media content trends.
Interest in & passion for policy & content classification/moderation.
Experience building & leading teams.
Familiarity with national or international policy proposals or frameworks for addressing harmful content online.
Familiarity with internet policy issues.
Familiarity with content moderation or classification.
Master's degree in Public Policy, International Relations, or Law.