Senior Data Engineer, Social Integrations
Who we are
DoubleVerify is the leader in digital performance solutions, improving the impression quality & audience impact of digital advertising. Built on best practices, DoubleVerify solutions create value for media buyers & sellers by bringing transparency & accountability to the market, ensuring ad viewability, brand safety, fraud protection, accurate impression delivery & audience quality across campaigns to drive performance.
Since 2008, DoubleVerify has helped hundreds of Fortune 500 companies gain the most value out of their media spend by delivering best-in-class solutions across the digital ecosystem that help build a better industry.
As a Senior Data Engineer on the Social Integrations team, you will lead new initiatives and design & build deep integrations with the world's biggest social platforms & other walled gardens in order to measure ad performance.
You will work directly with engineers from the largest internet companies to collaborate on API development.
What you'll do
- Design, develop, & test data-driven products & features
- Train & mentor a team of software engineers
- Design & develop robust applications, services, & APIs that scale
- Improve data quality through testing, tooling & continuous performance evaluation
- Understand business needs & work with product owners to establish priorities
- Bridge the gap between Business / Product requirements & technical details
- Integrate with existing systems & APIs, refactoring them as needed
- Continuously improve quality of deliverables & SDLC processes
- Explore new ways of producing, processing, & analyzing data in order to gain insights into both our users & our product features
- Work in multi-functional agile teams with end-to-end responsibility for product development & delivery
Who you are
- Lead by example - design, develop & deliver quality solutions.
- At least 6 years' experience coding in Python/Scala/Java/C#
- Deep understanding of web technologies, standards, protocols, etc.
- You are passionate about crafting clean code & have a steady foundation in coding & building data pipelines
- Extensive experience working in distributed environments
- Experience integrating with 3rd party APIs
- Experience with big data technologies & working with data (ETL, processing)
- Very good SQL query-writing abilities & data understanding
- Have good DevOps skills - working with build servers, Docker & container clusters (Kubernetes)
- Have B.Sc./M.Sc. in Computer Science or a related field
- Excellent communication skills & a team player
- Experience working in a cloud environment (Google Cloud Platform)
- Experience with container technologies - Docker / Kubernetes
- Have a good understanding of ad serving technologies & standards
- Have worked with Kafka