DoubleVerify // digital media measurement software & analytics

Who we are

DoubleVerify is an Israeli-founded big data analytics company (NYSE: DV). We track & analyze tens of billions of ads every day for the biggest brands in the world.
We operate at massive scale: over 100B events per day & over 1M RPS at peak, processed in real time at millisecond latencies, plus over 2.5M video years analyzed every day. We verify that ads are fraud-free, appear next to appropriate content & are shown to people in the right geography, & we measure viewability & user engagement throughout the ad's lifecycle.

We are global, with HQ in NYC & R&D centers in Tel Aviv, New York, Finland, Berlin, Belgium & San Diego. We work in a fast-paced environment & have plenty of challenges to solve. If you like working at huge scale & want to help us build products that have a huge impact on the industry & the web, then your place is with us.

 

What you'll do

You will join the Traffic Team, a core engineering team operating at the heart of the company's measurement system.

Build & maintain high-throughput streaming systems processing 100B+ daily events.

Tackle performance & optimization challenges that make interview questions actually relevant.

Design & implement real-time data processing pipelines using Kafka, Databricks/Spark, & distributed computing.

Lead projects end-to-end: design, development, integration, deployment, & production support.

Who you are

  • 5+ years of software development experience with JVM-based languages (Scala, Java, Kotlin) with strong functional programming skills
  • Strong grasp of Computer Science fundamentals: functional programming paradigms, object-oriented design, data structures, concurrent/distributed systems
  • Proven experience with high-scale, real-time streaming systems & big data processing.
  • Experience with & a deep understanding of a wide array of technologies, including:
      • Stream processing: Kafka, Kafka Streams, or similar frameworks (Flink, Spark Streaming, Pulsar).
      • Concurrency frameworks: Akka, Pekko, or equivalent actor systems/reactive programming.
      • Data platforms: Databricks, Spark, Delta Lake, or similar lakehouse technologies.
      • Microservices & containerization: Docker, Kubernetes.
      • Modern databases: analytical databases (ClickHouse, Snowflake, BigQuery), NoSQL (Cassandra, MongoDB), & columnar stores.
      • Cloud infrastructure: GCP or AWS.
  • Hands-on experience developing with AI tools (Cursor, Claude Code, etc.).
  • Strong DevOps mindset: CI/CD pipelines (GitLab preferred), infrastructure as code, monitoring/alerting.
  • BSc in Computer Science or equivalent experience.
  • Excellent communication skills & ability to collaborate across teams.

Nice to have

  • Previous experience in ad-tech.
  • Experience with schema evolution & data serialization (Avro, Protobuf, Parquet).

#Hybrid#

 
© 2026 GarysGuide