Who we are
DoubleVerify is an Israeli-founded big data analytics company (NYSE: DV). We track & analyze tens of billions of ads every day for the biggest brands in the world. We operate at massive scale: over 100B events per day & over 1M RPS at peak, processing events in real time at millisecond latencies & analyzing over 2.5M video-years every day. We verify that ads are fraud-free, appear next to appropriate content, & reach people in the right geography, & we measure viewability & user engagement throughout the ad lifecycle.
We are global, with HQ in NYC & R&D centers in Tel Aviv, New York, Finland, Berlin, Belgium & San Diego. We work in a fast-paced environment & have plenty of challenges to solve. If you like working at huge scale & want to help us build products with a major impact on the industry & the web, your place is with us.
What will you do
You will join a team of experienced engineers & help them in developing our innovative classification products.
You will lead projects by architecting, designing & implementing solutions that will impact the core components of our system.
You'll develop new & awesome features leveraging a cloud-native technology stack, continuously improve our development process by adopting new technologies & using them to solve product & engineering challenges, all while raising the bar for code quality & standards.
Who you are
- 5+ years of experience coding in an industry-standard language such as Scala, Java, Rust, Clojure, Kotlin, Go, etc.
- Deep understanding of Computer Science fundamentals: object-oriented design, functional programming, data structures, multi-threading & distributed systems.
- Experience with in-memory distributed caches such as Aerospike or Redis & messaging systems such as Apache Kafka.
- Experience working with Docker, Kubernetes & designing scalable microservices architecture.
- Experience working with SQL databases (MySQL, PostgreSQL) & columnar/NoSQL databases (BigQuery, Vertica, Snowflake, Couchbase, Cassandra, etc.).
- Experience working in a big data environment & building scalable distributed systems with stream processing technologies such as Akka Streams, Kafka Streams, Spark, or Flink.
- Experience working with cloud providers such as GCP or AWS.
- BSc in Computer Science or equivalent experience.
- Experience with Agile development, CI/CD pipelines (Git, GitLab, or Jenkins) & coding for automated testing.
- A versatile developer with a get-things-done attitude.
Nice to have
- Previous experience with online advertising technologies is a big plus.
- Familiarity with the Cloud Native Computing Foundation (CNCF) landscape.
#Hybrid