CLEAR transforms what is uniquely you (your fingerprints, your face, your eyes) into a secure, biometric key to frictionless experiences. We are creating a world where travel is effortless, where accessing your office building is as simple as walking in, and where shopping is as easy as walking in and out of a store without ever once showing an ID or credit card. CLEAR currently powers secure, frictionless customer experiences in 30 U.S. airports and venues. With over 3 million members so far, CLEAR is the identity platform of the future, today.
Through cutting-edge biometrics and advanced, Homeland Security-certified data algorithms, CLEAR products guarantee identity and protect travelers and sports fans while speeding them through security.
CLEAR is continuously extending its platform with new innovations, products, and IP. The Science & Analytics team delivers valuable insights to both internal and external partners via its next-generation data platform. Leveraging its capabilities, we work alongside partners to understand their most pressing data needs, then build trusted solutions to support them. Sometimes that solution takes the form of an internal BI dashboard, sometimes a mathematical model delivered in a product feature, and in other cases we create solutions that give CLEAR a differentiated competitive advantage in the marketplace.
We're seeking an interdisciplinary data engineer to focus on building the data platform that feeds our mathematical models and algorithmic research. As a critical member of our research and development team, you will have a prominent voice in the future of our company. You're a deep thinker who enjoys solving critical problems and can own a solution from end to end. You will build a highly scalable, high-throughput model-processing pipeline to support our large-scale, real-time predictive risk platform. You will have considerable autonomy in setting and executing a plan to build out the platform.
What You Will Do:
- Work in a highly collaborative, team-oriented, intellectually curious environment.
- Collaborate with our project manager and the VP of Data Science to distill ambiguous business requirements into detailed platform architectures and designs.
- Build a pioneering platform based on those business requirements, including provisioning compute clusters in virtual private clouds (VPCs) and block storage in AWS.
- Install and scale data warehouse (DWH) instances in the VPC.
- Work with the VP of Data Science to identify data sources to ingest into the Experimentation Platform.
- Build and scale star/snowflake data models.
- Build ETL code to ingest data from disparate sources into the Experimentation Platform.
- Interface with the Data Science team to install and set up the appropriate analytics warehouse and data science tools.
- Ingest new data as requested.
Who You Are:
- Bachelor's degree in computer science or 3+ years of experience working with databases.
- Experience designing and coding Big Data solutions using Spark, Snowflake, Kafka, or similar technologies.
- Experience working with service-oriented architectures, web services, and cloud technologies in AWS.
- Demonstrated, results-oriented track record of delivering creative data solutions.
- Strong knowledge of data structures, algorithms, enterprise systems, and asynchronous architectures.