Who We Are
HomeLight is a venture-backed technology startup revolutionizing the $1 trillion real estate industry. Our mission is simple: we empower people to make smarter decisions during one of life's most important moments: buying or selling their home.
HomeLight's technology analyzes millions of home transactions to determine which agent or cash buyer is right for you. We also offer innovative financing & closing solutions, creating an end-to-end real estate experience that's simple, certain, & satisfying.
We pride ourselves on our company culture, but don't just take it from us. We've been recognized as a best place to work by Forbes, Inc. Magazine, & the San Francisco Business Times. Our team breaks barriers every day while staying committed to HomeLight's goals & core values, a commitment that is crucial to our shared success.
Who You Are
We are building our Data Engineering team to tackle HomeLight's diverse data challenges. This position is an excellent opportunity for an engineer who wants to own the development, optimization, & operation of our data pipeline, which collects, processes, & distributes data to a suite of HomeLight products & teams. You will provide mission-critical data to both our algorithms & internal users, refining our product & identifying new markets.
What You'll Do Here
Some projects you will work on:
- Optimize & execute on requests to pull, analyze, interpret, & visualize data.
- Partner with team leaders across the organization to build out & iterate on team & individual performance metrics.
- Optimize our data release processes & partner with team leads to iterate on & improve existing data pipelines.
- Design & develop systems that ingest & transform our data streams using the latest tools.
- Design, build, & integrate cutting-edge databases & data warehouses; develop new data schemas & find innovative ways of storing & representing our data.
- Research, architect, build, & test robust, highly available, & massively scalable systems, software, & services.
What You'll Need
- 3+ years of Python & ETL experience, preferably with Airflow
- Experience writing & executing complex SQL queries
- Experience building data pipelines & designing ETL (implementation & maintenance)
- Experience with Scrum/Agile software development processes
Bonus points for
- Expertise with Ruby on Rails.
- Familiarity with AWS, Elasticsearch, Django, or Heroku.
- Experience setting up & managing internal API services.
- Experience working on a small team, ideally at a startup.