ShopRunner // network of retailers
 
Engineering, Full Time    Chicago, Conshohocken    Posted: Friday, March 26, 2021
 
   
 
Apply To Job
 
 
JOB DETAILS
 

ABOUT US

At ShopRunner, we connect consumers to the brands they love by way of a thriving marketplace & a members-only service that provides benefits across 100+ of the top brands & retailers in the ecommerce space. We build products that provide a premier ecommerce experience by delivering elevated, data-driven content. The landscape of retail is changing & we're here to empower retailers to take their place in that exciting evolution. 

We have people in offices around the world: headquartered in Chicago, with offices in New York; Conshohocken, PA (Philly area); & Krakow, Poland.

ABOUT THE ROLE:

As a Data Engineer at ShopRunner, you'll help power many of our data-backed solutions & manage the large-scale data we ingest from our merchant partners. Data Engineering works in collaboration with our Enterprise, Consumer, & Data Science teams to bring data models to production, power solutions for our merchant partners, & create more personalized experiences for our customers in our never-ending quest to help our shoppers & retailers connect in new ways & through new applications.

ABOUT WHAT YOU'LL DO

  • Own the key data pipelines which enable real-time event handling, smarter personalization, & more nimble applications.
  • Develop frameworks to productize our machine-learning models that give our members more product choices. 
  • Help us define & manage our big data infrastructure, including Kinesis/Kafka streams, Apache Spark, & the Snowflake data warehouse.
  • Help us evolve our service architecture, embracing approaches such as 12-factor apps, microservices, & well-formed APIs to allow our architecture to scale both internally & externally.

ABOUT WHAT WE'RE LOOKING FOR

  • Experience working in an Agile environment & using a VCS like Git.
  • Experience writing production code in Python or JVM-based systems.
  • Experience with data stores & technologies such as Spark, Airflow, Elasticsearch, Kinesis, Kafka, Postgres, MySQL, & Snowflake, plus a strong understanding of SQL.
  • Experience with building REST APIs to serve & consume data.
  • Experience with building batch/streaming ELT pipelines to move & transform data.
  • Experience with data transformation tools & techniques & workflow management.
  • Experience optimizing large applications to increase speed, scalability, & extensibility.
  • Proven self-starter with a strong desire to learn new technologies, able to work independently but knowing when to seek help when blocked.

We want you to bring your whole human self to work every day. We accept you for who you are & consider everybody on an equal opportunity basis without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. 

 
 
 
 
 
 
 
 
© 2021 GarysGuide