THE ROLE
As a Data Platform Engineer at MPOWER, you will be designing & scaling the cloud-native data systems that power our analytics, operations, & mission-driven decisions. This role is perfect for someone who thrives on building automated pipelines, shaping resilient infrastructure, & enabling insights that impact students around the world.
You'll join a high-impact, cross-functional team that values automation, observability, & secure-by-default architecture. You'll work closely with engineering, analytics, & product stakeholders to ensure data flows seamlessly, securely, & with purpose. Your responsibilities will include, but are not limited to:
- Building, deploying, & maintaining production-grade data workflows using Apache Airflow, AWS Glue, & Lambda to support analytics & business operations (see the Airflow sketch after this list)
- Managing & provisioning cloud-native infrastructure using Terraform, including services such as Redshift, S3, RDS, Athena, & IAM
- Developing & maintaining automated CI/CD pipelines (e.g., Bitbucket Pipelines, GitHub Actions) to deploy data jobs & infrastructure seamlessly
- Implementing & tuning monitoring, alerting, & logging systems using Datadog, CloudWatch, & custom dashboards to ensure platform reliability & observability (see the CloudWatch sketch after this list)
- Writing & optimizing efficient Python & SQL code to support data ingestion, transformation, & workflow automation
- Enforcing secure data operations by applying best practices with IAM, AWS Secrets Manager, & audit-compliant access controls (see the Secrets Manager sketch after this list)
- Partnering with cross-functional stakeholders to understand analytical & operational data needs & deliver scalable solutions
- Recommending & integrating tools & practices for observability, data validation, & performance optimization across the data platform
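To give a concrete flavor of the workflow-building work above, here is a minimal Airflow sketch of a daily extract-transform-load DAG. The DAG id, schedule, & task bodies are hypothetical placeholders, not MPOWER's actual pipelines, & it assumes Airflow 2.4+.

```python
# Minimal illustrative Airflow DAG: extract, transform, & load on a daily
# schedule. All ids & names here are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw data (e.g., from S3) for the run's logical date.
    print(f"extracting data for {context['ds']}")


def transform(**context):
    # Placeholder: clean & reshape the extracted data.
    print("transforming data")


def load(**context):
    # Placeholder: write the transformed data to the warehouse (e.g., Redshift).
    print("loading data")


with DAG(
    dag_id="example_daily_etl",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ parameter name
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```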
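Likewise, platform observability often comes down to emitting custom metrics that dashboards & alarms can watch. Below is a small sketch using boto3's CloudWatch client; the namespace, metric, dimension names, & region are illustrative assumptions.

```python
# Illustrative sketch: publish a custom pipeline-health metric to CloudWatch
# with boto3. Namespace, metric, & dimension names are hypothetical.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # region is an assumption


def report_rows_loaded(pipeline_name: str, row_count: int) -> None:
    """Emit a row-count metric so dashboards & alarms can track pipeline output."""
    cloudwatch.put_metric_data(
        Namespace="DataPlatform/Pipelines",  # hypothetical namespace
        MetricData=[
            {
                "MetricName": "RowsLoaded",
                "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
                "Value": row_count,
                "Unit": "Count",
            }
        ],
    )


if __name__ == "__main__":
    report_rows_loaded("example_daily_etl", 12345)
```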
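And for secure data operations, credentials are typically resolved at runtime from AWS Secrets Manager rather than stored in code or config. A minimal sketch, assuming a JSON-formatted secret under a hypothetical name:

```python
# Illustrative sketch: fetch database credentials from AWS Secrets Manager
# at runtime instead of hard-coding them. The secret name is hypothetical.
import json

import boto3


def get_db_credentials(secret_name: str = "prod/warehouse/credentials") -> dict:
    """Return the secret's JSON payload as a dict (e.g., username/password)."""
    client = boto3.client("secretsmanager", region_name="us-east-1")  # region is an assumption
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])


if __name__ == "__main__":
    creds = get_db_credentials()
    print(sorted(creds.keys()))  # log key names only, never secret values
```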
THE QUALIFICATIONS
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field
- 4-6 years of experience in Data Platform, Data Engineering, DevOps, or DataOps roles
- Proven experience working with cloud-native data platforms, especially AWS
- Strong proficiency in Python & SQL for data transformation, automation, & workflow logic
- Hands-on experience with orchestration tools such as Apache Airflow, AWS Glue, & Lambda for managing ETL/ELT pipelines
- Deep knowledge of AWS services including Redshift, S3, IAM, RDS, & Athena, & experience managing them via Terraform (IaC)
- Proven ability to build & manage CI/CD pipelines (e.g., Bitbucket Pipelines, GitHub Actions) for data infrastructure automation
- Experience with monitoring & alerting tools like Datadog & CloudWatch, with a focus on reliability & observability
- Strong problem-solving & collaboration skills, with the ability to work cross-functionally & communicate technical concepts clearly to non-technical stakeholders
- Excellent written & verbal English communication skills
A passion for financial inclusion & access to higher education is a must, as is comfort working with a global team across multiple time zones & locations!
In addition, you should be comfortable working in a fast-growth environment: a small, agile team, fast-evolving roles & responsibilities, a variable workload, tight deadlines, a high degree of autonomy, & 80-20 everything.