- Looking for a tech-savvy Data Engineer to design, develop and support ETL interfaces of a big data marketing technology platform built on AWS.
- Understand the existing landscape, then document and optimize the pipelines for best performance.
- Interact with business and marketing users, data scientists and other developers.
Experience: 4-6 years (may consider deviations for exceptional candidates)
Skills and Qualifications
- At least 2 years of working experience with AWS data services such as Redshift, Glue, EMR or the Hadoop stack, and RDS (Aurora)
- Experience building data pipelines and ETLs using Hive (preferred) and Spark
- Strong SQL and a good command of databases
- Strong communication and interpersonal skills
Good to Have
- Expertise in Python and PySpark preferred
- Knowledge of or experience building and optimizing pipelines using Pentaho Data Integration (PDI)
- Ability to leverage data assets to respond to complex questions that require timely answers
Flexibility to work the UK shift (1:30 pm to 10:30 pm IST)