- Craft and maintain efficient data pipeline architectures.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for efficient extraction, transformation, and loading of data from a wide variety of data sources using AWS big data technologies.
- Build data applications in cloud-based ecosystems such as Amazon AWS and Microsoft Azure.
- Build data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader.
- Work with data and analytics specialists to strive for greater functionality in our data systems.
- Experience building and optimizing 'big data' pipelines, architectures, and data sets.
- Strong analytical skills for working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Certification in at least one cloud platform, such as Azure or AWS.
- 3-5 years of experience operating critical, highly available IT infrastructure.
- Solid grasp of message queuing, stream processing, and highly scalable big data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.