AWS (with PySpark) | 9 to 12 Years | Pune & Bengaluru
Capgemini
4+ weeks ago
- Mumbai, Maharashtra
- Not Disclosed
- Full-time
Job Description
- Strong Python, Scala, PySpark and Spark coding experience
- At least 2 years of experience working with Big Data technologies on AWS
- Build batch and real-time pipelines on AWS (a minimal PySpark sketch follows this list)
- Solid understanding of and hands-on experience with GitHub, Bitbucket, Kinesis, Lambda, EMR, Glue, Athena, Redshift and S3
- Good understanding of serverless architecture; solid understanding of databases, both SQL and NoSQL
- Good understanding of application security in cloud environments
- AWS Certified Developer Associate or AWS Certified Solutions Architect Associate certification
- Solid understanding of Agile/Sprint methodologies
- Quick learner with a positive attitude and a passion for cloud, data and development
- Strong communication, problem-solving and client interaction skills
- PySpark
- AWS (Glue, Athena, Lambda, S3, EMR)
- Lead and own the development of code and processes
- Lead and mentor junior developers
- NoSQL technologies: MongoDB, Cassandra
- Apache stack: NiFi, Kafka, HBase
- Maintain the GitHub repository in a structured and secure way
- Serverless orchestration using Step Functions and Airflow
- Data lake knowledge
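
As an illustration of the PySpark-on-AWS skill set listed above, here is a minimal sketch of the kind of batch pipeline described: it reads raw CSV data from S3, applies a simple aggregation, and writes Parquet output back to S3 for downstream querying via Athena or Glue. All bucket names, paths and column names are hypothetical placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal batch job sketch; bucket and column names are placeholders.
spark = SparkSession.builder.appName("example-batch-pipeline").getOrCreate()

# Read raw CSV input from S3 (placeholder path).
orders = spark.read.option("header", "true").csv("s3://example-bucket/raw/orders/")

# Simple transformation: cast the amount column and aggregate per day.
daily_totals = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Write Parquet output to S3 (placeholder path) for Athena/Glue to query.
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")

spark.stop()
```

In practice a job like this would typically be packaged as an AWS Glue job or an EMR step and triggered by a scheduler or a Step Functions workflow, as referenced in the requirements above.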
Ref: 501026
Posted on: August 21, 2020
Experience level: Experienced
Contract type: Permanent
Location: Mumbai
Business units: I and D Global Practice
Department: Cloud
