AWS (with PySpark) | 9 to 12 Years | Pune & Bengaluru | Capgemini

Job Description
  • Strong coding experience in Python, Scala, PySpark, and Spark
  • At least 2 years of experience working with Big Data technologies on AWS
  • Build batch and real-time pipelines on AWS (a minimal PySpark sketch follows this list)
  • Solid understanding of, and hands-on experience with, GitHub, Bitbucket, Kinesis, Lambda, EMR, Glue, Athena, Redshift, and S3
  • Good understanding of serverless architecture; solid understanding of databases (SQL and NoSQL)
  • Good understanding of application security in cloud environments
  • AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate certification
  • Solid understanding of Agile/Sprint methodologies; quick learner with a positive attitude and a passion for cloud, data, and development; strong communication, problem-solving, and client-interaction skills
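A minimal PySpark sketch of the kind of S3 batch pipeline described above; the bucket names, paths, and columns (orders, order_ts, status, amount) are hypothetical placeholders, not details taken from this posting:

```python
# Minimal batch-pipeline sketch: read raw data from S3, transform it,
# and write curated Parquet back to S3. All bucket names, paths, and
# column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def run_batch_job():
    # On EMR or Glue the Spark session is provided by the runtime;
    # building it explicitly keeps the sketch runnable locally too.
    spark = SparkSession.builder.appName("s3-batch-pipeline").getOrCreate()

    # Read raw CSV files landed in a hypothetical "raw" S3 zone.
    orders = spark.read.csv(
        "s3://example-raw-bucket/orders/", header=True, inferSchema=True
    )

    # Simple transformation: keep completed orders and aggregate daily revenue.
    daily_revenue = (
        orders
        .filter(F.col("status") == "COMPLETED")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("revenue"))
    )

    # Write curated, partitioned Parquet back to S3, ready to be catalogued
    # by a Glue crawler and queried from Athena or Redshift Spectrum.
    daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-bucket/daily_revenue/"
    )

    spark.stop()


if __name__ == "__main__":
    run_batch_job()
```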
Primary Skills
  • PySpark
  • AWS (Glue, Athena, Lambda, S3, EMR)
Secondary Skills
  • Lead and own the development of code and processes
  • Lead and mentor junior developers
  • NoSQL technologies: MongoDB, Cassandra
  • Apache stack: NiFi, Kafka, HBase
  • Maintain the GitHub repository in a structured and secure way
  • Serverless orchestration using Step Functions and Airflow (a minimal trigger sketch follows this list)
  • Data lake knowledge
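A minimal sketch, assuming a hypothetical Step Functions state machine ARN and input payload, of how such an orchestrated pipeline run could be started from Python with boto3 (an Airflow DAG or a Lambda trigger would be equivalent alternatives):

```python
# Minimal sketch: start a Step Functions execution that orchestrates the batch pipeline.
# The state machine ARN and input payload are hypothetical placeholders.
import json

import boto3


def trigger_pipeline_run(run_date: str) -> str:
    """Start one execution of the (hypothetical) pipeline state machine and return its ARN."""
    sfn = boto3.client("stepfunctions")
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:ap-south-1:123456789012:stateMachine:example-pipeline",
        input=json.dumps({"run_date": run_date}),
    )
    # The returned execution ARN can later be polled with describe_execution.
    return response["executionArn"]


if __name__ == "__main__":
    print(trigger_pipeline_run("2020-08-21"))
```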

Ref: 501026
Posted on: August 21, 2020
Experience level: Experienced
Contract type: Permanent
Location: Mumbai
Business units: I and D Global Practice
Department: Cloud
Experience Required: Fresher
Vacancy: 2 - 4 Hires
