
Hadoop Spark With Scala | 6 To 9 Years | Bengaluru & Mumbai

Job Description
  • Hadoop, Kafka, and Spark development
  • Spark development with Scala, mainly Streaming and SQL (see the sketch after the skills lists)
  • Kafka development with Scala
  • Knowledge of Hadoop architecture: HDFS, YARN, HBase, Spark, Kafka
Primary Skills
  • Spark (SQL, Streaming)
  • Scala
  • Kafka
Secondary Skills
  • Development with Maven, SVN, and Git (preferred but not mandatory)
  • Other Hortonworks Data Platform components such as Ranger and Zeppelin (if available)
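For context, here is a minimal sketch of the kind of work the role describes: a Spark Structured Streaming job in Scala that consumes a Kafka topic and runs a simple Spark SQL aggregation. The broker address, topic name, window size, and console sink are illustrative assumptions, not details from the posting (the spark-sql-kafka connector must be on the classpath).

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions._

  // Illustrative sketch only: broker "localhost:9092" and topic "events" are assumptions.
  object KafkaStreamSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("kafka-stream-sketch")
        .getOrCreate()
      import spark.implicits._

      // Read the Kafka topic as a streaming DataFrame and keep the message value as a string.
      val events = spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "events")
        .load()
        .selectExpr("CAST(value AS STRING) AS value")

      // Simple aggregation: count messages per one-minute processing-time window.
      val counts = events
        .withColumn("ts", current_timestamp())
        .groupBy(window($"ts", "1 minute"))
        .count()

      // Write the running counts to the console sink for illustration.
      val query = counts.writeStream
        .outputMode("complete")
        .format("console")
        .start()

      query.awaitTermination()
    }
  }

In a production setting the console sink would typically be replaced by a durable sink (for example HDFS or HBase, both named in the job description) and an event-time column with a watermark would be used instead of processing time.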

Ref: 509223
Posted on: September 9, 2020
Experience level: Experienced
Contract type: Permanent
Location: Bangalore
Business units: SBU Shared Services
Department: Big Data & Analytics
Experience Required: Fresher
Vacancy: 2 - 4 Hires
