Hadoop Admin Job in Modak Analytics

Job Summary

Job Description:

We are seeking an experienced Hadoop/Cloudera System Administrator who can work under minimal direction. The candidate will manage the organization's live Hadoop infrastructure, including production environments, and will use their Hadoop expertise to support the development team in building big data analytics. The Hadoop Engineer may also collaborate on system architecture and engineering efforts concerning the Hadoop infrastructure.

Responsibilities:

  • Hadoop administration, including multiple products in the big data ecosystem.
  • Hadoop cluster configuration and security.
  • Capacity planning and performance tuning.
  • Automation and configuration management, backed by a strong scripting background.
  • HDFS maintenance and support.
  • Onboarding users onto the platform.
  • Supporting deployments of parcels and analytics.
  • Strong process orientation and good communication skills.
  • Seeks to improve personal job-related knowledge and departmental process by studying state-of-the-art tools, participating in educational opportunities, and reading professional publications.

Skills Required:

  • Ability to support large-scale production Hadoop environments in any of the Hadoop distributions.
  • Proficiency in designing, capacity planning, and cluster setup for Hadoop.
  • Hadoop operational expertise: troubleshooting, identifying bottlenecks, managing data, users, and job execution, and working knowledge of memory, CPU, OS, storage, and networking fundamentals.
  • Experience in at least one scripting language (Perl, shell, Python).
  • Product knowledge of Hadoop distributions such as Cloudera, Hortonworks, or MapR.
  • Administration, maintenance, control, and optimization of Hadoop capacity, security, configuration, process scheduling, and error handling.
  • Development or administration experience on any NoSQL technology.
  • Development/scripting experience with configuration management and provisioning tools, e.g. Puppet or Chef.
  • Development, implementation, or deployment experience on the Hadoop ecosystem (HDFS, MapReduce, Hive, HBase).
  • Analysis and optimization of workloads, performance monitoring and tuning, and automation.
  • Addressing the challenges of query execution across a distributed database platform.
  • Proficiency with at least one of the following: Java, Python, or Perl.
  • Experience in tool integration, automation, and configuration management on Git and Jira platforms.
  • Excellent oral and written communication, presentation skills, analytical and problem-solving skills.
  • Self-driven, ability to work independently and as part of a team.

Experience Required:

3 to 10 Years

Vacancy:

2 to 4 hires
