Data Engineer (Spark, Python, AWS) Job at Clairvoyant

Data Engineer (Spark, Python, AWS)

Job Summary

Role: Data Engineer

Number of open positions: 3

Location: Pune/Hyderabad (currently remote)

At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of big data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.

Must-Have:

  • Experience in designing, implementing, and testing Python applications, with an understanding of object-oriented programming, multi-tier architecture, and parallel/multi-threaded programming.
  • A suitable candidate should have 4+ years of experience in developing enterprise-level applications, possess a go-getter attitude, and be able to deal with ambiguity.
  • Programming experience with Python is a must; Java/Scala is a nice-to-have.
  • Experience with AWS technologies including AppSync, Lambda, Step Functions, and EventBridge.
  • Solid experience with AWS services such as CloudFormation, S3, Glue, EMR/Spark, RDS, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
  • Knowledge of modern cloud-native data warehouses such as Amazon Redshift, Snowflake, or Azure Synapse Analytics.
  • Experience implementing metadata solutions leveraging AWS non-relational data services such as ElastiCache and DynamoDB.
  • Working experience with Python and Spark is desired.
  • Experience working on Agile projects.
  • AWS Solutions Architect or AWS Big Data Certification preferred
  • Ability to thrive in a high-energy, high-growth, fast-paced, entrepreneurial environment. Willing to learn new skills and implement new technologies
  • Strong organizational, written, and verbal communication skills with high attention to detail and the ability to work independently with minimal supervision
  • Highly collaborative team player who can build strong relationships at all levels of the technology and business organization.

Role & Responsibilities:

  • The person will be part of the EDP Data Platform team for a major insurance client. They will work with different stakeholders to architect and build an EDP application platform that supports the client's internal data teams in onboarding and providing data in the Data Cloud.
  • This application would be architected using a microservices architecture to support different workloads such as data marts, data warehouses, and AI/ML.
  • The person will also be required to participate actively in brainstorming sessions for improvement opportunities and take them to completion.
  • Experience in life insurance is preferred but not mandatory
  • Superior analytical and problem-solving skills
  • Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision
  • Good communication skills for client interaction.

Education: BE/B.Tech from a reputed institute

Experience Required:

3 to 6 Years

Vacancy:

2 - 4 Hires
