Cloud Data Engineer - AWS Big Data at Infocepts
Position: Cloud Data Engineer - AWS Big Data
Location: Bangalore, India
Employment Type: Full-time
Experience Required: 5 to 8 years
Purpose of the Position:
Join the Infocepts Cloud Data Architect Team as a Cloud Data Engineer and help design and implement cutting-edge big data solutions on AWS. You will leverage your expertise in EMR, Athena, PySpark, S3, AWS Lambda, and SQL to develop robust and scalable data platforms.
Key Responsibilities:
Technology Assessment and Design:
- Assess existing technology landscape and data integration frameworks.
- Design complex Big Data use cases using AWS services under the guidance of the Architect.
- Support architectural decision-making by evaluating trade-offs in cost, performance, and durability.
- Recommend optimizations to existing data infrastructure.
Documentation and Stakeholder Communication:
- Create project documentation adhering to quality and delivery standards.
- Collaborate closely with Architects and Project Managers for scoping, estimation, and planning.
- Present design decisions to technical and business stakeholders clearly.
- Conduct proofs of concept (PoCs) and design review sessions.
Process Improvement and Automation:
- Identify and suggest opportunities for automation and process enhancements.
- Mentor junior engineers and support technical problem solving.
Training and Knowledge Sharing:
- Prepare and deliver internal training on AWS and Big Data topics.
- Lead client knowledge sharing sessions and contribute to case studies.
Essential Skills:
- In-depth experience with AWS services: S3, EC2, EMR, Athena, Glue, Lambda
- Familiarity with MPP databases like Redshift, Snowflake, or SingleStore
- Proficiency in Apache Spark and Databricks
- Strong programming skills in Python
- Experience building data pipelines using AWS and Databricks (see the sketch after this list)
- Knowledge of Big Data storage and table formats such as Delta Lake
- Advanced SQL skills for large-scale data manipulation
- Hands-on experience with Apache Airflow or similar orchestration tools
- Strong understanding of ETL workflows and data warehousing concepts
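As a rough illustration of the pipeline work listed above, the sketch below reads raw JSON from S3 with PySpark, applies simple cleansing, and writes a partitioned Delta Lake table. It is a minimal sketch only: it assumes a Spark session with Delta Lake support already configured (for example on EMR or Databricks), and the bucket names, column names, and table path are hypothetical.

```python
# Minimal, illustrative PySpark pipeline sketch.
# Assumes a Spark session with Delta Lake support (e.g., EMR or Databricks);
# all S3 paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Read raw landing data from S3 (hypothetical bucket/prefix).
raw = spark.read.json("s3://example-landing-bucket/orders/2024-01-01/")

# Basic cleansing and enrichment.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write a partitioned Delta table to the curated zone.
(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("order_date")
        .save("s3://example-curated-bucket/orders_delta/"))

# Simple SQL check over the Delta table.
spark.sql(
    "SELECT order_date, COUNT(*) AS orders "
    "FROM delta.`s3://example-curated-bucket/orders_delta/` "
    "GROUP BY order_date"
).show()
```

In practice, a job of this shape would typically be scheduled with Apache Airflow or a similar orchestrator and exposed to downstream consumers through the Glue Data Catalog and Athena, which is the kind of end-to-end workflow this role covers.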
Desirable Skills:
- Cloud databases: Amazon Aurora, Snowflake
- Experience with Hadoop and Hive
- AWS Certifications (Associate or Professional level) are a plus
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field (Master's preferred)
- Overall 5+ years of IT experience with at least 3 years in AWS Big Data projects
- Ongoing learning and technical certifications are strongly encouraged
Key Qualities:
- Strong problem-solving and analytical thinking
- Self-driven with a passion for emerging data technologies
- Excellent communication and client presentation skills
- Ability to work in cross-functional, agile teams
Apply now to be part of a high-impact data transformation team working on large-scale cloud data projects!

