Sr. Data Engineer - AWS - Big Data
Infocepts
14 hours ago
- Bengaluru, Bangalore Urban, Karnataka
- Not Disclosed
- Full-time
Job Summary
Sr. Data Engineer - AWS - Big Data
Location: Bangalore
Type of Employment: Full-Time
Experience Required: 7 to 10 years
Job Overview:
We are seeking a highly skilled Sr. Data Engineer with expertise in AWS cloud technologies and Big Data to join our Cloud Data Architect Team at Infocepts. In this critical role, you will design and implement robust data solutions using technologies like EMR, Athena, PySpark, AWS Lambda, S3, and other AWS services. The ideal candidate will have a strong foundation in database concepts and SQL and will be responsible for building scalable data pipelines to support high-performance data processing.
Key Responsibilities:
- Technology Assessment and Design: Study the existing technology landscape and evaluate current data integration frameworks. Assist in designing complex Big Data use cases leveraging AWS services.
- Documentation and Stakeholder Communication: Prepare and maintain comprehensive project documentation, adhering to quality guidelines and schedules. Work closely with Architects and Project Managers to provide accurate estimations, scoping, and scheduling assistance. Clearly communicate design decisions and conduct Proof-of-Concepts to validate new solutions before implementation.
- Process Improvement and Automation: Identify areas for process automation to improve efficiency and team productivity. Provide expert guidance and troubleshooting support to junior Data Engineers.
- Training and Knowledge Sharing: Develop and deliver technology-focused training sessions for the team, ensuring continuous knowledge sharing. Share expertise through Expert Knowledge Sharing sessions with Client Stakeholders.
Essential Skills:
- AWS Services Expertise: In-depth knowledge of S3, EC2, EMR, Athena, AWS Glue, and Lambda.
- Big Data Technologies: Proficiency with Apache Spark, Databricks, and Big Data table formats such as Delta Lake (open-source).
- Data Warehousing: Strong understanding of data warehousing concepts and architectures.
- Programming Skills: Advanced programming skills in Python for building data pipelines.
- SQL Expertise: Strong SQL skills for data transformation, aggregation, and querying large datasets.
- ETL Workflow Development: Expertise in creating ETL workflows with complex transformations (e.g., SCD, deduplication, aggregation).
- Orchestration Tools: Familiarity with orchestration tools like Apache Airflow.
- MPP Databases: Experience with at least one MPP database (e.g., AWS Redshift, Snowflake, SingleStore).
- Cloud Databases: Exposure to cloud databases like Snowflake or AWS Aurora.
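To illustrate the kind of ETL transformation the role calls for (deduplication, keeping only the latest record per key), here is a minimal plain-Python sketch; the field names `customer_id` and `updated_at` are assumed for the example, and the PySpark equivalent would typically use a `row_number()` window ordered by the timestamp, filtered to the first row.

```python
from itertools import groupby
from operator import itemgetter

def deduplicate_latest(rows, key="customer_id", ts="updated_at"):
    """Keep only the most recent record per key -- a common
    deduplication step in ETL pipelines."""
    # Sort by (key, timestamp) so that within each key group the
    # newest record is last, then take the last record per group.
    ordered = sorted(rows, key=itemgetter(key, ts))
    return [list(group)[-1] for _, group in groupby(ordered, key=itemgetter(key))]

records = [
    {"customer_id": 1, "updated_at": "2024-01-01", "city": "Pune"},
    {"customer_id": 1, "updated_at": "2024-03-15", "city": "Bengaluru"},
    {"customer_id": 2, "updated_at": "2024-02-10", "city": "Mumbai"},
]
deduped = deduplicate_latest(records)
# keeps the 2024-03-15 row for customer 1 and the single row for customer 2
```

In Spark, the same pattern scales to large datasets via `Window.partitionBy(key).orderBy(desc(ts))` with a `row_number()` filter, or `dropDuplicates` when any surviving row is acceptable.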
Desirable Skills:
- Cloud Databases: Familiarity with Snowflake, AWS Aurora.
- Big Data Technologies: Experience with Hadoop and Hive.
- AWS Certification: Associate or Professional Level AWS Certification.
- Advanced Knowledge of Big Data Solutions: Exposure to big data tools and frameworks on cloud platforms.
Qualifications:
- Experience: 7+ years of overall IT experience, with 5+ years specifically focused on AWS-related projects.
- Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree is a plus).
- Technical Certifications: Demonstrated commitment to continuous learning through certifications or relevant training.
Qualities:
- Strong analytical and problem-solving skills, with the ability to dive deep into complex technical challenges.

