Cloud Architect Job in Datametica

Job Summary

Job Description:

Key Responsibilities:

- Attend requirements-gathering workshops, estimation discussions, design meetings, and status review meetings
- Design solutions and architectures for the data engineering model to build and implement Big Data projects on-premises and in the cloud
- Align the architecture with business requirements and stabilize the developed solution
- Build prototypes to demonstrate the technical feasibility of your vision
- Facilitate and lead solution design, architecture, and delivery planning for data-intensive, high-throughput platforms and applications
- Benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
- Help programmers and project managers with the design, planning, and governance of projects of any kind
- Develop, construct, test, and maintain architectures, and run sprints for the development and rollout of functionality
- Perform data analysis and code development, ideally with Big Data tools such as Spark, Hive, Hadoop, Java, Python, and PySpark
- Execute projects of various types, i.e. design, development, implementation, and migration of functional analytics models/business logic across architecture approaches
- Work closely with business analysts to understand the core business problems and deliver efficient IT solutions for the product
- Deploy sophisticated analytics code on any cloud platform

Key Mandatory Skills:

- Hands-on experience designing, coding, developing, and managing Cloud Data Pipeline/Data Warehouse applications
- Ability to translate complex functional and technical requirements into detailed design and development
- Experience implementing and deploying high-performance, custom applications at scale on Hadoop
- Hands-on experience in cloud-native programming for Data Warehousing projects, with a Java- or Python-based programming background
- Experience with any of the following cloud-native stacks:
  - GCP: BigQuery, Dataflow, Dataproc, Composer, Cloud Functions, etc.
  - Azure: Synapse, Databricks, HDInsight, Data Factory, Azure Functions, etc.
  - AWS: Redshift, Glue, Athena, EMR, Airflow, AWS Lambda, etc.
- Proficiency with development methodologies such as waterfall, agile/scrum, and iterative

Professional Attributes:

- Exceptional communication, organization, and time management skills
- Ability to coach and mentor the team to reach their highest potential
- Collaborative approach to decision-making
- Strong analytical skills

Experience Required:

8 to 16 Years

Vacancy:

2 - 4 Hires
