Data Engineer | GCP Cloud | Python | ETL Job at DataToBiz

Job Summary

This is a hands-on, entrepreneurial position. In this key engineering role you will design, develop, deliver, and support data solutions, collaborating with team members in Design, Engineering, and Product Management as well as with customers and other stakeholders.
You must be comfortable and effective as an engineering team player in our dynamic and fast-paced environment. Our culture encourages experimentation and favours agile and rapid iteration over the pursuit of immediate perfection. We are flexible and supportive of remote work arrangements.
Specific responsibilities:
Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
Ingest and integrate massive datasets from multiple data sources, while designing and developing solutions for data integration, data modeling, and data inference and insights.
Design, monitor, and improve development, test, and production infrastructure and automation for data pipelines and data stores.
Troubleshoot and performance-tune data pipelines and processes for data ingestion, merging, and integration across multiple technologies and architectures, including ETL, ELT, API, and SQL (a minimal pipeline sketch follows this list).
Test and compare competing solutions and provide an informed point of view on the best solutions for data ingestion, transformation, storage, retrieval, and insights.
Work within the quality standards, coding standards, and engineering practices in place and maintained by the team.
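As a rough illustration of the batch pipeline work described above, the sketch below loads newline-delimited JSON from Cloud Storage into BigQuery and then runs a SQL transform. It is a minimal sketch, not part of the role description; the project, bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal batch ELT sketch (all names are hypothetical placeholders):
# 1) load newline-delimited JSON from Cloud Storage into a BigQuery staging table,
# 2) run a SQL transform into a curated table.
from google.cloud import bigquery

PROJECT = "example-project"                     # placeholder GCP project
STAGING = f"{PROJECT}.raw.events"               # placeholder staging table
CURATED = f"{PROJECT}.analytics.daily_events"   # placeholder curated table

client = bigquery.Client(project=PROJECT)

# Extract + load: ingest JSON files straight from a GCS bucket.
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    autodetect=True,  # infer the schema from the files
)
client.load_table_from_uri(
    "gs://example-bucket/events/*.json", STAGING, job_config=load_config
).result()  # wait for the load job to finish

# Transform: aggregate the staged rows into a curated table with SQL.
transform_sql = f"""
CREATE OR REPLACE TABLE `{CURATED}` AS
SELECT DATE(event_timestamp) AS event_date,
       event_type,
       COUNT(*) AS event_count
FROM `{STAGING}`
GROUP BY event_date, event_type
"""
client.query(transform_sql).result()
```

In practice a pipeline like this would typically be scheduled and monitored by an orchestrator such as Airflow (Cloud Composer) rather than run as a standalone script.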
Experience and Skills:
3-5 years of data engineering experience required; 3+ years of Google Cloud Platform (GCP) experience desired. Equivalent cloud platform experience considered.
2+ years of coding in Python.
2+ years of experience working with JSON, SQL, document data stores, and relational databases.
Solid understanding of ETL/ELT concepts, data architectures, data modelling, data manipulation languages and techniques, and data query languages and techniques.
Experience working in GCP-based Big Data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Airflow, etc. (see the streaming sketch after this list).
In-depth understanding of Google's data product technology and underlying data architectures. Equivalent cloud platform experience considered.
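As a rough illustration of the real-time ingestion listed above, the sketch below pulls JSON messages from a Pub/Sub subscription and streams them into a BigQuery table. It is a minimal sketch, not part of the role description; the project, subscription, and table names are hypothetical placeholders.

```python
# Minimal streaming-ingest sketch (all names are hypothetical placeholders):
# pull JSON messages from a Pub/Sub subscription and stream them into BigQuery.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT = "example-project"                    # placeholder GCP project
SUBSCRIPTION = "events-sub"                    # placeholder Pub/Sub subscription
TABLE = f"{PROJECT}.raw.streaming_events"      # placeholder BigQuery table

bq_client = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Parse one Pub/Sub message and stream it into BigQuery."""
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE, [row])  # streaming insert
    if errors:
        message.nack()  # let Pub/Sub redeliver the message on failure
    else:
        message.ack()

# Open a streaming pull and block until interrupted.
future = subscriber.subscribe(subscription_path, callback=handle_message)
try:
    future.result()
except KeyboardInterrupt:
    future.cancel()
```

In production this path would more often be implemented as a Dataflow (Apache Beam) streaming job or a Pub/Sub BigQuery subscription, but the sketch shows the end-to-end data flow.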

Experience Required:

Fresher

Vacancy:

2 - 4 Hires
