Azure Databricks + PySpark Job at Cognizant

Job Summary

1. Hands-on experience with Microsoft Azure Databricks and Hadoop-ecosystem components such as DBFS, Parquet, Delta tables, HDFS, MapReduce programming, Kafka, Spark, and Event Hubs.
2. In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib.
3. Hands-on experience with Scala and Python.
4. Hands-on experience in the analysis, design, coding, and testing phases of the SDLC, applying best practices.
5. Expertise in using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs.
6. Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark SQL/Scala.
7. Experience migrating code from traditional data-warehouse environments to Apache Spark and Scala using Spark SQL and RDDs.
8. Experience transferring data from RDBMS/Blob Storage/ADLS to Databricks using ADF.
9. Experience with Azure SQL Database (PaaS) or Azure SQL Data Warehouse.
10. Experience orchestrating end-to-end data loads using Azure Data Factory V2 / Logic Apps.
11. Experience in Cosmos DB.
12. Good knowledge of SQL queries and stored procedures.

Experience Required:

Fresher

Vacancy:

2 - 4 Hires
