PySpark Developers Job at Techno Wise
PySpark Developers
- Pune, Pune Division, Maharashtra
- Not Disclosed
- Full-time
Roles and Responsibilities
- Ability to design, build, and unit test applications in Spark/PySpark
- In-depth knowledge of Hadoop, Spark, and similar frameworks
- Ability to understand existing ETL logic and convert it into Spark/PySpark (see the illustrative sketch below)
- Solid implementation experience with OOP concepts
- Knowledge of Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs
- Experience processing large volumes of structured and unstructured data, including integrating data from multiple sources
- Experience working with Bitbucket and a CI/CD process
- Knowledge of agile methodology for project delivery
- Good communication skills

Skills
- Minimum 2 years of extensive experience in the design, build, and deployment of PySpark-based applications
- Expertise in handling complex, large-scale Big Data environments
- Minimum 2 years of experience with Hive, YARN, and HDFS
- Experience working with ETL products, e.g. Ab Initio, Informatica, DataStage, etc.
- Hands-on experience writing complex SQL queries and exporting/importing large volumes of data using utilities
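For context only, the sketch below (not part of the original posting) illustrates the kind of PySpark work the role describes: reading source data, applying a simple aggregation that stands in for migrated ETL logic, and writing the result. The paths, column names, and the transform itself are hypothetical placeholders.

# Minimal, illustrative PySpark ETL-style job (assumptions: hypothetical
# HDFS paths and an "orders" dataset with status, customer_id, amount columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def build_spark(app_name: str = "etl-example") -> SparkSession:
    """Create (or reuse) a SparkSession for the job."""
    return SparkSession.builder.appName(app_name).getOrCreate()


def transform(orders_df):
    """Example transform: keep completed orders and total revenue per customer."""
    return (
        orders_df
        .filter(F.col("status") == "COMPLETED")
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("total_amount"))
    )


if __name__ == "__main__":
    spark = build_spark()
    # Hypothetical input/output locations; a real job would take these as arguments.
    orders = spark.read.parquet("hdfs:///data/raw/orders")
    result = transform(orders)
    result.write.mode("overwrite").parquet("hdfs:///data/curated/customer_totals")
    spark.stop()

Keeping the transform as a pure function over DataFrames, as above, is also what makes the unit-testing requirement practical: the function can be exercised with small in-memory DataFrames in a test suite without touching HDFS.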

