PySpark Developers Capgemini Engineering
Experience: 2 years required
Pay: Salary information not included
Type: Full Time
Location: Haryana
Skills: Python, Big Data, Distributed Computing, SQL, Data Engineering, Apache Spark, PySpark, ETL Processes
Job Description
Design and implement systems using PySpark, including database migration, transformation, and integration solutions for data warehousing projects.

Requirements:
- Experience in Apache Spark and Python programming.
- Experience developing data processing tasks using PySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations.
- Experience building APIs for provisioning data to downstream systems by leveraging different frameworks.
- Experience with Jupyter Notebook, Zeppelin, or PyCharm IDEs.