PySpark Developer - Capgemini
Capgemini
Experience: 3 years required
Pay: Salary information not included
Type: Full Time
Location: Noida
Skills: SQL, Spark, Apache Hadoop, PySpark
Job Description
PySpark + SQL

Proficient in leveraging Spark for distributed data processing and transformation. Skilled in optimizing data pipelines for efficiency and scalability. Experience with real-time data processing and integration. Familiarity with Apache Hadoop ecosystem components. Strong problem-solving abilities in handling large-scale datasets. Ability to collaborate with cross-functional teams and communicate effectively with stakeholders.

Primary Skills: PySpark, SQL
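For context, below is a minimal sketch of the kind of PySpark + SQL work the role describes: a distributed transformation over a large dataset expressed with Spark SQL. The dataset path, view name, and column names are hypothetical and are not part of this posting.

from pyspark.sql import SparkSession

# Start (or reuse) a Spark session; the application name is illustrative.
spark = SparkSession.builder.appName("events-daily-rollup").getOrCreate()

# Read a hypothetical partitioned Parquet dataset of raw events.
events = spark.read.parquet("/data/events")

# Register the DataFrame as a temporary view so it can be queried with SQL.
events.createOrReplaceTempView("events")

# Aggregate events per day and type using Spark SQL.
daily_rollup = spark.sql("""
    SELECT event_date,
           event_type,
           COUNT(*)            AS event_count,
           COUNT(DISTINCT uid) AS unique_users
    FROM events
    GROUP BY event_date, event_type
""")

# Write the result back out, partitioned by date for downstream consumers.
daily_rollup.write.mode("overwrite").partitionBy("event_date").parquet("/data/rollups/daily")

spark.stop()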