Data Engineer (ETL) - JP Morgan Services India Pvt Ltd

  • Company: JP Morgan Services India Pvt Ltd
  • Job type: Full Time

Experience: 2+ years required

Pay: Not disclosed

Type: Full Time

Location: Bengaluru, Karnataka, India

Skills: Python, DevOps, Cloud, problem-solving

Job Description

Software Engineer II - Data Engineer (ETL)

Are you passionate about building scalable systems and solving complex problems? We have the perfect opportunity for you to advance your career in data engineering.


As a Data Engineer at JPMorgan Chase within Corporate Technology, you are part of a dynamic team that enhances and delivers robust ETL processes and data integration solutions. You will work with cutting-edge technologies to transform and cleanse data, ensuring high-quality software delivery.


Job Responsibilities:

  • Develop and maintain robust ETL processes for data integration.
  • Utilize big data technologies like Apache Spark (PySpark) for data processing.
  • Write secure and high-quality code in Python and SQL.
  • Build cloud-native applications using platforms such as AWS, Azure, or GCP (primarily AWS).
  • Leverage cloud services for data storage, processing, and analytics.
  • Implement containerization technologies like Docker and Kubernetes (EKS).
  • Apply object-oriented design and data structure fundamentals.
  • Collaborate effectively with cross-functional teams.
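To give a flavor of the extract-transform-load work described above, here is a minimal sketch of an ETL step in Python and SQL. It is purely illustrative: the table names, columns, and cleansing rules are hypothetical, and the standard-library sqlite3 module stands in for a production data store.

```python
import sqlite3

def extract(conn):
    # Extract raw rows from a hypothetical staging table.
    return conn.execute("SELECT id, name, amount FROM staging").fetchall()

def transform(rows):
    # Cleanse: trim whitespace from names, drop rows with missing amounts.
    return [(i, name.strip(), amount)
            for i, name, amount in rows
            if amount is not None]

def load(conn, rows):
    # Load cleansed rows into the target table.
    conn.executemany("INSERT INTO clean (id, name, amount) VALUES (?, ?, ?)", rows)
    conn.commit()

# Demo with an in-memory database and made-up data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, name TEXT, amount REAL)")
conn.execute("CREATE TABLE clean (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)",
                 [(1, "  Alice ", 10.0), (2, "Bob", None), (3, "Carol", 7.5)])

load(conn, transform(extract(conn)))
print(conn.execute("SELECT COUNT(*) FROM clean").fetchone()[0])  # → 2
```

In a real pipeline the same three stages would typically run as a PySpark job against a data lake or warehouse, with an orchestrator scheduling and retrying them.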

Required Qualifications, Capabilities, and Skills:

  • Formal training or certification in software engineering concepts and 2+ years of applied experience.
  • Strong hands-on experience in developing and maintaining ETL processes.
  • Knowledge of big data technologies such as Apache Spark (PySpark).
  • Proficiency in Python and SQL.
  • Experience with cloud platforms like AWS, Azure, or GCP (primarily AWS).
  • Hands-on experience with Docker and Kubernetes (EKS).
  • Solid understanding of object-oriented design and data structures.
  • Strong collaboration skills.

Preferred Qualifications, Capabilities, and Skills:

  • Experience in using Databricks for big data analytics.
  • Experience with data orchestrator tools like Airflow or Prefect.

Experience Level: Mid Level