Data Engineer - Python Hucon Solutions India Pvt.Ltd.
Experience: 6 - 10 years required
Pay: INR 1800000 - INR 3000000 /year
Type: Full Time
Location: Bangalore, Chennai
Skills: Data Engineering, Python, PySpark
Job Description
Job Title: Data Engineer - Python
Experience: 6 - 10 years
Salary Range: 15-30 LPA
Locations: Chennai, Bangalore
Job Summary:
We are seeking an experienced Data Engineer with strong expertise in Python programming and data engineering best practices. The ideal candidate will be responsible for building robust, scalable data pipelines, optimizing data workflows, and enabling efficient data storage, transformation, and processing in both cloud and hybrid environments.
Key Roles & Responsibilities (R&Rs):
1. Data Pipeline Development
- Design and develop scalable and reliable ETL/ELT pipelines using Python.
- Integrate data from various sources (structured, semi-structured, unstructured) into centralized storage systems.
- Automate data workflows and implement data orchestration solutions.
2. Data Processing & Transformation
- Develop complex data transformation logic using Python (Pandas, PySpark, etc.).
- Work with both batch and real-time data processing frameworks.
- Ensure data accuracy, consistency, and integrity throughout the pipeline.
3. Performance Tuning & Optimization
- Monitor and optimize data workflows for performance and cost-efficiency.
- Implement logging, error handling, and monitoring mechanisms.
- Troubleshoot and resolve issues in data pipelines and systems.
4. Collaboration & Delivery
- Work closely with data scientists, analysts, and business teams to deliver data for analytics and machine learning use cases.
- Translate business requirements into technical data workflows.
- Ensure proper documentation of data models, workflows, and processes.
Required Skills & Qualifications:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 6 - 10 years of experience in data engineering, with strong expertise in Python.
- Experience with data processing frameworks (e.g., PySpark, Pandas, Dask).
- Hands-on experience with cloud platforms (AWS, GCP, or Azure).
- Proficiency with databases (SQL, NoSQL) and data modeling.
- Experience with data orchestration tools such as Airflow, Luigi, or Prefect.
- Familiarity with data storage technologies (e.g., S3, HDFS, Delta Lake).
Preferred Skills (Good to Have):
- Experience with containerization and DevOps tools (Docker, Kubernetes, CI/CD pipelines).
- Familiarity with data quality frameworks, data cataloging, and data lineage tools.
- Exposure to machine learning pipelines or MLOps practices.
- Certifications in cloud platforms (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer).