Data Engineer - Snowflake (AWS), PySpark
Hucon Solutions India Pvt. Ltd.
Experience: 6 - 8 years required
Pay: INR 18,00,000 - INR 30,00,000 per year
Type: Full Time
Location: Bangalore
Skills: AWS, Data Engineering, Snowflake, ETL, ELT, PySpark
Job Description
Job Title: Data Engineer - Snowflake (AWS), PySpark
Experience: 6 - 8 years
Salary Range: 15 - 30 LPA
Locations: Bangalore, Chennai, Hyderabad
Job Summary:
We are looking for a highly skilled Data Engineer with strong expertise in Snowflake on AWS and PySpark. The ideal candidate will be responsible for building robust, scalable data pipelines and enabling advanced analytics through efficient cloud-based data solutions.
Key Roles & Responsibilities (R&Rs):
1. Data Pipeline Development
- Design and implement efficient ETL/ELT pipelines using PySpark (a minimal sketch follows this list).
- Integrate structured and unstructured data into Snowflake hosted on AWS.
- Optimize pipeline performance for large-scale data processing.
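For illustration only, here is a minimal sketch of the kind of ETL pipeline this role covers, assuming PySpark with the Spark-Snowflake connector. The bucket, table, and connection values are placeholders, not actual project settings.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-to-snowflake-etl").getOrCreate()

# Extract: semi-structured event data landing in S3 (path is illustrative)
raw = spark.read.json("s3a://example-landing-bucket/events/")

# Transform: basic cleansing, date derivation, and de-duplication
events = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: write to Snowflake via the Spark-Snowflake connector; in practice
# these options would come from a secrets manager, never hard-coded
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "ETL_USER",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "RAW",
    "sfWarehouse": "LOAD_WH",
}

(events.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "EVENTS")
    .mode("append")
    .save())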
2. Snowflake Development & Administration
- Design and develop scalable data models in Snowflake.
- Write and optimize SQL queries, manage virtual warehouses, and handle Snowflake security roles.
- Implement Snowflake-specific features such as Snowpipe, Streams, Tasks, and Time Travel (see the sketch after this list).
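As a hedged illustration of the Snowflake features named above, the sketch below uses the snowflake-connector-python package to create a Stream, schedule a Task over it, and run a Time Travel query. All object names, schedules, and credentials are hypothetical.

import snowflake.connector

# Connection values are placeholders; key-pair auth or a secrets manager
# would be used in a real deployment
conn = snowflake.connector.connect(
    account="example_account",
    user="DEV_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
)
cur = conn.cursor()

# Stream: captures row-level changes on the landing table
cur.execute("CREATE OR REPLACE STREAM RAW.EVENTS_STREAM ON TABLE RAW.EVENTS")

# Task: drains the stream on a schedule, but only when new rows exist
cur.execute("""
    CREATE OR REPLACE TASK RAW.MERGE_EVENTS
      WAREHOUSE = LOAD_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.EVENTS_STREAM')
    AS
      INSERT INTO CURATED.EVENTS
      SELECT EVENT_ID, EVENT_TYPE, EVENT_TS FROM RAW.EVENTS_STREAM
""")
cur.execute("ALTER TASK RAW.MERGE_EVENTS RESUME")  # tasks start suspended

# Time Travel: query the table as it stood one hour ago
cur.execute("SELECT COUNT(*) FROM RAW.EVENTS AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()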
3. AWS Ecosystem Integration
- Work with AWS services such as S3, Lambda, Glue, Redshift, and IAM.
- Ensure smooth data ingestion from AWS sources into Snowflake (an ingestion sketch follows this list).
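The following is a minimal sketch of batch S3-to-Snowflake ingestion, assuming boto3 on the AWS side and an external stage (S3_LANDING_STAGE, a hypothetical name) already configured over the same bucket.

import boto3
import snowflake.connector

# Land a file in S3 (bucket and key are illustrative; upstream producers
# or Glue/Lambda jobs would normally deliver these files)
s3 = boto3.client("s3")
s3.upload_file("orders_2024-06-01.csv", "example-landing-bucket",
               "orders/orders_2024-06-01.csv")

# Bulk-load the new files through the external stage
conn = snowflake.connector.connect(
    account="example_account",
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
conn.cursor().execute("""
    COPY INTO RAW.ORDERS
    FROM @RAW.S3_LANDING_STAGE/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
conn.close()

For continuous rather than batch ingestion, the same stage can back a Snowpipe with AUTO_INGEST = TRUE, driven by S3 event notifications.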
4. Collaboration & Support
- Partner with data analysts, data scientists, and business users to understand data needs.
- Ensure data quality, reliability, and accessibility for all stakeholders.
- Document technical designs and support knowledge transfer within the team.
Required Skills & Qualifications:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 6 - 8 years of experience in data engineering.
- Hands-on experience with Snowflake on AWS, including architecture, SQL, and performance tuning.
- Strong programming skills in PySpark and Python.
- Experience building and maintaining big data pipelines.
- Proficiency in SQL and query performance optimization.
Preferred Skills (Good to Have):
- Knowledge of data governance, data security, and data cataloging tools.
- Familiarity with CI/CD and infrastructure tooling (e.g., Git, Jenkins, Terraform).
- Certifications: Snowflake SnowPro, AWS Certified Data Analytics - Specialty.