Data Engineer - AWS, Snowflake, DBT | Hucon Solutions India Pvt. Ltd.

Experience: 6 - 10 years required

Pay: INR 18,00,000 - 30,00,000 per year

Type: Full Time

Location: Chennai, Kochi

Skills: Snowflake, Data Engineering, DBT, Data Models, AWS, S3, Lambda

Job Description

Job Title: Data Engineer - AWS, Snowflake, DBT

Experience: 6 - 10 years
Salary Range: 15 - 30 LPA
Locations: Chennai, Kochi


Job Summary:

We are looking for an experienced Data Engineer with strong expertise in AWS, Snowflake, and DBT to develop, optimize, and maintain cloud-based data pipelines. This role requires proficiency in building scalable data architectures, ensuring efficient data processing, and integrating modern data transformation practices to deliver high-quality data solutions.


Key Roles & Responsibilities (R&Rs):

1. Data Engineering & Pipeline Development

  • Design and build end-to-end ETL/ELT data pipelines using AWS services, Snowflake, and DBT (a minimal load step is sketched after this list).

  • Integrate diverse data sources (e.g., relational, unstructured) into the Snowflake data warehouse.

  • Ensure high performance, scalability, and reliability of data processing workflows.
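
For illustration only, a minimal sketch of one such load step in Python, assuming the snowflake-connector-python package and an external stage over S3; every object name below is hypothetical:

```python
# Minimal ELT load step: copy staged S3 files into a Snowflake table.
# Account, credentials, stage, and table names are all hypothetical.
import snowflake.connector

def load_raw_orders() -> None:
    # In production, credentials would come from a secrets manager.
    conn = snowflake.connector.connect(
        account="xy12345",
        user="ETL_USER",
        password="***",
        warehouse="LOAD_WH",
        database="RAW",
        schema="SALES",
    )
    try:
        # COPY INTO pulls new files from an external stage over S3;
        # Snowflake tracks already-loaded files, so reruns are idempotent.
        conn.cursor().execute("""
            COPY INTO raw.sales.orders
            FROM @raw.sales.orders_stage
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
    finally:
        conn.close()

if __name__ == "__main__":
    load_raw_orders()
```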

2. Snowflake Data Modeling & Optimization

  • Design and implement Snowflake data models, tables, views, and schemas.

  • Leverage Snowflake features such as Streams, Tasks, Time Travel, and Zero-Copy Cloning for efficient data handling (illustrated after this list).

  • Optimize Snowflake performance (queries, storage, and compute resources) and monitor for cost-efficiency.
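
A hedged illustration of these four features, issued as SQL through an open Python-connector cursor; all object names (orders, LOAD_WH, dev_db, prod_db) are hypothetical:

```python
# Each statement below exercises one of the Snowflake features named above.
FEATURE_EXAMPLES = [
    # Stream: captures row-level changes on a table for incremental loads.
    "CREATE OR REPLACE STREAM orders_changes ON TABLE raw.sales.orders",
    # Task: schedules SQL inside Snowflake (created suspended; resume to run).
    """CREATE OR REPLACE TASK merge_orders
         WAREHOUSE = LOAD_WH
         SCHEDULE = '15 MINUTE'
       AS
         INSERT INTO analytics.sales.orders_latest
           SELECT * FROM orders_changes""",
    # Time Travel: query the table as it existed one hour ago.
    "SELECT COUNT(*) FROM raw.sales.orders AT(OFFSET => -3600)",
    # Zero-Copy Clone: instant, storage-free copy for dev or testing.
    "CREATE DATABASE dev_db CLONE prod_db",
]

def run_feature_examples(cursor) -> None:
    for stmt in FEATURE_EXAMPLES:
        cursor.execute(stmt)
```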

3. Data Transformation with DBT

  • Develop data transformation models using DBT (Data Build Tool) to create clean, reliable datasets for analytics.

  • Implement version control, testing, and documentation best practices with DBT (a programmatic run/test invocation is sketched after this list).

  • Collaborate with data analysts and scientists to ensure seamless data access and consistency.
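
One possible way to orchestrate this from Python is dbt's programmatic runner, available in dbt-core 1.5 and later; the "staging+" model selector below is an assumption for illustration, not part of any actual project:

```python
# Invoke dbt from Python using its programmatic runner (dbt-core 1.5+).
# `dbt build` materializes the selected models and runs their declared
# schema tests (unique, not_null, ...) in one pass.
from dbt.cli.main import dbtRunner

def run_transformations() -> None:
    result = dbtRunner().invoke(["build", "--select", "staging+"])
    if not result.success:
        raise RuntimeError("dbt build failed; check logs for failing models or tests")

if __name__ == "__main__":
    run_transformations()
```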

4. Cloud Integration & Automation

  • Work with AWS services like S3, Redshift, Lambda, and Glue to facilitate data ingestion, processing, and orchestration.

  • Automate data workflows and ensure data pipelines are reliable, scalable, and resilient to failures (see the Lambda sketch after this list).

  • Implement data monitoring, logging, and alerting to ensure pipeline reliability.
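
A minimal sketch of that pattern, assuming an S3 object-created trigger on the Lambda and a hypothetical SNS topic for alerting; the downstream loader is left as a stub:

```python
# AWS Lambda handler for S3 object-created events: logs each new file,
# hands it to a downstream loader, and publishes failures to an SNS topic
# so alerts reach the on-call channel.
import json
import logging
import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)
sns = boto3.client("sns")
ALERT_TOPIC_ARN = "arn:aws:sns:ap-south-1:123456789012:pipeline-alerts"  # hypothetical

def load_to_snowflake(bucket: str, key: str) -> None:
    """Stub: a real pipeline would run the COPY INTO step shown earlier."""

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        try:
            logger.info("Ingesting s3://%s/%s", bucket, key)
            load_to_snowflake(bucket, key)
        except Exception:
            logger.exception("Ingestion failed for s3://%s/%s", bucket, key)
            sns.publish(
                TopicArn=ALERT_TOPIC_ARN,
                Subject="Data pipeline ingestion failure",
                Message=json.dumps({"bucket": bucket, "key": key}),
            )
            raise
    return {"status": "ok"}
```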

5. Collaboration & Documentation

  • Collaborate with cross-functional teams to understand data requirements and design solutions.

  • Document data models, transformation logic, and technical specifications for internal use.

  • Troubleshoot issues and provide performance optimization recommendations grounded in query history (sketched after this list).
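
As one concrete example of such troubleshooting (an illustration, not a prescribed workflow), the snippet below surfaces the slowest recent queries via Snowflake's QUERY_HISTORY table function so tuning recommendations start from measured runtimes; it assumes an open Python-connector connection, and the thresholds are arbitrary:

```python
SLOW_QUERY_SQL = """
    SELECT query_id,
           query_text,
           total_elapsed_time / 1000 AS seconds
    FROM TABLE(information_schema.query_history(result_limit => 1000))
    WHERE total_elapsed_time > 60000  -- longer than one minute (ms)
    ORDER BY total_elapsed_time DESC
    LIMIT 20
"""

def slowest_queries(conn):
    # Returns (query_id, query_text, seconds) rows for the slowest queries.
    cur = conn.cursor()
    cur.execute(SLOW_QUERY_SQL)
    return cur.fetchall()
```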


Required Skills & Qualifications:

  • Bachelor's/Master's degree in Computer Science, Engineering, or a related field.

  • 6-10 years of experience in data engineering with strong hands-on experience in AWS, Snowflake, and DBT.

  • Solid understanding of Snowflake architecture and performance optimization.

  • Proficiency in DBT for building, testing, and maintaining data transformation workflows.

  • Strong experience with AWS services like S3, Lambda, Redshift, and Glue.

  • Advanced SQL skills and experience in query optimization.

  • Programming knowledge in Python or Java.


Preferred Skills (Good to Have):

  • Certifications: AWS Certified Solutions Architect, Snowflake SnowPro, or DBT Fundamentals.

  • Experience with CI/CD for data pipelines (using tools like Git, Jenkins, Terraform).

  • Familiarity with data visualization tools (e.g., Power BI, Tableau).

  • Knowledge of data governance and security practices in cloud environments.