Sr. Data Engineer (ETL) - UNIMORPH CONSULTING LLP

  • Company name: UNIMORPH CONSULTING LLP
  • Working location: Office
  • Job type: Full Time

Experience: 8 - 12 years required

Pay: INR 2000000 - INR 2800000 /year

Type: Full Time

Location: Chennai

Skills: ETL, Data Integration, Python, Data Modeling, Azure, Data Warehousing, Data Vault, Snowflake, Data Engineering, Data Pipelines

Job Description

As a hiring partner for many IT organizations, we are hiring for a product MNC's global Analytics division. This is a direct, full-time placement with the hiring organization.

Interested candidates can share their resume in Word format, along with current CTC and notice period, to info@unimorphtech.com.

Role: Sr. Data Engineer ETL - Manager
Experience: 8 - 12 years
Location: Chennai

# Key Skills:
ETL, data ingestion, data pipeline development, data modeling, data migration, data vault, DWH, Azure Data Factory, SQL, Python, Snowflake, data conversion, data profiling, data governance, and data analysis.


# Highlights:

  •  Support analytics and reporting through data ingestion and pipelines.
  •  Design, develop, and build data ingestion pipeline solutions in collaboration with other teams.
  •  Experience with Azure Data Factory (ADF) and other Azure data services.
  •  Experience in data modeling, data warehousing, data ingestion, data conversion, data profiling, data governance, and data analysis.
  •  Experience in ETL and data pipeline development.
  •  Solid SQL, Python, ETL, data modeling (including Data Vault 2.0), and data migration/programming skills (see the sketch after this list).
  •  Experience with and a strong understanding of Snowflake architecture and design.
  •  Experience using modern approaches to automating data pipelines.
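
For illustration only (not part of the role requirements): since the bullets above mention Data Vault 2.0 modeling alongside SQL and Python, the minimal sketch below shows the kind of deterministic hash-key generation typically used when loading a Data Vault hub. The hub name, column names, and business key (HUB_CUSTOMER_HK, CUSTOMER_ID, etc.) are hypothetical.

```python
import hashlib
from datetime import datetime, timezone


def hub_hash_key(business_key: str) -> str:
    """Compute a deterministic hash key for a Data Vault hub.

    The business key is trimmed and upper-cased before hashing so the
    same customer number always maps to the same hub key.
    """
    normalized = business_key.strip().upper()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def build_hub_row(business_key: str, record_source: str) -> dict:
    """Assemble one HUB_CUSTOMER row (hypothetical table/column names)."""
    return {
        "HUB_CUSTOMER_HK": hub_hash_key(business_key),
        "CUSTOMER_ID": business_key.strip().upper(),
        "LOAD_DTS": datetime.now(timezone.utc).isoformat(),
        "RECORD_SOURCE": record_source,
    }


if __name__ == "__main__":
    # The same business key (despite stray whitespace/case) yields one hub key.
    print(build_hub_row(" c-1001 ", "CRM"))
    print(build_hub_row("C-1001", "BILLING"))
```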


# Roles and Responsibilities:

  • Work with stakeholders to define the solution and data warehouse (DWH) design (based on Snowflake) so that it adheres to industry best practices and the specific needs of BI tools.
  • Design and develop data ingestion pipelines and frameworks to support enterprise analytical/reporting needs (a minimal ingestion sketch follows this list).
  • Provide guidance and collaborate on data ingestion design with data modeling, ML engineering, and data quality teams.
  • Manage and configure the Azure Data Factory environments and related MS Azure services.
  • Implement improvements and new tools and approaches aimed at data integration, storage, modeling, profiling, processing, management, and archival.
  • Recommend improvements and platform development to meet strategic objectives of Global Technology and the business.
  • Utilize Agile practices to manage and deliver features.
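
As a minimal, illustrative sketch of one ingestion step of the kind described above (not the employer's actual pipeline), the snippet below uses the snowflake-connector-python package to stage a local file and COPY it into a target table. The account, credentials, warehouse, stage, and table names are placeholders, and a production pipeline would typically be orchestrated by Azure Data Factory or a similar scheduler rather than run by hand.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python


def load_file_to_snowflake(local_path: str) -> None:
    """Stage a local CSV and bulk-load it into a landing table (placeholder names)."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # placeholder warehouse
        database="RAW",            # placeholder database
        schema="SALES",            # placeholder schema
    )
    try:
        cur = conn.cursor()
        # Upload the file to an internal named stage (assumed to exist already).
        cur.execute(f"PUT file://{local_path} @RAW_STAGE AUTO_COMPRESS=TRUE")
        # Bulk-load the staged file into the landing table.
        cur.execute(
            "COPY INTO ORDERS_RAW "
            "FROM @RAW_STAGE "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) "
            "ON_ERROR = 'ABORT_STATEMENT'"
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_file_to_snowflake("/tmp/orders_2024_06_01.csv")
```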

# Experience:

  • Bachelor's degree in Computer Science, Engineering, or a related technical field preferred.
  • Minimum 3-5 years of relevant experience.
  • Proven experience in data engineering and enterprise data warehouse development.
  • Execution of data ingestion, data conversion, data profiling, data governance and data analysis initiatives.
  • Excellent interpersonal, oral, and written communication skills; able to relate ideas and concepts to others and to write reports, business correspondence, project plans, and procedure documents.
  • Profound understanding of data concepts, data modeling methods and experience in data management capabilities including data definitions, data quality management, and data integration.
  • Solid SQL, Python, ETL, data modeling (including Data Vault 2.0), and data migration/programming skills.
  • Experience with and a strong understanding of Snowflake architecture and design.
  • Experience using modern approaches to automating data pipelines.
  • Experience with Agile and Waterfall methodologies.
  • Ability to work independently and manage multiple task assignments within a structured implementation methodology.
  • Personally invested in continuous improvement and innovation.
  • Motivated, self-directed individual who works well with minimal supervision.
  • Must have experience working across multiple teams and technologies.
  • Preferred but not essential: experience with a business intelligence tool (preferably Power BI).
  • Preferred but not essential: programming experience (C++, JavaScript).
  • Preferred but not essential: experience with dbt (data build tool).