GCP Data Engineer (BigQuery) - Hucon Solutions India Pvt. Ltd.

  • Company name: Hucon Solutions India Pvt. Ltd.
  • Working location: Pune
  • Job type: Full Time

Experience: 6 - 10 years required

Pay: INR 1800000 - INR 3000000 /year

Type: Full Time

Location: Pune

Skills: Google Cloud Platform (GCP), Data Engineering, BigQuery

About Hucon Solutions India Pvt. Ltd.

Job Description

Job Title: GCP Data Engineer BigQuery

Experience: 6 - 10 years
Salary Range: 15 - 30 LPA
Location: Pune


Job Summary:

We are looking for a skilled GCP Data Engineer with strong hands-on experience in BigQuery to develop and maintain cloud-based data solutions. The ideal candidate will play a key role in creating efficient and scalable data pipelines, ensuring optimal data storage and querying on Google Cloud Platform (GCP).


Key Roles & Responsibilities (R&Rs):

1. Data Pipeline Development

  • Design, develop, and maintain ETL/ELT data pipelines using Google Cloud Platform (GCP) tools.

  • Work extensively with BigQuery to store, manage, and analyze large datasets.

  • Implement data transformation logic using SQL and optimize queries to enhance performance.
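A typical in-warehouse (ELT) transformation step might look like the following. This is an illustrative sketch only: the project, dataset, table, and column names are hypothetical, not part of this role's actual codebase.

```sql
-- Incremental ELT step: merge today's staged rows into a curated table.
-- All identifiers below are hypothetical examples.
MERGE `my_project.curated.orders` AS target
USING (
  SELECT
    order_id,
    customer_id,
    CAST(order_ts AS TIMESTAMP) AS order_ts,
    SAFE_CAST(amount AS NUMERIC) AS amount   -- SAFE_CAST avoids hard failures on bad rows
  FROM `my_project.raw.orders_staging`
  WHERE DATE(order_ts) = CURRENT_DATE()
) AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET amount = source.amount, order_ts = source.order_ts
WHEN NOT MATCHED THEN
  INSERT (order_id, customer_id, order_ts, amount)
  VALUES (source.order_id, source.customer_id, source.order_ts, source.amount);
```

Filtering the source to a single day keeps each run incremental, and `MERGE` makes the load idempotent if the pipeline is retried.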

2. BigQuery Architecture & Optimization

  • Design scalable and optimized data models and schema within BigQuery.

  • Use BigQuery best practices for performance tuning, query optimization, and cost management.

  • Integrate data from different sources, ensuring seamless data flows into BigQuery for real-time analytics.
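The partitioning and clustering practices above can be sketched in DDL. Again, the schema below is a hypothetical example, not a prescribed design:

```sql
-- Hypothetical schema: daily partitioning plus clustering to prune scans and cut cost.
CREATE TABLE IF NOT EXISTS `my_project.curated.orders`
(
  order_id    STRING NOT NULL,
  customer_id STRING,
  order_ts    TIMESTAMP,
  amount      NUMERIC
)
PARTITION BY DATE(order_ts)          -- queries filtered on date scan only matching partitions
CLUSTER BY customer_id               -- co-locates rows for common per-customer filters
OPTIONS (
  require_partition_filter = TRUE    -- rejects unfiltered full-table scans, capping cost
);
```

`require_partition_filter` is a simple cost-management guardrail: any query that omits a partition filter fails fast instead of scanning the whole table.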

3. GCP Service Integration

  • Utilize GCP services like Cloud Storage, Dataflow, Pub/Sub, and Cloud Composer to build and automate data pipelines.

  • Develop solutions that integrate BigQuery with other GCP services and external tools.
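One common integration path, loading files from Cloud Storage into BigQuery, can be done directly in SQL with the `LOAD DATA` statement. The bucket path and table name here are illustrative assumptions:

```sql
-- Hypothetical batch load from Cloud Storage into a staging table.
LOAD DATA INTO `my_project.raw.orders_staging`
FROM FILES (
  format = 'CSV',
  skip_leading_rows = 1,
  uris = ['gs://my-example-bucket/orders/2024-01-01/*.csv']  -- bucket is illustrative
);
```

In production this statement would typically be scheduled and parameterized by an orchestrator such as Cloud Composer rather than run by hand.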

4. Data Quality, Monitoring & Security

  • Ensure data quality by implementing validation checks, error handling, and robust monitoring for data pipelines.

  • Implement data security best practices in GCP, including access control, encryption, and auditing.

  • Work closely with data governance teams to adhere to compliance and regulatory requirements.
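Validation checks like those described above can be expressed directly in BigQuery scripting with `ASSERT`, which aborts the job when a quality rule fails. The table and rule below are hypothetical:

```sql
-- Hypothetical data-quality gate: fail the pipeline run if the curated table
-- contains duplicate primary keys.
ASSERT (
  SELECT COUNT(*)
  FROM (
    SELECT order_id
    FROM `my_project.curated.orders`
    GROUP BY order_id
    HAVING COUNT(*) > 1
  )
) = 0 AS 'Duplicate order_id values found in curated.orders';
```

Placing such assertions between pipeline stages surfaces bad data early, before downstream consumers read it.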

5. Collaboration & Documentation

  • Collaborate with cross-functional teams (data scientists, business analysts, stakeholders) to gather data requirements and provide solutions.

  • Provide technical support and expertise to data teams.

  • Document data pipelines, solutions, and processes for future maintenance and knowledge sharing.


Required Skills & Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

  • 6 - 10 years of experience in data engineering with significant hands-on expertise in BigQuery and GCP.

  • Strong proficiency in SQL and experience with BigQuery optimization techniques such as partitioning and clustering.

  • Experience with GCP services such as Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, and Cloud Monitoring (formerly Stackdriver).

  • Knowledge of data pipeline orchestration and automation.

  • Programming skills in Python or Java for building and maintaining data pipelines.


Preferred Skills (Good to Have):

  • Certifications: Google Professional Data Engineer or Google Cloud Certified Associate Cloud Engineer.

  • Familiarity with CI/CD pipelines and version control tools like Git.

  • Experience with Apache Airflow, Dataform, or similar orchestration tools.

  • Knowledge of data security practices and compliance in GCP environments.

  • Exposure to data visualization tools (e.g., Tableau, Power BI).