Snowflake Engineer

This role is for one of Weekday's clients.

Salary range: Rs 10,00,000 - Rs 40,00,000 (i.e., INR 10-40 LPA)

Min Experience: 8 years

Location: Bangalore, Pune, Chennai, Kolkata, Gurgaon

Job Type: Full-time

We are seeking an experienced Snowflake Engineer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data from multiple sources, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Requirements

Key Responsibilities

  • Data Pipeline Development & Integration: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
  • SQL Query Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
  • Data Modeling & ELT Implementation: Implement advanced data modeling techniques (e.g., SCD Type-2) using DBT. Design and optimize high-performance data architectures in Snowflake.
  • Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
  • Troubleshooting & Data Quality: Perform root cause analysis of data issues, ensure effective resolution, and maintain high data quality standards.
  • Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications

  • Snowflake expertise for data warehousing and ELT processes.
  • Strong proficiency in SQL for relational databases and writing complex queries.
  • Experience with Informatica PowerCenter for data integration and ETL development.
  • Familiarity with Power BI for data visualization and business intelligence reporting.
  • Experience with Fivetran for automated ELT pipelines.
  • Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
  • Strong data analysis, requirement gathering, and mapping skills.
  • Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
  • Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
  • Proficiency in Python for data processing (knowledge of Java or Scala is a plus).

Education

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills

  • Snowflake Cloud
  • Snowflake
  • Snowpipe
  • SQL
  • Informatica
  • dbt (Data Build Tool)

