Data Engineer

This role is for one of Weekday's clients.

Salary range: Rs 6,00,000 - Rs 17,00,000 (i.e., INR 6-17 LPA)

Min Experience: 3 years

Location: Bangalore, Chennai, Pune, Kolkata, Gurugram

Job Type: Full-time

Experience: 6+ years in IT with 3+ years in Data Warehouse/ETL projects

Requirements

Primary Responsibilities:

  • Design and develop modern data warehouse solutions using Snowflake, Databricks, and Azure Data Factory (ADF).
  • Deliver forward-looking data engineering and analytics solutions that scale with business needs.
  • Work with DW/BI leads to gather and implement requirements for new ETL pipelines.
  • Troubleshoot and resolve issues in existing pipelines, identifying root causes and implementing fixes.
  • Partner with business stakeholders to understand reporting requirements and build corresponding data models.
  • Provide technical mentorship to junior team members and assist with issue resolution.
  • Engage in technical discussions with client architects and team members to align on best practices.
  • Orchestrate data workflows using scheduling tools like Apache Airflow.

Qualifications:

  • Bachelor's or Master's degree in Computer Science or a related field.
  • Expertise in Snowflake, including security, SQL, and object design/implementation.
  • Proficient with Snowflake tools such as SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
  • Strong understanding of Star and Snowflake schema modeling.
  • Deep knowledge of data management principles and data warehousing.
  • Experience with Databricks and a solid grasp of Delta Lake architecture.
  • Hands-on experience with SQL and Spark (preferably PySpark).
  • Experience developing ETL processes and transformations for data warehousing solutions.
  • Familiarity with NoSQL and open-source databases such as MongoDB, Cassandra, or Neo4j.
  • Exposure to structured and unstructured data, including imaging and geospatial formats.
  • Proficient in DevOps tools and practices, including Terraform, CircleCI, and Git.
  • Strong background in RDBMS, PL/SQL, Unix Shell Scripting, and query performance tuning.
  • Databricks Certified Data Engineer Associate/Professional certification is a plus.
  • Ability to thrive in a fast-paced, dynamic environment managing multiple projects.
  • Experience working within Agile development frameworks.
  • Excellent communication, analytical, and problem-solving skills with strong attention to detail.

Mandatory Skills:
Snowflake, Azure Data Factory, PySpark, Databricks, SQL, Python


