Job Description:
Responsibilities and Duties
- Lead the design and implementation of Snowflake-based data architectures: schemas, Data Vault, lakehouse, and star-schema models, materialized views, and zero-copy cloning patterns for environment provisioning.
- Build and maintain production ETL/ELT pipelines into Snowflake using Snowpipe, Snowpark, and Streams & Tasks, plus partner tools (StreamSets, dbt, Fivetran, Matillion, Airbyte, etc.); a minimal Streams & Tasks sketch follows this list.
- Develop Snowflake-native utilities and apps (Snowpark for Python, UDFs, external functions, and internal tools) to accelerate developer productivity and data product delivery.
- Optimize query performance and cost through clustering keys, micro-partition pruning, resource monitors, warehouse sizing, and workload isolation; see the cost-control sketch after this list.
- Implement data governance, security, and access controls in Snowflake using role-based access control (RBAC), masking policies, object tagging, data lineage, and audit logging; see the masking-policy sketch after this list.
- Automate infrastructure and deployments: infrastructure as code (IaC) for Snowflake objects and cloud resources, CI/CD pipelines, and automated testing for SQL and Snowpark code.
- Build observability and operational tooling for Snowflake workloads: monitoring, alerting, usage/cost reporting, and incident playbooks.
- Mentor engineers, review designs, and contribute to roadmap decisions for the evolution of the Snowflake platform.
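
The bullets above name concrete Snowflake mechanics; the sketches below illustrate a few of them. They are illustrative only: every object name, credential, and schedule is a hypothetical placeholder, not part of an existing codebase. First, a minimal Streams & Tasks incremental-load pattern, issued through a Snowpark for Python session:

```python
from snowflake.snowpark import Session

# Placeholder credentials; swap in real connection parameters.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "LOAD_WH", "database": "ANALYTICS", "schema": "STAGING",
}).create()

# Capture row-level changes on the hypothetical landing table.
session.sql("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS").collect()

# A task that merges captured changes into the curated table on a schedule.
session.sql("""
    CREATE OR REPLACE TASK MERGE_ORDERS_TASK
      WAREHOUSE = LOAD_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO CURATED.ORDERS AS tgt
      USING ORDERS_STREAM AS src
        ON tgt.ORDER_ID = src.ORDER_ID
      WHEN MATCHED THEN UPDATE SET tgt.STATUS = src.STATUS
      WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS)
        VALUES (src.ORDER_ID, src.STATUS)
""").collect()

# Tasks are created suspended; resume to start the schedule.
session.sql("ALTER TASK MERGE_ORDERS_TASK RESUME").collect()
```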
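Next, the cost-control levers from the performance bullet: a clustering key plus a resource monitor capping daily credit spend. Names are again hypothetical, and resource monitors require elevated privileges (typically ACCOUNTADMIN):

```python
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
}).create()  # placeholder credentials, as in the previous sketch

# Cluster a large table on its most common filter columns to improve pruning.
session.sql("ALTER TABLE ANALYTICS.CURATED.EVENTS CLUSTER BY (EVENT_DATE, TENANT_ID)").collect()

# Cap daily credit consumption: notify at 80%, suspend the warehouse at 100%.
session.sql("""
    CREATE OR REPLACE RESOURCE MONITOR DAILY_CAP
      WITH CREDIT_QUOTA = 100
      FREQUENCY = DAILY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND
""").collect()
session.sql("ALTER WAREHOUSE ANALYTICS_WH SET RESOURCE_MONITOR = DAILY_CAP").collect()
```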
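Finally, the governance bullet in miniature: a dynamic masking policy bound to a column, plus an object tag (role, table, and tag names are hypothetical):

```python
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
}).create()  # placeholder credentials, as in the previous sketches

# Mask email addresses for every role except the hypothetical PII_READER.
session.sql("""
    CREATE OR REPLACE MASKING POLICY EMAIL_MASK AS (val STRING)
      RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
        ELSE REGEXP_REPLACE(val, '.+@', '*****@')
      END
""").collect()
session.sql("ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK").collect()

# Tag the table so governance tooling can report on PII exposure.
session.sql("CREATE TAG IF NOT EXISTS PII_LEVEL").collect()
session.sql("ALTER TABLE CUSTOMERS SET TAG PII_LEVEL = 'high'").collect()
```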
Required Skills and Experience
- Strong hands-on experience designing and operating Snowflake in production.
- Deep experience with Snowflake features such as Snowpark, Streams & Tasks, Snowpipe, Time Travel, zero-copy cloning, materialized views, external functions, and user-defined functions; a cloning/Time Travel sketch follows this list.
- Hands-on ETL/ELT development experience with dbt, SQL, and one or more ingestion tools (StreamSets, Fivetran, Matillion, Airbyte, Kafka connectors).
- Proficient in Python (Snowpark and the Snowflake connector) and in SQL tuning and query-optimization techniques; see the Snowpark DataFrame sketch after this list.
- Experience with IaC and automation (Terraform, GitHub Actions, Jenkins, or equivalent).
- Strong knowledge of cloud platforms and their native services (AWS, Azure, or GCP) as they relate to Snowflake deployment and integrations.
- Solid understanding of medallion architecture, data modeling patterns, data governance, and secure data sharing.
- Demonstrated ability to implement CI/CD, automated testing, and production operational practices for data workloads.
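
Two of the skills above lend themselves to quick illustration. First, zero-copy cloning plus Time Travel; a minimal sketch, assuming the same hypothetical ANALYTICS database as the earlier examples:

```python
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
}).create()  # placeholder credentials

# Zero-copy clone: a full dev environment with no duplicated storage.
session.sql("CREATE DATABASE DEV_ANALYTICS CLONE ANALYTICS").collect()

# Time Travel: query the table as it existed one hour (3600 s) ago.
rows = session.sql("""
    SELECT COUNT(*) FROM ANALYTICS.CURATED.ORDERS AT(OFFSET => -3600)
""").collect()
print(rows[0][0])
```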
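Second, a Snowpark DataFrame transformation that pushes all computation down to Snowflake rather than pulling data into Python (table and column names hypothetical):

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
}).create()  # placeholder credentials

# Aggregate completed orders into a daily revenue mart; Snowpark compiles
# this pipeline into a single SQL statement executed in the warehouse.
orders = session.table("ANALYTICS.CURATED.ORDERS")
daily = (
    orders.filter(col("STATUS") == "COMPLETE")
          .group_by(col("ORDER_DATE"))
          .agg(sum_(col("AMOUNT")).alias("REVENUE"))
)
daily.write.mode("overwrite").save_as_table("ANALYTICS.MARTS.DAILY_REVENUE")
```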
Preferred Qualifications
- Snowflake SnowPro Core or SnowPro Advanced certifications.
- Experience with dbt (Core or Cloud) for transformation and modular SQL engineering.
- Experience with data virtualization, data catalogs or data lineage tools.
- Familiarity with analytics and BI integrations (Looker, Tableau, Power BI) and building Snowflake-optimized semantic layers.
- Experience building internal developer tools or data apps using Snowpark or lightweight web frameworks.