GCP Data Engineer (Snowflake)
J&M Group
Date: 5 hours ago
City: Toronto, ON
Contract type: Contractor

Location: Remote / Canada
Role: Contract (12+ Months)
Mandatory: Hands-on experience with Snowflake Data Clean Room.
Key Responsibilities
- Design, develop, and optimize scalable ETL pipelines and workflows using GCP (BigQuery, Dataflow, Dataproc, Pub/Sub).
- Integrate and manage secure, high-performance data flows between Snowflake and BigQuery.
- Write, test, and maintain Python code for ETL, analytics, and automation tasks.
- Use Git for version control, code reviews, and collaborative project management.
- Implement Infrastructure-as-Code (IaC) using Pulumi to manage and automate GCP resources.
- Apply data clean room techniques to maintain secure data sharing environments compliant with privacy standards.
- Collaborate with data scientists, analysts, and product teams to deliver robust data solutions.
- Optimize batch and streaming data pipelines for performance, reliability, and scalability.
- Maintain thorough documentation for processes, data flows, and configurations.
Required Skills & Qualifications
- Strong hands-on experience with GCP data services: BigQuery, Dataflow, Dataproc, Pub/Sub.
- Proficient in Python for data engineering tasks.
- Deep expertise in Snowflake: data modeling, secure data sharing, and query optimization.
- Experienced with Git for source code management.
- Proven experience with Pulumi or equivalent IaC tools for cloud deployment.
- Solid understanding of data clean room concepts in cloud data warehousing, including privacy and compliance considerations.
- Strong debugging skills for complex data pipelines and cloud environments.
- Excellent communication and documentation abilities.