Data Engineer
HireArt
Date: 1 day ago
City: Toronto, ON
Contract type: Contractor

Expected compensation: 85.00 USD per hour
HireArt is helping our client find an experienced Data Engineer to play a key role in building and maintaining scalable data infrastructure.
In this role, you’ll design and manage data transport, collection, and storage systems while delivering services that treat data as a first-class product.
The ideal candidate is a collaborative problem-solver with strong technical expertise who thrives on building reliable, scalable data solutions that drive business impact.
As a Data Engineer, You Will
- Own and optimize core data pipelines to ensure resilience, performance, data quality, and seamless feature onboarding.
- Evolve data models and schemas to meet changing business and engineering needs.
- Implement monitoring systems to improve data quality and consistency.
- Develop self-service tools for data pipeline (ETL) management and optimize SQL queries.
- Contribute to the technical roadmap, aligning projects with team and stakeholder goals.
- Write clean, scalable, and cost-efficient code; conduct code reviews to maintain standards.
- Support high availability and reliability by participating in on-call rotations.
- Collaborate with internal and external teams to resolve blockers and deliver results.
Requirements
- 3+ years of professional experience in data engineering (6–8 years preferred)
- Proven ability to collaborate with cross-functional stakeholders (analytics, science, product, engineering)
- Skilled in designing complex data models and resilient pipelines
- Proficiency in SQL with ability to implement complex business logic
- Strong knowledge of MPP systems and ETL using Spark or similar technologies
- Proficient in Python, Java, or similar languages for data transformation scripting
- Hands-on experience with workflow management tools (e.g., Airflow, Flyte)
- Experience in data validation, visualization, and communicating insights effectively