Databricks Developer
LABUR
Date: 1 week ago
City: Toronto, ON
Contract type: Contractor

We respectfully request that third parties refrain from contacting us regarding this posting.
Overview
Our client is seeking a Databricks Developer who will be responsible for designing, developing, and maintaining scalable data pipelines and solutions using Databricks. The ideal candidate will have a strong background in data engineering, experience with big data technologies, and a deep understanding of Databricks and Apache Spark.
Key Responsibilities
- Design and develop scalable data pipelines and ETL processes using Databricks and Apache Spark.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Optimize and tune data pipelines for performance and scalability.
- Implement data quality checks and validations to ensure data accuracy and consistency.
- Monitor and troubleshoot data pipelines to ensure reliable and timely data delivery.
- Develop and maintain documentation for data pipelines, processes, and solutions.
- Implement best practices for data security, governance, and compliance.
- Participate in code reviews and contribute to continuous improvement.
Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Experience in data engineering or a related field.
- Strong experience with Databricks and Apache Spark.
- Proficiency in programming languages such as Python, Scala, or Java.
- Experience with big data technologies such as Hadoop, Hive, and Kafka.
- Strong SQL skills and experience with relational databases.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of data warehousing concepts and technologies.
- Experience with version control systems such as Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Experience with Delta Lake and Databricks Delta.
- Experience with data visualization tools such as Power BI, Tableau, or Looker.
- Knowledge of machine learning and data science concepts.
- Experience with CI/CD pipelines and DevOps practices.
- Certification in Databricks, AWS, Azure, or Google Cloud.
How to apply
To apply for this job, you need to sign in on our website. If you don't have an account yet, please register.