AWS Databricks Developer

Cognizant


Date: 12 hours ago
City: Mississauga, ON
Contract type: Full time

In This Role, You Will

  • Develop and optimize data pipelines using Spark in Scala to ensure efficient data processing and analysis.
  • Implement and manage Delta Sharing to facilitate secure and efficient data exchange across platforms.
  • Administer Databricks Unity Catalog to maintain organized and accessible data assets.
  • Utilize Databricks CLI for streamlined management of Databricks resources and workflows.
  • Design and deploy Delta Live Tables pipelines to automate data processing and ensure data quality.
  • Oversee structured streaming processes to enable real-time data analytics and insights.
  • Apply risk management techniques to identify and mitigate potential data-related risks.
  • Integrate Apache Airflow for orchestrating complex data workflows and ensuring seamless execution.
  • Manage data storage and retrieval using Amazon S3 and Amazon Redshift to support scalable data solutions.
  • Develop Python scripts to automate data tasks and enhance data processing capabilities.
  • Utilize Databricks SQL for querying and analyzing large datasets to derive actionable insights.
  • Implement Databricks Delta Lake for efficient data storage and retrieval, ensuring data integrity.
  • Coordinate Databricks Workflows to streamline data operations and enhance productivity.
  • Collaborate with cross-functional teams to align data solutions with business objectives and drive innovation.
  • Contribute to the company's mission by delivering data-driven solutions that impact society positively.
  • Stay updated with the latest industry trends and technologies to continuously improve data processes.
  • Provide technical guidance and mentorship to junior developers, fostering a collaborative team environment.
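
As a flavour of the first few responsibilities, here is a minimal Spark-in-Scala sketch of an ingest-and-write pipeline. It is illustrative only: the S3 bucket, column names, and Delta table name are hypothetical, and it assumes a Databricks (or similar) runtime with Spark and Delta Lake available.

```scala
// Illustrative sketch only — bucket, schema, and table names are
// hypothetical; assumes a runtime with Spark and Delta Lake on the classpath.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-pipeline")
      .getOrCreate()

    // Read raw JSON events from S3 (hypothetical bucket/prefix).
    val raw = spark.read.json("s3://example-bucket/raw/orders/")

    // Basic cleansing: drop malformed rows and deduplicate on order_id.
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .dropDuplicates("order_id")
      .withColumn("ingested_at", current_timestamp())

    // Write to a Delta table for downstream Databricks SQL queries.
    cleaned.write
      .format("delta")
      .mode("append")
      .saveAsTable("analytics.orders_bronze")
  }
}
```

In practice, a batch read like this would often become a structured-streaming read (`spark.readStream`) or be expressed declaratively as a Delta Live Tables pipeline, depending on latency and data-quality requirements.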

We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role’s business requirements, this is a hybrid position requiring 2-3 days a week in a client or Cognizant office in Mississauga, ON. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.

The working arrangements for this role are accurate as of the date of posting. This may change based on the project you’re engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.

What You Need To Have To Be Considered

  • Possess strong experience in Spark in Scala and Databricks with a proven track record in data engineering.
  • Demonstrate expertise in Delta Sharing and Databricks Unity Catalog administration for efficient data management.
  • Have hands-on experience with the Databricks CLI and Delta Live Tables for streamlined data operations.
  • Show proficiency in structured streaming and risk management to ensure data reliability and security.
  • Exhibit knowledge of Apache Airflow, Amazon S3, and Amazon Redshift for robust data solutions.
  • Be skilled in Python and Databricks SQL for advanced data processing and analysis.
  • Understand Databricks Delta Lake and Workflows for optimized data storage and operations.

We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
