19 Apr | Tredence | Toronto
Apply on Kit Job: kitjob.ca/job/2g8z6m
We are seeking a skilled Azure Databricks Engineer with robust experience in Azure Data Factory (ADF) to design, develop, and maintain scalable data pipelines and analytics solutions on the Azure cloud platform. The ideal candidate will have expertise in big data processing, ETL/ELT workflows, and distributed computing using Databricks.
Key Responsibilities
Design and implement scalable data pipelines using Azure Data Factory (ADF)
Develop and optimize data processing workflows using Azure Databricks (PySpark/Scala)
Build ETL/ELT processes for ingesting, transforming, and loading large datasets
Integrate data from multiple sources such as APIs, databases, and data lakes
Work with Azure Data Lake Storage (ADLS) and Azure SQL Database
Monitor, troubleshoot, and optimize pipeline performance and data quality
Implement data security, governance, and compliance standards
Collaborate with data scientists, analysts, and stakeholders to deliver data solutions
Automate workflows and deployments using CI/CD pipelines
Maintain documentation for data architecture and processes
Required Skills & Qualifications
Bachelor’s degree in Computer Science, Engineering, or related field
10+ years of experience in Azure data engineering
Strong hands-on experience with SQL and relational databases
Experience with big data technologies and distributed processing
Knowledge of data warehousing concepts and dimensional modeling
Familiarity with related Azure data services
Experience with version control (Git) and CI/CD tools
Strong problem‑solving and debugging skills
Azure Databricks Engineer (Toronto)