19 Apr | Myticas Consulting | Toronto
Apply on Kit Job: kitjob.ca/job/2g950l
Qualifications
- Strong hands-on experience with Databricks (PySpark, Delta Lake, notebooks) – core requirement
- Proven ability to build and optimize ETL/ELT data pipelines in a lakehouse setting
- Experience with Azure Data Lake Storage (ADLS Gen2) for scalable data storage
- Hands-on development using Azure Data Factory (ADF) for orchestration and pipelines
- Experience with Azure Functions for serverless data processing
- Solid understanding of lakehouse architecture (Databricks + ADLS integration)
- Strong proficiency in Python and SQL for data transformation and pipeline logic
- Experience with data modeling, partitioning, and performance optimization
- Familiarity with Databricks Unity Catalog (data governance, access control)
- Experience integrating Databricks with Snowflake, APIs, or downstream BI systems
- Exposure to CI/CD pipelines (Azure DevOps, GitHub Actions) for data workflows
- Experience with infrastructure-as-code tools (Terraform or similar) is an asset
- Familiarity with orchestration tools (Airflow, dbt, or ADF pipelines)
- Strong communication skills with client-facing / consulting experience
- Databricks certifications preferred (strong indicator of hands-on expertise)
📌 Databricks Engineer (Data Engineer) (Toronto)