Elevate your career as a Senior Data Engineer in a hybrid model. Drive the design and optimization of data pipelines and lakehouse infrastructure to empower analytics and machine learning.
In this pivotal role, you'll work closely with cross-functional teams and the Data Engineering Manager. Your focus will be on building and scaling the enterprise data platform, ensuring high-quality data workflows, APIs, and data integrity. You'll be a strong individual contributor who takes ownership of complex challenges and influences data architecture.
Key Responsibilities:
• Design and maintain scalable data pipelines across lakehouse layers
• Optimize workflows within Databricks and Delta Lake
• Build RESTful APIs for data access and consumption
• Write reliable SQL and PySpark transformations
• Contribute to monitoring and data quality frameworks
Requirements:
• 4+ years of hands-on data engineering experience
• Solid understanding of pipeline design patterns
• Proficiency with Databricks and Medallion Architecture
• Experience with orchestration tools for production pipelines
• Strong SQL skills and knowledge of Azure services
Shape data transformation across analytics and ML models, making a significant impact in this dynamic environment.
Apply on Kit Job: kitjob.ca/job/2fsnfm
📌 Hybrid Senior Data Engineer Role (Calgary)
🏢 Quorum Software
📍 Calgary