Elevate your career as a Data Engineer, specializing in the design and optimization of data pipelines to support innovative AI-driven applications and data initiatives.
In this pivotal role, you will build robust data infrastructure and collaborate with cross-functional teams to enhance data-driven decision-making.
Your focus will be on integrating multiple data sources while upholding data quality and security standards.
Your contributions will have a profound impact on the organization’s pursuit of intelligent solutions.
Key Responsibilities:
• Develop and optimize ETL/ELT data workflows
• Enhance data storage solutions and architectures
• Integrate APIs and manage batch/streaming data
• Troubleshoot and improve system performance
• Collaborate in data modeling projects with AI/ML teams
Requirements:
• Expertise in Python, SQL, and Scala for data tasks
• Hands-on experience with ETL frameworks like Apache Airflow
• Proficiency with Big Data technologies and cloud platforms
• Familiarity with both relational and NoSQL databases
• Strong problem-solving skills and effective communication
Utilize your data engineering skills to forge innovative solutions and contribute to AI applications that reshape the landscape of data utilization across industries.
📌 Dynamic Data Engineer Creating Scalable Solutions for AI Applications (Toronto)
🏢 TheAppLabb
📍 Toronto