Apply on Kit Job: kitjob.ca/job/2g9ego
Data Architect

Job Purpose & Summary

The Data Architect is responsible for contributing to the design and execution of DUCA’s enterprise data architecture and analytics strategy. This role leads the design, development, and optimization of DUCA’s data warehouse (Lodestar) and associated data platforms, ensuring data is reliable, accessible, and tactically leveraged across the organization to generate insight and enable self‑service.
The role guides the business on how to maximize value from data, analytics, and AI‑enabled tools through hands‑on technical expertise. The Data Architect collaborates closely with Business Systems Analysts, Technology teams, and business leaders to deliver high‑quality data models, pipelines, dashboards, and insights that drive improved decision‑making and operational excellence.
This role is accountable for data architecture, ingestion frameworks, data modelling, quality governance, analytics leadership, and enabling the organization to leverage advanced analytics and AI to increase efficiency and impact.
Key Accountabilities & Duties

Data Governance & Architecture Design
Build and maintain DUCA’s enterprise data architecture, including target‑state design and standards, and execute on DUCA’s modernization roadmap.
Lead the evolution, expansion, and optimization of the Lodestar data warehouse and related platforms.
Establish best practices for data modelling, metadata, lineage, and semantic layers across all data domains.
Collaborate with business and technology leaders to prioritize data ingestion and integration initiatives aligned to DUCA’s strategic goals.
Champion data governance, including data quality standards, ownership frameworks, access controls, and stewardship practices.
Serve as DUCA’s subject matter expert on enterprise data structure, architecture, and analytical tooling.
Lead cross‑functional collaboration between Business Systems Analysts, Data Stewards, and business partners to improve data literacy and maximize the value of enterprise data.
Data Engineering & Platform Development
Design, build, and optimize data ingestion pipelines, ensuring reliable, timely, and secure data delivery.
Oversee data quality processes, observability, monitoring, and controls to ensure trust in core data assets.
Ensure all architecture and pipelines are documented and maintained in accordance with industry standards.
Collaborate with system owners to ensure data is integrated seamlessly across platforms.
Analytics, Reporting & AI‑Enabled Insights
Lead the development of dashboards, KPIs, and analytical assets that provide actionable insights for business units and leadership.
Drive adoption of self‑serve analytics through well‑structured semantic models and governed data sets.
Identify and implement opportunities to incorporate Artificial Intelligence to automate analytics workflows, increase efficiency, and enhance decision‑making capabilities.
Act as a partner to business leaders by translating data into insight‑supported recommendations.
Occupational Experience & Education Requirements
Undergraduate degree in Computer Science, Data Science, Engineering, Business Technology, or related field.
7+ years of progressive experience in data architecture, data engineering, or enterprise analytics.
Demonstrated experience designing and managing enterprise data warehouses, preferably in a financial services environment.
Hands‑on experience building and maintaining data pipelines, data models, BI solutions, and analytics platforms.
Experience with AI/ML technologies and integrating AI‑assisted automation within analytics workflows.
Experience with banking systems or credit union environments is an asset.
Relevant certifications in data architecture, cloud, analytics, or AI are considered an asset.
Knowledge, Skills & Attributes

Technical Skills
Advanced SQL and strong proficiency in data modelling techniques (e.g., star schema, normalized models, semantic modeling).
Technical proficiency in modern data stack technologies, including ETL/ELT tools.
Proficiency with analytics and visualization tools such as Power BI or Oracle Analytics.
Working knowledge of data warehouse platforms such as Lodestar (or similar).
Strong understanding of database architecture, data pipelines, APIs, and integration patterns.
Skilled in Python or R for advanced analytics and automation (preferred).
Familiarity with AI/ML tools, cloud platforms, and modern data architecture paradigms (e.g., lakehouse, event‑driven data).
Understanding of the ITIL framework is considered an asset.
Attributes & Competencies
Strong communication skills with the ability to translate complex data into clear business insight.
Critical thinker with a hands‑on mindset.
Excellent problem‑solving and analytical skills.
Ability to build strong relationships across technical and non‑technical teams.
Curiosity, continuous learning mindset, and a passion for enabling data‑driven cultures.
Working Conditions

Normal office workplace with occasional after‑hours work required to support production deployments or critical data initiatives.
Department:
Information Technology
Primary Location:
Corporate Office
Employment Status:
Full‑time
Hours per Week:
38
Salary:
$87,445 – $120,741 annually. Actual annual base salaries will vary depending on relevant job‑related factors such as experience, knowledge, skills, qualifications, and education/training. This position may be eligible for discretionary bonuses.
Number of Existing Vacancies:
1
DUCA is committed to employment equity and encourages applications from all qualified candidates. Recruitment related accommodations will be provided upon request.
Position: Data Architect
Company: DUCA
Location: Toronto