Data Architect designing conceptual data models and enterprise architecture solutions for Brillio. Leading integration patterns and collaborating with stakeholders to enhance data management.
Responsibilities
Conduct assessment and discovery to define the enterprise architecture strategy and roadmap
Lead the design of best-in-class conceptual data models that translate into implementable solution designs across the technical ecosystem
Guide planning of data movement and integration patterns across our ecosystem, defining integration standards, reference architectures, and tool usage
Provide leadership and standard methodologies for preferring native-first connectors over third-party tools when integrating our various workflows, applications, and data
Support master data solution architectures, system landscapes, governance workflows, and systems integration and implementation
Collaborate with business stakeholders and technical teams to understand data management requirements, data quality challenges and remediation needs, and identify opportunities to design a best-of-breed solution
Produce relevant solution design artifacts including but not limited to technical estimates, conceptual system designs, sequence diagrams, and data models
Act as an advisor for new and emerging technologies, patterns, and trends
Requirements
13+ years of data architecture or integration architecture experience
Experience creating ETL and reverse-ETL patterns and services to handle data from various data sources, formats, and use cases
Demonstrable experience with enterprise-wide architectures, integration, and warehousing; highly proficient in relational database design, data streams, data modeling, and SQL
Experience with streaming, batch and micro-batch data processing and workflows
Experience leading enterprise assessments, data quality roadmaps, and implementations
Hands-on experience with advanced SQL, Python, Spark, and Azure technologies
Ability to drive the practical evolution and innovation of infrastructure, processes, products, and services by influencing decision makers, implementers, stewards, and owners on the direction to take
Must be a master communicator, able to lead a team and develop solutions to problems
Data Engineer role focusing on migrating legacy systems to ADA at BBVA. Collaborate with multidisciplinary teams and ensure system integrity during transitions.
Senior Data Engineer focused on modernizing enterprise data capabilities at U.S. Bank. Designing and building reusable data engineering patterns for consistent delivery across teams.
Experienced Data Architect designing and implementing scalable data architecture for a financial services and healthcare technology company. Collaborating across teams to support analytics and operational needs.
Senior Data Engineer at SS&C building and optimizing data pipelines in a lakehouse environment. Collaborating with data architects and stakeholders in the financial services sector.
Principal Data Pipeline Lead at SS&C overseeing development of scalable data pipelines. Leading a small team and providing technical guidance for modern data platform integration.
Data Architect designing scalable, secure data architectures for fraud detection and risk management at Fiserv. Collaborating with cross-functional teams and managing large datasets and pipelines.
Director of Engineering overseeing development of AI-driven data platforms at LVT. Leading teams to transform sensor data into actionable insights using modern architecture and technologies.
Senior Data Engineer at Independence Pet Holdings shaping the data ecosystem by building platforms and pipelines. Collaborating with teams to enhance data analytics and operational insights.
Senior Data Engineer designing and developing scalable data pipelines for a fintech company. Collaborating with stakeholders to ensure analytics-ready data formats and supporting batch and streaming processes.
Senior Data Engineer at Vancity designing, building, and optimizing scalable data pipelines. Collaborating closely with analytics and business teams to deliver trusted data products while ensuring high standards of data quality.