Data Platform Lead at TF Bank responsible for designing a cloud-native data platform. Based in Málaga, overseeing architecture, governance, and data quality across the bank.
Responsibilities
Lead the design, development, and operation of Avarda’s modern data platform, ensuring it is cloud-native, event-driven, and fit for both operational and analytical use cases.
Define and enforce data contracts, standards, and documentation to provide trusted, reusable, and accessible data products.
Build and maintain the canonical “single view of the consumer” that underpins both analytics and production systems.
Establish practical governance practices for metadata, lineage, and data quality across domains.
Collaborate with the Product Manager for Shared Services and senior stakeholders (Engineering, Analytics, Country Managers) to align platform capabilities with business needs.
Balance compliance and innovation by working with the Data Protection Officer and Compliance to ensure data privacy and regulatory requirements are met.
Contribute hands-on to development, ensuring the right technical patterns and tools are adopted.
Mentor engineers and promote best practices in data modeling, event-driven design, and cloud-native development.
Author and maintain technical documentation and architecture decision records to support transparency and scalability.
Requirements
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
5+ years of professional experience in data engineering or platform engineering, with at least 2 years in a senior or lead role.
Proven expertise designing and implementing cloud-native, event-driven data platforms (Azure, AWS, or GCP).
Strong skills in data modeling, ETL/ELT pipelines, and streaming architectures (e.g., Kafka, Event Hubs).
Experience with metadata management, lineage, and data quality frameworks.
Solid understanding of microservices, APIs, and data product thinking.
Proficiency in SQL and at least one modern programming language (Python, Scala, Java, .NET).
Familiarity with DevOps practices including CI/CD pipelines, containerization, and infrastructure-as-code.
Excellent communication and stakeholder management skills, with the ability to bridge business and technical domains.
Software Developer in Test working on a cloud-based data platform at Tecsys. Ensuring the quality and reliability of data pipelines and transformations using automation frameworks.
Data Engineer responsible for designing, building, and optimizing data pipelines and architectures in a tech environment. Requires extensive experience with modern data warehousing and cloud platforms.
Lead Data Engineer role at Brillio focusing on AI & Data Engineering with expertise in Azure and MS Fabric. Collaborate within the Data Engineering team in Pune, Maharashtra, India.
Data Architect at Whiteshield designing scalable, secure data architectures for national and enterprise transformation programs. Architecting modern data platforms to support analytics, AI, and operational use cases.
Data Engineer managing scalable data ecosystems for actionable business intelligence and cross-functional stakeholder collaboration. Optimizing ETL/ELT pipelines and ensuring data integrity and security.
Data Engineer specializing in data architecture and solutions for a banking environment, driving value for customers through innovative engineering practices and technologies in data management.
Technical Lead for data engineering and reporting in healthcare technology at Dedalus. Shaping innovative software solutions and leading cross-functional technical teams in Australia.
Senior ML Data Engineer working on data pipeline curation for Mobileye's autonomous vehicle dataset. Collaborating across teams to enhance ML engineering and vision model applications.
Data Engineer managing customer datasets to enhance industrial research and development. Responsible for ETL pipelines and data ingestion for the Uncountable Web Platform.
Data Engineer designing and maintaining scalable data solutions on Databricks for clinical trials. Collaborating with teams to overcome data challenges and ensure the smooth logistics of clinical supplies.