Senior Data Engineer responsible for migrating and modernising data platforms in banking, rebuilding a critical data platform with a focus on risk and core financial data flows.
## Responsibilities

- Play a central role in the **end-to-end migration and modernisation** of the data platform
- Translate legacy ETL logic from SSIS and stored procedures into modern **ELT pipelines using dbt**
- Implement **Data Vault 2.0 structures**, including Raw Vault and Business Vault
- Build **datamarts and curated datasets** for downstream analytics and reporting
- Design and operate workflows using **Dagster**, including scheduling, dependencies, and recovery mechanisms
- Deploy and run data workloads on **OpenShift / Kubernetes environments**
- Enable **near real-time data processing** using Kafka-triggered pipelines
- Integrate with upstream data lake environments and external data providers
- Establish robust **data validation and reconciliation processes**
- Implement automated testing and monitoring using dbt
- Support production pipelines and resolve incidents when required
- Create clear documentation and ensure operational readiness
- Continuously improve performance, reliability, and maintainability
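To illustrate one of the responsibilities above, here is a minimal sketch of Data Vault 2.0-style hash keys and hashdiffs in Python. It assumes the common convention of MD5 over normalised business keys joined with a `||` delimiter; the entity and column names (`IBAN`, `status`, `currency`) are hypothetical, not from the posting.

```python
import hashlib


def hash_key(*business_keys: str) -> str:
    """Hub hash key: trimmed, upper-cased business keys joined with a
    delimiter, then MD5-hashed (a common DV 2.0 convention, assumed here)."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()


def hashdiff(attributes: dict) -> str:
    """Satellite hashdiff over descriptive attributes, sorted by column
    name so that attribute order does not change the hash."""
    payload = "||".join(f"{k}={str(v).strip()}" for k, v in sorted(attributes.items()))
    return hashlib.md5(payload.encode("utf-8")).hexdigest()


# Hypothetical account hub: hash key derived from its business keys
hub_account_hk = hash_key("DE89370400440532013000", "BANK-A")

# Hashdiff flags attribute changes; key order alone does not alter it
v1 = hashdiff({"status": "open", "currency": "EUR"})
v2 = hashdiff({"currency": "EUR", "status": "open"})
assert v1 == v2
```

In a dbt-based ELT pipeline these computations would normally live in SQL models rather than Python, but the hashing logic is the same.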
## Requirements

- Strong experience with **SQL Server and T-SQL**, including performance optimisation
- Proven hands-on experience with **dbt** in production environments
- Solid experience with **workflow orchestration tools**, ideally Dagster
- Practical knowledge of **Data Vault 2.0 modelling concepts**
- Experience working with **container platforms such as OpenShift or Kubernetes**
- Familiarity with **event-driven architectures and Kafka**
- Experience working with **financial data**, ideally in banking or trading environments
- Understanding of **risk and PnL data structures** is a strong advantage
- Strong ownership mindset with the ability to work independently
- Structured, pragmatic, and delivery-focused
- Comfortable operating in complex and regulated environments
- Clear communicator across both technical and business stakeholders