Senior Data Engineer shaping how data drives processes at Phoenix Group. Working with cross-functional teams on cloud platforms including Databricks and Azure.
Responsibilities
Design and implement end-to-end data engineering solutions across multiple platforms, including Azure, Databricks, SQL Server, and Salesforce, enabling seamless data integration and interoperability
Architect and optimize Delta Lake environments within Databricks to support scalable, reliable, and high-performance data pipelines for both batch and streaming workloads
Develop and manage robust data pipelines for operational, analytical, and digital use cases, leveraging best practices for data ingestion, transformation, and delivery
Integrate diverse data sources—cloud, on-premises, and third-party systems—using connectors, APIs, and ETL frameworks to ensure consistent and accurate data flow across the enterprise
Implement advanced data storage and retrieval strategies that support operational data stores (ODS), transactional systems, and analytical platforms
Collaborate with cross-functional teams (data scientists, analysts, product owners, and operational leaders) to embed data capabilities into business processes and digital services
Optimize workflows for performance and scalability, addressing bottlenecks and ensuring efficient processing of large-scale datasets
Apply security and compliance best practices, safeguarding sensitive data and ensuring adherence to governance and regulatory standards
Create and maintain comprehensive documentation for data architecture, pipelines, and integration processes to support transparency and knowledge sharing
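For illustration, here is a minimal sketch of the kind of batch-plus-streaming Delta Lake pipeline described in the responsibilities above. It assumes a Databricks workspace; all paths, table names, and columns (RAW_PATH, bronze.policies, policy_id, and so on) are hypothetical placeholders, not Phoenix Group assets.

```python
# Minimal sketch: ingest raw files into a bronze Delta table with Structured
# Streaming, then publish a deduplicated silver table as a batch step.
# Paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-pipeline-sketch").getOrCreate()

RAW_PATH = "/mnt/raw/policies"                 # hypothetical landing zone
BRONZE_TABLE = "bronze.policies"               # hypothetical Delta tables
SILVER_TABLE = "silver.policies_clean"

# Streaming ingestion with Databricks Auto Loader: pick up new JSON files
# incrementally and append them to the bronze table.
bronze_query = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/schemas/policies")
    .load(RAW_PATH)
    .withColumn("ingested_at", F.current_timestamp())
    .writeStream
    .option("checkpointLocation", "/mnt/checkpoints/policies_bronze")
    .trigger(availableNow=True)                # process available data, then stop
    .toTable(BRONZE_TABLE)
)
bronze_query.awaitTermination()

# Batch refinement: deduplicate and filter the bronze data into a silver table
# that analytical and operational consumers can query.
(
    spark.read.table(BRONZE_TABLE)
    .dropDuplicates(["policy_id"])
    .filter(F.col("status").isNotNull())
    .write.format("delta")
    .mode("overwrite")
    .saveAsTable(SILVER_TABLE)
)
```

The availableNow trigger lets the same streaming code run either continuously or as a scheduled incremental job, which is one common way to serve both batch and streaming workloads from a single pipeline.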
Requirements
Proven experience in enterprise-scale data engineering, with a strong focus on cloud platforms (Azure preferred) and cross-platform integration (e.g., Azure ↔ Salesforce, SQL Server)
Deep expertise in Databricks and Delta Lake architecture, including designing and optimizing data pipelines for batch and streaming workloads
Strong proficiency in building and managing data pipelines using modern ETL/ELT frameworks and connectors for diverse data sources
Hands-on experience with operational and analytical data solutions, including ODS, data warehousing, and real-time processing
Solid programming skills in Python, Scala, and SQL, with experience in performance tuning and workflow optimization
Experience with cloud-native services (Azure Data Factory, Synapse, Event Hubs, etc.) and integration patterns for hybrid environments
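As a concrete example of the Event Hubs integration pattern mentioned above, the sketch below streams events into a bronze Delta table through the Event Hubs Kafka-compatible endpoint on Databricks. The namespace, topic, secret scope, and table names are hypothetical, and the connection string is read from a Databricks secret rather than hard-coded.

```python
# Minimal sketch: stream events from Azure Event Hubs (Kafka-compatible
# endpoint) into a bronze Delta table on Databricks.
# Namespace, topic, secret scope, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("eventhubs-sketch").getOrCreate()

namespace = "example-namespace"   # hypothetical Event Hubs namespace
# dbutils is available in Databricks notebooks; the scope/key are placeholders.
conn_str = dbutils.secrets.get("example-scope", "eh-connection-string")

raw_events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", f"{namespace}.servicebus.windows.net:9093")
    .option("subscribe", "policy-events")      # hypothetical event hub / topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="$ConnectionString" password="{conn_str}";',
    )
    .load()
)

# Keep the payload and event time; downstream jobs parse the JSON payload.
(
    raw_events
    .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
    .writeStream
    .option("checkpointLocation", "/mnt/checkpoints/policy_events_bronze")
    .toTable("bronze.policy_events")
)
```

Routing Event Hubs traffic through its Kafka endpoint keeps the pipeline on the standard Spark Kafka source, which tends to simplify hybrid setups where the same code also consumes from on-premises Kafka clusters.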