Data Engineer role at DyFlex Solutions, designing enterprise-scale data solutions for client projects using cloud platforms and modern data frameworks. Hands-on technical delivery combined with client-facing consulting in a hybrid work setup.
Responsibilities
Design, build, and optimise enterprise-scale data solutions
Build and maintain scalable data pipelines for ingesting, transforming, and delivering data
Manage and optimise databases, warehouses, and cloud storage solutions
Implement data quality frameworks and testing processes to ensure reliable systems
Design and deliver cloud-based solutions (AWS, Azure, or GCP)
Take technical ownership of project components and lead small development teams
Engage directly with clients, translating business requirements into technical solutions
Champion best practices including version control, CI/CD, and infrastructure as code
Requirements
Hands-on data engineering experience in production environments
Strong proficiency in Python and SQL; experience with at least one additional language (e.g. Java, TypeScript/JavaScript)
Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka, or Flink
Background in building ML pipelines, MLOps practices, or feature stores is highly valued
Proven expertise in relational databases, data modelling, and query optimisation
Demonstrated ability to solve complex technical problems independently
Excellent communication skills with ability to engage clients and stakeholders
Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Benefits
Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
A flexible and supportive work environment including work from home
Competitive remuneration and benefits including novated lease, salary packaging, birthday leave, additional purchased leave, remote working, a wellbeing programme, and a company-provided laptop
Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
Structured career advancement pathways with mentoring from senior engineers
Exposure to diverse industries and client environments
AWS Data Engineer designing data models and supporting data architecture for various clients at EXL. Collaborating to deliver data solutions for improved business outcomes in a hybrid work environment.
Senior Data Engineer at Noda creating scalable data solutions for smarter, sustainable buildings. Collaborating with teams to optimize data for high-performance analytics.
Leading Technology Consulting as Associate Director at Protiviti, focusing on Microsoft Fabric and Databricks. Strengthening client relationships through analytics and mentoring teams in consulting engagements.
Senior Consultant position at Protiviti mentoring teams on data analytics and engineering solutions using Microsoft technologies, enhancing efficiency and client relationship management.
GCP Data Engineer specializing in data governance, architecture, and quality. Collaborates in a hybrid environment across multiple locations in Mexico.
Director of Data Engineering leading data architecture and analytics at Petfolk. Overseeing data infrastructure and managing a data team to drive AI and business intelligence solutions.
Senior Data Engineer managing end-to-end data pipelines with Google Cloud Platform. Collaborating closely with product teams to deliver scalable data solutions in a hybrid setting.
GCP Data Engineer designing, building, and optimising data solutions on Google Cloud Platform. Collaborating with clients to solve complex data challenges and enhance AI capabilities.
Data Engineer developing scalable data solutions across multi-cloud environments for clients. Mentoring junior engineers while ensuring data quality and promoting best practices within the team.