DataOps Engineer at Eeze focusing on data pipeline stability across multiple products. Collaborating with IT teams to maintain quality, observability, and operational efficiency.
Responsibilities
Ensure the reliable and timely execution of daily data pipelines and scheduled workflows.
Operate and maintain internal data services, including ingestion layers, OLAP/lake storage, materialised views, and task dependencies.
Contribute to CI/CD workflows for data pipelines and participate in deployments, version management, and change control.
Monitor orchestration systems (e.g., Airflow), troubleshoot pipeline failures, delays, and anomalies, and drive continuous performance improvements.
Implement and maintain data quality checks, anomaly detection, schema validation, and audit processes.
Collaborate with Data Engineers on table lifecycle management, storage optimisation, partitioning strategies, and schema evolution.
Work with IT Infrastructure and IT Operations teams to improve platform observability, including logging, metrics, and alerting.
Develop and maintain SOPs, platform standards, best practices, and troubleshooting documentation.
Provide operational support to internal users (DE/DA/DS/Ops) for issues such as query performance, missing data, or inconsistent KPIs.
Requirements
2+ years of experience in DataOps, Data Engineering, BI Engineering, or a similar operational data role.
Experience with CI/CD workflows, Docker, Kubernetes, or other DevOps-related practices.
Hands-on experience with workflow orchestration tools such as Airflow (or equivalent).
Familiarity with mainstream data engineering technologies such as Kafka, Spark, Flink, Delta Lake, Iceberg, Hudi, ClickHouse, or Doris.
Good understanding of data warehousing concepts, including partitioning, schema evolution, table lifecycle management, and OLAP vs. data lake architectures.
Strong SQL skills and familiarity with Python for scripting, automation, or validation.
Strong debugging and problem-solving skills, especially for data anomalies and pipeline failures.
Comfortable working cross-functionally with DE/Infra/Ops/DA/DS teams in a fast-paced environment.
Senior Data Engineer at Goodwin enhancing data platforms and fostering a data-driven culture across teams. Collaborating with IT and Finance on technology solutions and data governance practices.
Director, Data Platform Design and Strategy at MedImpact leading data platform and AI innovations to enhance healthcare services. Overseeing enterprise projects and managing teams to meet strategic goals.
Data Engineer delivering AI- and data-driven solutions for Honeywell’s industrial customers. Architecting and implementing scalable data pipelines and platforms focused on IoT and real-time data processing.
Data Engineering Associate focusing on data quality control and management for a distribution platform. Collaborating on large-scale data projects to ensure data accuracy and availability for users.
Data Architect managing an enterprise data platform built on Microsoft Fabric at Johnstone Supply. Leading architectural standards and collaborating with business and IT leaders for strategic data-driven insights.
Data Engineer at Studyportals responsible for data pipelines and infrastructure. Join a team ensuring accurate and trustworthy data for analytics and business decisions.
AI/ML Engineer designing and refining prompts and workflows using large language models. Responsible for developing data pipelines and delivering scalable AI solutions in a hybrid work environment.
AWS Data Architect at Fractal designing and operationalizing AWS data solutions at enterprise scale. Collaborating with clients and mentoring engineers in best practices.
Senior Data Engineer driving data-driven success at Pacific Life. Collaborating with a team to build scalable and secure data solutions in Newport Beach, CA or Charlotte, NC.
Data Architect managing Commercial Data architecture initiatives for Valmet's sales and service team. Leading AI-driven data integrity and quality efforts in a global context.