Data Engineer creating clean, reliable data pipelines for Plenti, a fintech lender. Working with modern tools such as AWS and Databricks to enhance data quality and analytics.
Responsibilities
Design and implement reliable ELT/ETL data pipelines, using Airbyte for ingestion and dbt for data modelling and transformation
Configure and monitor software-defined data assets using Dagster. Troubleshoot pipeline failures and ensure SLAs for data freshness are met
Develop and optimize Databricks (Delta Lake) tables. Write efficient Spark/SQL queries to handle large datasets
Write clean, maintainable Python scripts for custom data extraction or utility tasks. Contribute to the team's codebase via Git/GitHub
Monitor containerized workloads on AWS EKS (Kubernetes). Assist in debugging pod failures and resource bottlenecks
Implement tests (dbt tests, Great Expectations) to catch data anomalies early and ensure trust in our data products
Participate in on-call rotation for critical incidents and drive post-mortems to prevent recurrence
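To give a concrete flavour of the freshness-SLA monitoring mentioned above, a check like this can be sketched in plain Python (all names here are illustrative, not Plenti's actual code):

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """Return True when the asset's newest load is within the SLA window."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag

# Example: a table loaded 30 minutes ago against a 1-hour freshness SLA.
loaded = datetime.now(timezone.utc) - timedelta(minutes=30)
print(is_fresh(loaded, timedelta(hours=1)))  # True
```

In practice this kind of rule is usually expressed declaratively (e.g. as a Dagster freshness policy or a dbt source freshness check) rather than hand-rolled, but the underlying comparison is the same.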
Requirements
5+ years’ experience in a Data Engineering role
Strong SQL skills with experience in complex queries, performance tuning, and data modelling (dimensional models, wide analytical tables, and curated “gold” datasets)
Strong Python skills for data manipulation, automation, and scripting
Hands-on experience with dbt (models, testing, and documentation)
Exposure to Databricks and reverse ETL tools
Experience with data orchestration tools such as Dagster, Airflow, or Prefect
Working knowledge of cloud platforms, ideally AWS (S3, EC2, etc.), with exposure to Azure or GCP also valued
Familiarity with Git and modern engineering practices (CI/CD, code reviews, Infrastructure as Code basics)
Experience using AI-assisted development tools (e.g. GitHub Copilot, Cursor)
Developing ML and computer vision solutions for a cutting-edge autonomous vehicle dataset pipeline at Mobileye. Collaborating across teams on data curation and advanced perception algorithms.
Data Migration Lead in a hybrid role managing data migration for a major transformation programme in the media sector. Collaborating with various teams to ensure data integrity and successful migration.
Consultant ML & DataOps at Smile integrating data science projects for major clients. Designing MLOps solutions and enhancing data governance in a collaborative environment.
Data Engineer developing and maintaining data pipelines for Coolbet’s analytical services. Working within an Agile framework to ensure data reliability and efficiency.
API Data Engineer developing innovative data-driven solutions and advancing data architecture for AI Control Tower. Building and integrating APIs and data pipelines to support organizational needs.
Journeyman Data Architect supporting Leidos' enterprise data and analytics program for the Department of War. Collaborating on solutions for data architecture, cloud environments, and governance.
Senior Software Engineer developing backend services and data infrastructure for integrated products at Booz Allen. Collaborating with a small elite team to deliver reliable and scalable services.
AWS Streaming Data Engineer developing software and systems in a fast, agile environment. Utilizing experience with real-time data ingestion and processing systems across distributed environments.
Mid-level Data Engineer ensuring efficient data transformation and integration for data annotation projects. Collaborating with teams to optimize data quality and performance in pipeline operations.
Cloud Data Platform Administrator supporting deployment and operations of a modern Enterprise Data Platform. Focusing on AWS infrastructure, CI/CD integration, and secure cloud operations.