Lead Data Engineer overseeing the migration from Redshift to AWS RDS and mentoring data analysts in Python and DataOps practices.
Responsibilities
Lead New Data Pipeline Delivery: Implement the new Python-based ETL pipelines that move data from Luna’s microservice DynamoDB tables into the RDS transactional database (Postgres OLTP) and on to the RDS data warehouse (Postgres OLAP).
Platform Optimization: Apply expertise from previous data setups to optimize the new Postgres OLTP environment for transactional performance and the OLAP environment for analytics workloads, keeping ongoing cost management in view for both.
Standard Setting: Establish and enforce DataOps standards, including version control (Git), automated CI/CD deployment, and schema migration management using tools like Liquibase or Drizzle ORM.
Hands-on Coaching: Actively mentor team members, elevating their skills in Python, Git, and engineering workflows through code reviews and workshops.
Code Quality: Conduct rigorous code reviews, providing detailed, educational feedback that explains best practices and clearly outlines required changes to elevate team standards.
AI Engineering patterns: Support the definition and implementation of AI tooling and practices to augment the Data engineering team.
Initiative Planning: Break down larger reporting initiatives into manageable epics and stories within our Agile framework (Scrum for larger work items and Kanban for smaller, continuous-flow items).
Stakeholder Management: Network with business domains to capture requirements and provide strategic guidance on the data platform migration plan.
Product Partnership: Support the Data Product Owner by providing the technical context necessary to prioritize the team's backlog effectively.
Requirements
Data Engineering - 7 to 10 years' experience building and supporting full-stack data pipelines from source to reporting.
AWS Mastery - 5+ years of deep AWS experience, ideally including administration and optimization of RDS/Aurora (Postgres) and Redshift.
Python - 5+ years of expert-level Python, with specific experience building ETL pipelines using libraries like Pandas and orchestration tools like Airflow or Prefect.
PostgreSQL - 5+ years of expert-level SQL and database architecture (partitioning, indexing) for both OLTP and OLAP workloads.
DataOps / Tools - 3+ years of expertise in Git, with a solid understanding of branching strategies and workflows, and schema migration management (e.g., Liquibase, Flyway, or Drizzle).
Benefits
25 days holiday allowance + bank holidays
Share scheme
A £1,000 flexifund to spend on a personalised list of benefits such as gym membership, the Cycle to Work scheme, and a health, dental and optical cash plan.