Data Engineer responsible for creating pipelines and models to support analytics at Trainline. Collaborating with BI Developers and Data Scientists to drive business insights through data.
Responsibilities
Be key to making our data lake more accessible and insightful, breaking down barriers to access by working on new data marts and designing data models that even the most basic SQL users can use
Build data pipelines with Spark or dbt
Use SQL to transform data into meaningful insights
Build and deploy infrastructure with Terraform
Implement DDL and DML with Apache Iceberg
Do code reviews for your peers
Orchestrate your pipelines with DAGs on Airflow
Participate in SCRUM ceremonies (standups, backlogs, demos, retros, planning)
Secure data with IAM and AWS Lake Formation
Deploy your changes with Jenkins and GitHub Actions
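The responsibilities above centre on pipelines orchestrated as DAGs on Airflow. As a hedged illustration only (task names are invented, and this is plain Python rather than Airflow's API), the core idea of scheduling tasks in dependency order can be sketched with the standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of upstream
# tasks it depends on. Names are illustrative, not real pipelines.
pipeline = {
    "extract_bookings": set(),
    "extract_refunds": set(),
    "transform_sales": {"extract_bookings", "extract_refunds"},
    "load_data_mart": {"transform_sales"},
}

# static_order() yields a valid execution order for the DAG,
# so every task runs only after all of its upstream dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Airflow applies the same topological ordering at scale, adding scheduling, retries, and backfills on top.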
Requirements
Proven experience as a Data Engineer using SQL and Python
Previous experience with data lakes in AWS, Glue Catalog and Athena (or equivalent)
Good experience with dbt
Capable of using popular data modelling tools to create a diagram of proposed tables to enable discussion
Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users
Previous experience creating DAGs with Apache Airflow
Ability to work within Agile, considering minimum viable products, story pointing and sprints
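The requirements above emphasise using SQL to turn raw data into insights a data mart can expose. A minimal sketch, assuming an invented bookings table (schema, routes, and fares are illustrative only), using Python's built-in `sqlite3` to run the kind of aggregation such a mart might serve:

```python
import sqlite3

# In-memory database with a hypothetical bookings table;
# table name, columns, and values are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings (route TEXT, fare REAL)")
conn.executemany(
    "INSERT INTO bookings VALUES (?, ?)",
    [("LDN-MAN", 45.0), ("LDN-MAN", 55.0), ("LDN-EDI", 80.0)],
)

# A simple transform: revenue per route, the kind of aggregate
# a data mart exposes so basic SQL users can query it directly.
rows = conn.execute(
    "SELECT route, SUM(fare) AS revenue "
    "FROM bookings GROUP BY route ORDER BY revenue DESC"
).fetchall()
print(rows)  # -> [('LDN-MAN', 100.0), ('LDN-EDI', 80.0)]
```

In practice the same transform would live in a dbt model or Spark job against the lake, but the GROUP BY shape is identical.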
Senior Data Engineer building and maintaining robust data pipelines for various data products at Beep Saúde. Collaborating within the team and leading data governance practices.
Software Developer in Test working on a cloud-based data platform at Tecsys. Ensuring quality and reliability of data pipelines and transformations using automation frameworks.
Data Engineer responsible for designing, building, and optimizing data pipelines and architectures in a tech environment. Requires extensive experience with modern data warehousing and cloud platforms.
Lead Data Engineer role at Brillio focusing on AI & Data Engineering with expertise in Azure and MS Fabric. Collaborate within the Data Engineering team in Pune, Maharashtra, India.
Data Architect at Whiteshield designing scalable, secure data architectures for national and enterprise transformation programs. Architecting modern data platforms to support analytics, AI and operational use cases.
Data Engineer managing scalable data ecosystems for actionable business intelligence and cross-functional stakeholder collaboration. Optimizing ETL/ELT pipelines and ensuring data integrity and security.
Data Engineer specializing in data architecture and solutions for a banking environment, driving value for customers through innovative engineering practices and technologies in data management.
Technical Lead for data engineering and reporting in healthcare technology at Dedalus. Shaping innovative software solutions and leading cross-functional technical teams in Australia.
Senior ML Data Engineer working on data pipeline curation for Mobileye's autonomous vehicle dataset. Collaborating across teams to enhance ML engineering and vision model applications.
Data Engineer managing customer datasets to enhance industrial research and development. Responsible for ETL pipelines and data ingestion for the Uncountable Web Platform.