Data Engineer optimizing data architecture and pipelines at Kantox, a fintech company. Collaborate with teams to develop data products within a modern Lakehouse environment.
Responsibilities
Build and maintain high-performance, tested and well-documented dbt models, following best practices.
Write efficient, maintainable, and scalable SQL transformations across different layers of the data stack.
Collaborate with analytics and domain teams to translate data needs into well-modeled, trusted datasets.
Help monitor and improve the quality, reliability, and performance of data pipelines.
Integrate new data sources into the platform through ingestion pipelines (batch or streaming).
Implement data tests and CI/CD validations, and participate in peer reviews to ensure code quality.
Contribute to data product development within a Data Mesh architecture.
Participate in agile rituals and collaborate with cross-functional teams.
Requirements
2+ years of experience in Data Engineering or a similar role.
Strong experience with SQL and dbt, and a passion for clean, efficient code.
Solid understanding of data modeling principles (dimensional, star schema, SCDs…).
Experience working with big data tools like Trino, Spark, or similar query engines.
Familiarity with batch and/or streaming pipelines, using technologies such as Kafka or RabbitMQ.
Experience writing data tests and following version control workflows (e.g., Git).
Basic proficiency in Python (used for scripting, testing, or orchestration logic).
Familiarity with data quality tools like dbt tests, Great Expectations, or Soda is a plus.
Exposure to orchestration tools like Dagster, Airflow, or Prefect is a plus.
Comfortable working in a collaborative and agile environment, using tools like Jira, GitHub and Slack.
Curious, pragmatic, and always looking to learn and improve.
Fluent in English.
Permission to work within the EU is a plus.
Benefits
Competitive salary 💰
Sponsored learning budget
Free private health insurance
Free Spanish, English, French and Catalan lessons
Relocation package if needed
Flexible working hours + short Fridays
Hybrid work model
29 days of annual vacation 🌴
Gym discounts and free sports activities 💪
Restaurant Ticket with monthly credit and regular cross-team lunches
Fresh fruit and unlimited coffee 🍇☕️
Pizza Fridays 🍕
Beautiful office with incredible 360-degree views of Barcelona ☀️
Senior Data Engineer at Keyrus leading the design, development, and delivery of scalable data platforms. Collaborating with teams to translate requirements into production-grade solutions and mentoring engineers.
Senior Data Engineer for global payments platform designing ETL pipelines and data models. Collaborating across teams to tackle complex data challenges in an innovative fintech environment.
Data Warehouse Modelling Engineer designing and maintaining data models using Data Vault 2.0 for iGaming industry. Collaborating with stakeholders and optimizing data models in a hybrid work environment.
Senior Data Engineer driving impactful data solutions for the climate logistics startup HIVED's core data platform. Collaborating with cross-functional squads to enhance analytics and delivery.
Data Engineer developing and maintaining CRE forecasting infrastructure for Cushman & Wakefield. Collaborating with senior economists and technical teams to ensure high-quality data solutions.
Data Engineer at PwC, engaging with Azure cloud services to enhance data handling and integrity. Responsibilities include pipeline optimizations, documentation, and collaboration with stakeholders.