BI Data Engineer supporting analytics and decision-making for Kpler's products. Responsible for building scalable pipelines and robust data models in a dynamic market landscape.
Responsibilities
Architect and develop scalable, resilient, high-quality data solutions to support analytics, reporting, and machine learning workloads.
Design, implement, and maintain reliable ETL/ELT pipelines that integrate, cleanse, and consolidate data from diverse internal and external sources, including APIs and third-party systems.
Ensure data is accurately ingested, transformed, stored, and made accessible through robust testing frameworks and clearly defined business rules.
Translate business requirements into scalable data models and transformation logic, working with large, multidimensional datasets to surface insights, trends, and opportunities.
Build and maintain the BI reporting codebase, ensuring high standards of data integrity, availability, and performance across the organization.
Create and maintain backend infrastructure that supports analytics platforms, dashboards, and machine learning pipelines.
Develop and execute unit and integration tests for data pipelines and transformation scripts to ensure reliability and consistency.
Proactively identify inefficiencies and bottlenecks in data flows and propose scalable, forward-looking solutions.
Partner closely with product managers, engineers, commercial teams, and business stakeholders to understand strategy and data needs, aligning delivery with the BI roadmap.
Contribute to team growth through code reviews, design discussions, and knowledge sharing, while staying current with industry best practices.
Take ownership of the quality, integrity, and consistency of the BI data codebase and associated datasets.
Ensure ETL efficiency and data availability to meet stakeholder requirements and business SLAs.
Adhere to engineering best practices, including documentation, version control, and maintainable code standards.
Support and deliver against team OKRs and KPIs, contributing to overall team and business success.
Requirements
2-4 years of back-end and/or data engineering experience, delivering production-grade data solutions.
Demonstrated experience working in a global or international environment, collaborating across regions and time zones.
Significant experience working with Python.
Experience with SQL and NoSQL databases for OLTP and OLAP workloads.
Experience ingesting and processing data from external APIs.
Experience with GCP (Google Cloud Platform) and modern data warehousing (ideally BigQuery).
Familiarity with BI data architecture and version control platforms such as GitHub or GitLab.
Understanding of query and pipeline performance optimization.
Strong understanding of the business logic behind datasets, with an ability to ensure trust and data quality.
Analytical, detail-oriented, and committed to building reliable, scalable solutions.
Comfortable working with unstructured or imperfect data and transforming it into actionable insights.
A collaborative team player who thrives in a global, multicultural environment.
Strong communicator who can translate complexity into clarity for non-technical stakeholders.