GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Responsibilities
Work with business teams (initially finance and actuarial), data scientists, and engineers to design, build, optimise, and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP/BigQuery
Work with finance, actuaries, data scientists, and engineers to understand how we can make the best use of new internal and external data sources
Work with our delivery partners at EY/IBM to ensure robust design and engineering of the data model, MI, and reporting that can support our ambitions for growth and scale
Take business-as-usual (BAU) ownership of data models, reporting, and integrations/pipelines
Create frameworks, infrastructure and systems to manage and govern Ki’s data asset
Produce detailed documentation to enable ongoing BAU support and maintenance of data structures, schemas, and reporting
Work with the broader Engineering community to develop our data and MLOps capability infrastructure
Ensure data quality, governance, and compliance with internal and external standards.
Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy.
Requirements
Experience designing data models and developing industrialised data pipelines
Strong knowledge of database and data lake systems
Hands-on experience with BigQuery, dbt, and Google Cloud Storage
Proficient in Python, SQL and Terraform
Knowledge of Cloud SQL, Airbyte, Dagster
Comfortable with shell scripting with Bash or similar
Experience provisioning new infrastructure with a leading cloud provider, preferably GCP
Proficient with Tableau Cloud for data visualization and reporting
Experience creating DataOps pipelines
Comfortable working in an Agile environment, actively participating in approaches such as Scrum or Kanban
Desirable Skills
Experience with streaming data systems and frameworks would be a plus
Experience working in a regulated industry, especially financial services, would be a plus
Senior Manager leading a team of database engineers to manage CCC's data platform. Overseeing mission-critical applications and collaborating with cross-functional teams in a hybrid environment.
As a Principal Data Architect at Solstice, lead the design and implementation of data architecture solutions. Ensure data integrity, security, and accessibility to meet strategic organizational goals.
Data Platform Specialist overseeing data workflows and enhancing data quality for Stackgini's AI-driven IT solutions. Collaborating with teams to drive improvements and stakeholder support.
Data Engineer designing data pipelines in Python for a major railway industry client. Collaborate with Data Scientists and ensure code quality with agile methodologies.
Senior Data Engineer responsible for building and optimizing data pipelines for banking analytics initiatives. Collaborating with data teams to ensure data quality and readiness for enterprise use.
Senior Data Engineer developing scalable data solutions on Databricks for analytics and operational workloads. Collaborating with cross-functional teams to modernize the data ecosystem.
Data Engineer focused on analytics and data pipeline development for network optimisation. Collaborating with teams to deliver high-quality data solutions with Python and SQL.
Senior Product Manager defining platform capabilities for Data Cloud in Salesforce. Collaborating with R&D teams while shaping product strategy for Data 360 integration.
Senior Data Engineer at Goodwin enhancing data platforms and fostering a data-driven culture across teams. Collaborating with IT and Finance on technology solutions and data governance practices.
Director, Data Platform Design and Strategy at MedImpact leading data platform and AI innovations to enhance healthcare services. Overseeing enterprise projects and managing teams to meet strategic goals.