GCP Data Engineer designing, building, and optimising data solutions on Google Cloud Platform. Collaborating with clients to solve complex data challenges and enhance AI capabilities.
Responsibilities
Design, build, and optimise scalable data platforms and pipelines on Google Cloud Platform
Develop robust ingestion, transformation, and orchestration pipelines using BigQuery, Cloud Storage, Dataflow, Dataproc, Composer, and related GCP services
Build high-quality data models to support analytics, reporting, and AI/ML use cases
Implement modern data architecture patterns, including lakehouse and ELT/ETL approaches
Collaborate closely with clients to translate business requirements into well-architected, actionable data solutions
Support or implement dbt, CI/CD pipelines, Git-based workflows, and data engineering best practices
Ensure strong data quality, governance, lineage, security, and performance optimisation within GCP environments
Work alongside analytics, governance, and AI consultants to deliver cohesive, end-to-end solutions
Contribute to reusable assets, accelerators, and internal frameworks that strengthen Intelligen’s GCP capability
Requirements
4–6+ years’ experience in data engineering
Strong hands-on experience with Google Cloud Platform, particularly BigQuery and cloud-native data services
Experience designing and building production-grade data pipelines on GCP
Strong SQL skills and experience building complex data transformations
Familiarity with modern data stacks (e.g. Composer, BigQuery, Databricks, Snowflake, dbt)
Experience working across the full data lifecycle: ingestion, transformation, modelling & consumption
Knowledge of DevOps concepts, version control, and CI/CD in data environments
Excellent communication, problem-solving, and collaboration skills
Benefits
We’re not just delivering AI and data projects; we’re humanising them. That means we care deeply about the how, not just the what. We value curiosity, creativity, and a willingness to challenge the status quo. We look for people who are driven to build a business, get curious, and offer opinions and ideas.
You’ll join a team that’s smart, kind, and ambitious. You’ll have real influence: in your projects, your practice, and the broader business. And as we grow, so will you.
Here’s what that looks like:
**Grow With Purpose**
Structured development pathways so you always know what you’re working toward.
Paid certifications across all major cloud and data technologies, because mastery matters.
Weekly training, guilds and knowledge exchanges where practitioners share real project insights.
**Support That Helps You Thrive**
A dedicated support lead to mentor, guide and help you navigate your career.
Regular 360° feedback loops, giving you clear direction and honest insight into your progress.
**A Community You’ll Enjoy Being Part Of**
Innovation days where you can experiment, prototype and sharpen new skills.
Team days, social catchups, many new community clubs…
Opportunities to lead, influence and shape our growing practice as we expand across the region.
If you want to be part of a team that cares just as much about *how* we deliver as *what* we deliver, where curiosity is valued, ideas matter, and your impact is felt, drop us a message and start the conversation today.