Senior Data Engineer delivering high-quality data solutions at a leading Google Cloud consultancy. Designing, building, and maintaining scalable data pipelines and infrastructure for complex data analysis.
Responsibilities
How You’ll Shape Our Success
The purpose of this role is to design, build, and maintain scalable data pipelines and infrastructure that enable the efficient processing and analysis of large, complex data sets.
What You’ll Do
Develop and maintain automated data processing pipelines using Google Cloud
Design, build, and maintain data pipelines to support data ingestion, ETL, and storage
Build and maintain automated data pipelines to monitor data quality and troubleshoot issues
Implement and maintain databases and data storage solutions
Stay up-to-date with emerging trends and technologies in big data and data engineering
Ensure data quality, accuracy, and completeness
Implement and enforce data governance policies and procedures to ensure data quality and accuracy
Collaborate with data scientists and analysts to design and optimise data models for analytical and reporting purposes
Develop and maintain data models to support analytics and reporting
Monitor and maintain data infrastructure to ensure availability and performance
Requirements
What You’ll Need to Succeed
Experience contributing to technical decision-making on in-flight projects.
A track record of working across a wide range of projects, tools, and technologies, and of solving a broad range of problems with your technical skills.
Demonstrable experience applying strong communication and stakeholder management skills when engaging with customers.
Significant experience with cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
Strong proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle.
Experience with big data technologies such as Hadoop, Spark, or Hive.
Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow.
Proficiency in Python and at least one other programming language, such as Java or Scala.
Willingness to mentor more junior members of the team.
Strong analytical and problem-solving skills with the ability to work independently and in a team environment.
Benefits
**Financial:**
Competitive base salary.
Discretionary company bonus scheme.
Employee referral scheme.
Meal vouchers.
**Health & Wellbeing:**
Healthcare package.
Life and health insurance.
Bookster
**Time Off & Flexibility:**
28 days of annual leave.
Floating bank holidays.
An extra paid day off on your birthday.
Ten paid learning days per year.
Flexible working hours.
Sabbatical leave (after 5 years).
Work from anywhere (up to 3 weeks per year).
**Development & Recognition:**
Industry-recognised training & certifications.
Bonusly: employee recognition and rewards platform.