Data Engineer designing and maintaining scalable data solutions in GCP and Snowflake environments. Collaborating with clients and stakeholders to ensure data quality and functionality.
Responsibilities
Design, develop, test, and maintain data pipelines and ETL/ELT processes using GCP and Snowflake.
Implement data ingestion, transformation, and storage solutions for structured, semi-structured, and unstructured data.
Build and optimize batch, micro-batch, and real-time data pipelines.
Support data migration from legacy systems to cloud platforms (GCP, Snowflake).
Collaborate with business and technical stakeholders to translate requirements into scalable data solutions.
Work with GCP services such as BigQuery, Cloud SQL, Cloud Spanner, and Cloud Bigtable.
Integrate data from various sources and support data platform development.
Ensure data quality by implementing validation rules, testing frameworks, and monitoring solutions.
Work closely with security teams to ensure data protection, access control, and compliance.
Support development of data models and schemas for analytics and reporting.
Contribute to CI/CD processes, version control, and infrastructure automation (e.g., Git, Terraform).
Collaborate with data scientists, analysts, and engineers to support data-driven use cases.
Requirements
5+ years of experience in Data Engineering or a similar role.
2+ years of experience working with GCP or similar cloud platforms (AWS, Azure).
Hands-on experience with GCP managed data services (e.g., BigQuery, Cloud SQL, Cloud Spanner, Cloud Bigtable).
Experience working with Snowflake.
Strong knowledge of SQL and experience with data transformation tools (e.g., DBT or similar).
Proficiency in Python for data processing and scripting.
Experience with ETL/ELT processes and data pipeline development.
Experience working with structured, semi-structured, and unstructured data.
Familiarity with data orchestration tools (e.g., Airflow, Dagster, or similar).
Experience with version control (Git) and CI/CD practices.
Experience with Infrastructure as Code (e.g., Terraform, Ansible, or similar).
Strong analytical, problem-solving, and troubleshooting skills.
Excellent written and verbal communication skills in English.
Experience in a client-facing or consulting environment is a plus.
Benefits
Work in a supportive team passionate about AI & Big Data.
Engage with top-tier global enterprises and cutting-edge startups on international projects.
Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training and conferences, including a Databricks partnership that provides industry-leading training materials and certifications.
Choose your preferred form of cooperation: B2B or a contract of mandate, and make use of 20 fully paid days off.
Participate in team-building events and utilize the integration budget.
Celebrate work anniversaries, birthdays, and milestones.
Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
Get full work equipment for optimal productivity, including a laptop and other necessary devices.
Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.