Senior Data Engineering Consultant at Gradion, transforming data infrastructure for global clients. Modernizing legacy systems and operationalizing AI/ML solutions alongside cloud and AI consultants.
Responsibilities
Advise senior client stakeholders on modern data architecture, cloud migration strategies, and the secure, compliant use of data for business value.
Define clear roadmaps for clients to transition from legacy data warehouses to scalable cloud-native data platforms (e.g., Data Lakes, Lakehouses).
Design data pipelines and structures that enable clients to monetize data assets and derive actionable insights.
Conduct data maturity assessments and define target-state architectures and roadmaps.
Communicate complex data and AI topics in clear business language to executives and stakeholders.
Lead the design and implementation of robust, scalable, and cost-efficient data infrastructure (data lakehouse, data mesh, or centralized warehouse) on major hyperscaler platforms (AWS, Azure, GCP).
Develop and optimize high-throughput data pipelines using modern ELT/ETL tools (e.g., Spark, Flink, Kafka) to handle large data volumes and integrate disparate data sources.
Build, deploy, and manage production-ready ML (MLOps) pipelines, including feature stores, model registries, training workflows, serving, and monitoring (e.g., MLflow, Vertex AI, SageMaker, Azure ML).
Explore and implement the data engineering infrastructure required to develop and deploy Small Language Models and other applied AI solutions within client environments for specific use cases.
Establish data governance, lineage, and compliance controls to ensure trustworthy AI and regulatory readiness.
Contribute to Gradion’s internal frameworks for data platform modernization and AI readiness.
Define and implement data governance frameworks aligned with ISO, FINMA, GDPR, or MedTech requirements.
Embed data security, masking, and access control into pipelines and platform layers.
Help clients design policy-as-code and automated compliance guardrails for data and AI systems.
Conduct technical and architectural assessments (data platform Health Checks) to identify bottlenecks, security gaps, and cost inefficiencies.
Requirements
7+ years of experience in data engineering, data architecture, or ML platform engineering roles.
Strong background in ETL/ELT, data lake/warehouse architecture, and distributed data processing.
Hands-on experience with one or more cloud data ecosystems (AWS, Azure, GCP, Snowflake, Databricks, BigQuery, Synapse, etc.).
Proficiency in Python and SQL; experience with modern frameworks such as Spark, Airflow, dbt, Kafka.
Familiarity with containerization and orchestration (Docker, Kubernetes).
Experience designing and implementing MLOps principles and tooling (e.g., Kubeflow, MLflow, SageMaker, Azure ML), or integrating data pipelines with AI/ML workloads.
Understanding of data security, compliance, and governance frameworks (ISO, GDPR, SOC2).
Consulting mindset, able to translate technical depth into client value and communicate clearly with business stakeholders.
Excellent communication, presentation, and stakeholder management skills; comfortable working with both technical teams and C-level executives (English proficiency required, German a plus).
Desired
Experience with data monetization, data products, or real-time analytics.
Familiarity with LLM/SLM architectures, vector databases, or retrieval-augmented generation (RAG) patterns.
Experience with Databricks, Snowflake, or other modern data warehousing/lakehouse platforms.
Familiarity with distributed processing frameworks (e.g., Spark).
A Master's degree in Computer Science, Data Science, or a related quantitative field.
Experience in regulated industries (finance, healthcare, MedTech) or with cross-border data environments (EU/US/APAC).
Certifications such as AWS Data Analytics, Azure Data Engineer, or GCP Professional Data Engineer are a plus.
Benefits
A laptop is provided
Community Tech activities
A fun & dynamic environment with freedom to be creative