Senior Databricks DWH Engineer responsible for designing ETL data pipelines and collaborating with cross-functional teams to deliver scalable solutions in the banking sector.
Responsibilities
Advanced Design & Implementation: Designing and implementing robust, scalable, high-performance ETL/ELT data pipelines using PySpark/Scala and Databricks SQL on the Databricks platform.
Delta Lake: Expertise in implementing and optimizing the Medallion architecture (Bronze, Silver, Gold) using Delta Lake to ensure data quality, consistency, and historical tracking.
Lakehouse Platform: Efficient implementation of the Lakehouse architecture on Databricks, combining best practices from DWH and Data Lake environments.
Performance Optimization: Optimizing Databricks clusters, Spark operations, and Delta tables (e.g., Z-Ordering, compaction, query tuning) to reduce latency and compute costs.
Streaming: Designing and implementing real-time/near–real-time data processing solutions using Spark Structured Streaming and Delta Live Tables (DLT).
Unity Catalog: Implementing and administering Unity Catalog for centralized data governance, fine-grained access control (row- and column-level security), and data lineage.
Data Quality: Defining and implementing data quality standards and rules (e.g., using DLT or Great Expectations) to maintain data integrity.
Orchestration: Developing and managing complex workflows using Databricks Workflows (Jobs) or external tools (e.g., Azure Data Factory, Airflow) to automate pipelines.
DevOps/CI/CD: Integrating Databricks pipelines into CI/CD processes using tools such as Git, Databricks Repos, and Bundles.
Collaboration: Working closely with Data Scientists, Analysts, and Architects to understand business requirements and deliver optimal technical solutions.
Mentorship: Providing technical guidance to junior developers and promoting best practices.
Requirements
Professional Experience: 5+ years of experience in Data Engineering, including 3+ years working with Databricks and large-scale Spark.
Databricks Platform: Proven, expert-level experience with the full Databricks ecosystem (Workspace, Cluster Management, Notebooks, Databricks SQL).
Apache Spark: Deep knowledge of Spark architecture (RDD, DataFrames, Spark SQL) and advanced optimization techniques.
Delta Lake: Expertise in implementing and administering Delta Lake (ACID properties, Time Travel, Merge, Optimize, Vacuum).
SQL: Advanced/expert skills in SQL and Data Modeling (Dimensional, 3NF, Data Vault).
Cloud: Strong experience with a major Cloud platform (AWS, Azure, or GCP), particularly with storage services (S3, ADLS Gen2, GCS) and networking.
Unity Catalog: Hands-on experience with implementing and administering Unity Catalog.
Lakeflow: Experience with Delta Live Tables (DLT) and Databricks Workflows.
ML/AI Fundamentals: Understanding of basic MLOps concepts and experience with MLflow to support integration with Data Science teams.
DevOps: Experience with Terraform or equivalent tools for Infrastructure as Code (IaC).
Certifications: Databricks certifications (e.g., Databricks Certified Data Engineer Professional) are a strong advantage.
Benefits
Premium medical package
Lunch Tickets & Pluxee Card
Bookster subscription
13th salary and yearly bonuses
Enterprise job security with a startup mentality (diverse and engaging environment, international exposure, flat hierarchy), backed by the stability of an established multinational
A supportive culture (we value ownership, autonomy, and healthy work-life balance) with great colleagues, team events and activities
Flexible working program and openness to remote work
Collaborative mindset – employees shape their own benefits, tools, team events and internal practices
Diverse opportunities in Software Development with international exposure
Flexibility to choose projects aligned with your career path and technical goals
Access to leading learning platforms, courses, and certifications (Pluralsight, Udemy, Microsoft, Google Cloud)
Career growth & learning – mentorship programs, certifications, professional development opportunities, and above-market salary