Data Engineer specializing in digital transformation at Bounteous. Collaborating on data solutions and migration projects with a focus on data integrity and quality.
Responsibilities
Pipeline Migration
Logic & Scheduling: Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment.
Data Transfer: Executing the physical migration of underlying datasets while ensuring data integrity.
Stakeholder Engagement: Acting as a technical liaison to internal clients, facilitating "handoff and sign-off" conversations with data owners to ensure migrated assets meet business requirements.
Consumption Pattern Migration
Code Conversion: Translating and optimizing legacy SQL and Spark-based consumption patterns (raw and modeled) for compatibility with Snowflake and Iceberg.
Usage Analysis: Analyzing usage patterns to deliver the required data products.
Data Reconciliation & Quality: A rigorous approach to data validation is required. Candidates will use reconciliation frameworks to build confidence that migrated data is functionally equivalent to the data already used in production flows.
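A reconciliation framework of the kind described above typically compares the legacy and migrated datasets on row counts and key membership before deeper column-level checks. A minimal sketch of such a check (hypothetical helper, not part of this posting's actual tooling):

```python
from collections import Counter

def reconcile(legacy_rows, migrated_rows, key):
    """Compare two datasets; return a dict of mismatches (empty means they agree)."""
    issues = {}
    # Cheapest check first: total row counts.
    if len(legacy_rows) != len(migrated_rows):
        issues["row_count"] = (len(legacy_rows), len(migrated_rows))
    # Key-level check: which keys are missing from, or extra in, the migration.
    legacy_keys = Counter(r[key] for r in legacy_rows)
    migrated_keys = Counter(r[key] for r in migrated_rows)
    missing = legacy_keys - migrated_keys
    extra = migrated_keys - legacy_keys
    if missing:
        issues["missing_keys"] = sorted(missing)
    if extra:
        issues["extra_keys"] = sorted(extra)
    return issues

legacy = [{"id": 1, "amt": 10}, {"id": 2, "amt": 5}]
migrated = [{"id": 1, "amt": 10}]
print(reconcile(legacy, migrated, "id"))  # → {'row_count': (2, 1), 'missing_keys': [2]}
```

In practice these checks would run as SQL aggregates pushed down to the platforms involved (e.g., Snowflake), but the structure — counts, then key membership, then column checksums — is the same.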
Requirements
Education: Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field.
Experience: Minimum of 5 years of professional "hands-on-keyboard" coding experience in a collaborative, team-based environment. Ability to troubleshoot SQL, plus basic scripting experience.
Languages: Professional proficiency in Python or Java.
Methodology: Deep familiarity with the full Software Development Life Cycle (SDLC), CI/CD best practices, and Kubernetes (K8s) deployment experience.
Core Data Engineering Competencies: Candidates must demonstrate a sophisticated understanding of the following modeling concepts to ensure data correctness during reconciliation:
Temporal Data Modeling: Managing state changes over time (e.g., SCD Type 2).
Schema Management: Expertise in schema evolution (e.g., as implemented in Apache Iceberg) and enforcement strategies.
Performance Optimization: Advanced knowledge of data partitioning and clustering.
Architectural Theory: Balancing Normalization vs. Denormalization and the strategic use of Natural vs. Surrogate Keys.
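Of the modeling concepts above, SCD Type 2 is the one most directly exercised during reconciliation, since history must survive the migration intact. A minimal sketch of Type 2 handling, using a hypothetical in-memory dimension table (illustrative only; field names like `valid_from`/`is_current` are assumptions, not the posting's schema):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional open-ended validity marker

def apply_scd2_change(rows, natural_key, new_attrs, as_of):
    """Close the current version of a dimension row and append a new one."""
    for row in rows:
        if row["key"] == natural_key and row["is_current"]:
            if row["attrs"] == new_attrs:
                return rows  # no actual change; history stays as-is
            row["is_current"] = False
            row["valid_to"] = as_of  # close out the old version
    rows.append({
        "key": natural_key,
        "attrs": new_attrs,
        "valid_from": as_of,
        "valid_to": HIGH_DATE,
        "is_current": True,
    })
    return rows

dim = [{"key": 1, "attrs": {"city": "Athens"},
        "valid_from": date(2020, 1, 1), "valid_to": HIGH_DATE,
        "is_current": True}]
dim = apply_scd2_change(dim, 1, {"city": "Geneva"}, date(2024, 6, 1))
# dim now holds two versions: the closed Athens row and the current Geneva row.
```

On the target platforms named below this would typically be expressed as a `MERGE` statement rather than row-by-row Python, but the state transitions being validated are the same.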
Technical Stack Requirements:
Extraction & Logic: Kafka, ANSI SQL, FTP, Apache Spark.
Data Formats: JSON, Avro, Parquet.
Platforms: Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ.
Candidates will also work with our internal data management platform; an aptitude for learning new workflows and language constructs is essential.
Senior Data Engineer designing and improving software for business capabilities at Barclays. Collaborating with teams to build a data and intelligence platform for Equity Derivatives.
Senior AI & Data Engineer developing and implementing AI solutions in collaboration with clients and teams. Working on projects involving generative AI, predictive analytics, and data mastery.
Consultant driving AI business growth in Deloitte's Artificial Intelligence & Data team. Delivering innovative solutions using data analytics and automation technologies.
Data Engineer responsible for managing data architecture and pipelines at Snappi, a neobank. Collaborating with teams to enable data processing and analysis in innovative banking solutions.
Data Engineer at Destinus developing the data platform to support production and analytics needs. Involves migrating Excel sources to Lakehouse and integrating ERP systems in a hybrid role.
Senior Data Engineer developing solutions within the Global Specialty portfolio at an insurance company. Engaging with diverse business partners to ensure high quality data reporting.
Data Engineer at UBDS Group focusing on designing and optimizing modern data platforms. Collaborating in a multidisciplinary team to develop reliable data assets for analytics and operational use cases.
Data Engineer (dbt) at SDG Group involved in all phases of data projects. Collaborate on data ingestion, transformation, and visualization in a hybrid environment.
Data Consultant at SDG Group specializing in Data & Analytics projects. Collaborate on technical-functional definitions, ETL, data modeling, and visualization for cloud solutions.
Senior Data Engineer responsible for growing customer-defined targeting calculations and developing key/value databases for real-time data processing.