Data Engineer at GFT designing, building, and optimizing data pipelines and architectures. Collaborating with cross-functional teams to ensure a robust data infrastructure that supports business goals.
Responsibilities
Design, develop, and maintain scalable data pipelines using Databricks
Implement and manage ETL processes to ensure high data quality and reliability
Optimize data storage solutions for performance and cost-efficiency
Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver actionable insights
Contribute to the development of data strategies that align with the company's business objectives, especially within the Microsoft Fabric domain
Ensure data security, privacy, and governance across all data platforms
Implement best practices for data management, including documentation, testing, and version control
Stay current with the latest trends and technologies in data engineering, particularly within Databricks and the Microsoft Fabric ecosystem
Requirements
5+ years of experience in data engineering or a related role
Extensive experience with Databricks, including Delta Live Tables and Databricks Asset Bundles
Proven experience with Microsoft Fabric and a strong understanding of industry-specific data challenges and opportunities
Hands-on experience with cloud platforms such as AWS, Azure, or GCP
Proficient in programming languages such as Python and SQL
Strong SQL skills and experience with relational databases (e.g., MySQL, PostgreSQL)
Proficient with data visualization tools (e.g., Power BI, Tableau)
Proven experience in dimensional modelling, including basic star schema concepts
Excellent problem-solving and analytical skills
Strong communication skills, with the ability to translate complex technical concepts to non-technical stakeholders
Ability to work independently and as part of a team in a fast-paced environment
Informatica PowerCenter experience is a nice-to-have!
Benefits
Comprehensive medical, dental, vision, and other benefits
Professional Development Training with Individual Development Plans to map out your career growth
Opportunity to work in a global environment with diverse teams built with colleagues from around the world
Opportunity to work with technology leaders in the financial services industry
Opportunity to work for big name clients in capital markets, banking and other industries
Data Engineer/Analyst maintaining and improving data infrastructure for Braiins. Collaborating with technical and business teams to ensure reliable data flows and insights.
Medior Data Engineer handling Azure migrations for a major urban mobility client. Focused on data pipeline development and ensuring platform reliability with cutting-edge technologies.
Developing ML and computer vision solutions for a cutting-edge autonomous vehicle dataset pipeline at Mobileye. Collaborating across teams on data curation and advanced perception algorithms.
Data Migration Lead in a hybrid role managing data migration for a major transformation programme in the media sector. Collaborating with various teams to ensure data integrity and successful migration.
Consultant ML & DataOps at Smile integrating data science projects for major clients. Designing MLOps solutions and enhancing data governance in a collaborative environment.
Data Engineer developing and maintaining data pipelines for Coolbet’s analytical services. Working within an Agile framework to ensure data reliability and efficiency.
API Data Engineer developing innovative data-driven solutions and advancing data architecture for AI Control Tower. Building and integrating APIs and data pipelines to support organizational needs.
Journeyman Data Architect supporting Leidos' enterprise data and analytics program for the Department of War. Collaborating on solutions for data architecture, cloud environments, and governance.
Senior Software Engineer developing backend services and data infrastructure for integrated products at Booz Allen. Collaborating with a small elite team to deliver reliable and scalable services.
AWS Streaming Data Engineer developing software and systems in a fast, agile environment. Utilizing experience with real-time data ingestion and processing systems across distributed environments.