Designing, building, and operating data architecture on AWS for Bring! Labs. Leading data migration efforts and collaborating with product and operations teams.
Responsibilities
Design the target data architecture, establishing modeling patterns and transformation standards
Lead the migration of existing pipelines to dbt, improving and consolidating the current solution
Define, own and document data contracts between source systems and downstream consumers
Partner with product and operations teams to translate business needs into scalable data models
Build for self-service, enabling teams across the company to access and trust the data they need
Requirements
Strong data modeling expertise, translating business requirements into scalable data structures
Experience with modern data stack tools alongside dbt, such as Databricks, Snowflake, or similar
Cloud data warehousing experience at internet scale, preferably on AWS
Data governance and security awareness, including ownership, access control, and lineage
BI tool experience at the architecture/administration level
Strong SQL skills for complex aggregations and proficiency in Python
Understanding of, experience with, and interest in emerging AI tooling and practices in software engineering
Business-fluent English; German is an advantage
Nice to have:
Java or Scala experience (our current platform uses these)
Familiarity with Data Mesh or Data Fabric concepts
Experience with applying ML concepts in data platforms
Benefits
A young and rapidly evolving company that empowers employees to make decisions and actively shape our success
A modern and attractive working environment in the heart of Berlin (and additional offices in Zurich and Basel) with free barista-grade coffee
Flexible working hours, with the option to work from the office as well as partially from home
Social events that bring the team together, including twice-yearly company-wide get-togethers and regular team events, all covered by us!
A commitment to sustainability, including traveling mostly by public transport and a Bahncard 50 for your commute
Many cool perks, such as 25 days of vacation + a day off on your birthday, the latest hardware, home office subsidies, and much more!