Senior Staff Data Engineer at DeepL working on enterprise-wide data engineering standards and cloud solutions. Leading technical initiatives and mentoring engineers to support data capabilities across the organization.
Responsibilities
Define and implement enterprise-wide data engineering standards, strategies, and best practices for data solutions
Provide expert guidance on technology selection, cloud services (AWS), and architectural decisions for data solutions
Drive continuous improvements in efficiency, cost reduction, and innovation across the data organization
Evaluate and recommend tools, technologies, and frameworks to enhance our data capabilities
Partner with and influence leaders from engineering, analytics, machine learning, and security teams to align on goals
Mentor and be a thought leader across data, engineering and platform teams, fostering a culture of technical excellence
Collaborate with cross-functional stakeholders to understand data requirements and translate them into technical solutions
Work closely with customer-facing teams to ensure data solutions meet enterprise client needs
Drive best practices in data security, governance, and compliance aligned with enterprise B2B standards
Implement robust security measures for data at rest and in transit
Ensure thorough documentation of data processes, systems architecture, and stakeholder dependencies
Maintain compliance with GDPR, SOC 2, and other relevant regulatory requirements
Requirements
Extensive data expertise: 10+ years of experience in data engineering or a related role, with at least 5 years in a staff or principal role
Data architecture experience: Deep understanding of data infrastructure, data warehousing, ETL/ELT processes, and/or data pipeline orchestration
Cloud mastery: Proven experience with cloud platforms (AWS, Azure, or GCP) and cloud-native data services
Scripting & automation: Advanced scripting skills in Python, Bash, or similar languages for automation and tooling
Leadership & communication: Proven track record of technical leadership, mentoring engineers, and influencing cross-functional teams
Enterprise experience: Experience working in high-growth technology or SaaS environments with distributed systems and microservices architecture
Experience with data-specific tools and technologies such as Apache Airflow, dbt, Apache Spark, Kafka, or similar
Experience with real-time streaming data processing pipelines (Spark, Flink, etc.)
Knowledge of data warehousing solutions (Snowflake, BigQuery, Redshift) and data lake architectures
Background in data engineering or analytics engineering
AI-Native Orchestration & Advocacy: You don’t just use AI; you redefine engineering workflows through it. You possess a deep-seated belief in AI’s power to transform the software development lifecycle, data accessibility, and infrastructure management.
Nice to haves: Familiarity with machine learning operations (MLOps) and ML infrastructure
Experience with security best practices, including secrets management, network security, and compliance frameworks
Experience with agile methodologies and tools (Jira, Confluence) for managing project timelines and deliverables
Contributions to open-source projects or technical community involvement
Experience with cost optimization strategies for cloud infrastructure
Benefits
Diverse and internationally distributed team: joining our team means becoming part of a large, global community with people of more than 90 nationalities. We're more than just colleagues; we're a group of professionals with a shared mission to connect diverse cultures. Our global presence is growing–we've doubled in size nearly every year, with our employees based in the UK, Germany, the Netherlands, Poland, the US, and Japan, and we continue to expand our network.
Open communication, regular feedback: as a language-focused company, we value the importance of clear, honest communication. We value smooth collaboration, direct and actionable feedback, and believe that leading with empathy and growth mindset makes us better together.
Hybrid work, flexible hours: we offer a hybrid work schedule, with team members coming into the office twice a week. This allows you to engage directly with your team and experience the unique energy of our workspace, while still enjoying the flexibility and comfort of working from home. We offer flexible working hours and trust in your productivity, aligning with your team’s general locations and time zones to foster effective, seamless collaboration.
Virtual Shares: an ownership mindset in every role. We believe everyone should share in our success, and that’s why every employee receives Virtual Shares, linking your contribution directly to DeepL’s growth and rewarding you with a stake in our future.
Regular in-person team events: we bond over vibrant events that are as unique as our team, from local team and business unit gatherings, to new-joiner onboardings, to company-wide events that bring us all together–literally.
Monthly full-day hacking sessions: every month, we have Hack Fridays, where you can spend your time diving into a project you're passionate about and get the opportunity to work with other teams–we value your initiatives, impact, and creativity.
30 days of annual leave: we value your peace of mind. With 30 days off (excluding public holidays) and access to mental health resources, we make sure you're as strong mentally as you are professionally.
Competitive benefits: just as our team spans the globe, so does our benefits package. We've crafted it to reflect the diversity of our team and tailored it to align with your unique location, to ensure you feel supported every step of the way.
Senior Manager leading a team of database engineers to manage CCC's data platform. Overseeing mission-critical applications and collaborating with cross-functional teams in a hybrid environment.
As a Principal Data Architect at Solstice, lead the design and implementation of data architecture solutions. Ensure data integrity, security, and accessibility to meet strategic organizational goals.
Data Platform Specialist overseeing data workflows and enhancing data quality for Stackgini's AI-driven IT solutions. Collaborating with teams to drive improvements and stakeholder support.
Data Engineer designing data pipelines in Python for a major railway industry client. Collaborating with Data Scientists and ensuring code quality with agile methodologies.
Senior Data Engineer responsible for building and optimizing data pipelines for banking analytics initiatives. Collaborating with data teams to ensure data quality and readiness for enterprise use.
Senior Data Engineer developing scalable data solutions on Databricks for analytics and operational workloads. Collaborating with cross-functional teams to modernize the data ecosystem.
Data Engineer focused on analytics and data pipeline development for network optimisation. Collaborating with teams to deliver high-quality data solutions with Python and SQL.
Senior Product Manager defining platform capabilities for Data Cloud in Salesforce. Collaborating with R&D teams while shaping product strategy for Data 360 integration.
Senior Data Engineer at Goodwin enhancing data platforms and fostering data-driven culture across teams. Collaborating with IT and Finance on technology solutions and data governance practices.
Director, Data Platform Design and Strategy at MedImpact leading data platform and AI innovations to enhance healthcare services. Overseeing enterprise projects and managing teams to meet strategic goals.