Data Engineer developing and maintaining scalable data pipelines for Kognia Sports Intelligence. Collaborating with engineers and data scientists to support data-driven decision making.
Responsibilities
Design, build and support modern and scalable data pipelines using data processing frameworks, technologies, and platforms
Use best practices around CI/CD, automation, testing, and monitoring of analytics pipelines
Collaborate with software engineers, researchers, data scientists, and stakeholders to understand what data is required and how best to make it available in our platform
Improve our cloud architecture and design new architectures as the need arises
Identify and address opportunities for improvement in areas such as delivery speed and infrastructure cost
Investigate new technologies and approaches as needed
Take part in an on-call rotation with your team members
Requirements
Minimum 1 year in a similar position or 3 years in other engineering roles with relevant responsibilities
Fluent with one or more high-level programming languages (Python preferred, but also Ruby, Java, Scala, Go, or similar)
Willing to work mostly in Python, with the possibility of other stacks as the team decides on a service-by-service basis
Experience working with SaaS production architectures in GCP (preferred) or AWS
Ability to adapt to a fast-paced, changing agile environment
Interest (if not experience) in DevOps technologies such as Kubernetes
Excellent team player with strong verbal and written communication skills
Comfortable working in English - we’re an international team based in Barcelona, with English as our shared language.
Experience providing data and infrastructure for building and deploying ML models to production (preferred)
Experience working in multi-functional teams with end-to-end responsibility for product development and delivery within your mission (preferred)
Front-end experience in React (preferred)
Interested in being the glue between engineering and research (preferred)
Experience in data quality and governance (preferred)
Specific knowledge of GitLab CI/CD (preferred)
Knowledge of containerization, GitOps, and Linux (preferred)
Kubernetes experience in particular is a big plus (preferred)
Data Engineer II leading development and delivery of data pipelines for Syneos Health. Collaborating with teams to optimize data processing and integrate solutions into production environments.
Lead Data Engineer overseeing data operations and analytics engineering teams for OneOncology. Focused on operational excellence in data platform and model reliability for cancer care improvement.
Senior AWS Software Data Engineer at Boeing focusing on AWS Data services to support digital analytics capabilities. Collaborating with cross-functional teams to design, develop, and maintain software data solutions.
Senior Data Engineer designing and improving software for business capabilities at Barclays. Collaborating with teams to build a data and intelligence platform for Equity Derivatives.
Senior AI & Data Engineer developing and implementing AI solutions in collaboration with clients and teams. Working on projects involving generative AI, predictive analytics, and data mastery.
Consultant driving AI business growth in Deloitte's Artificial Intelligence & Data team. Delivering innovative solutions using data analytics and automation technologies.
Data Engineer responsible for managing data architecture and pipelines at Snappi, a neobank. Collaborating with teams to enable data processing and analysis in innovative banking solutions.
Data Engineer at Destinus developing the data platform to support production and analytics needs. Involves migrating Excel sources to Lakehouse and integrating ERP systems in a hybrid role.
Senior Data Engineer developing solutions within the Global Specialty portfolio at an insurance company. Engaging with diverse business partners to ensure high quality data reporting.
Data Engineer at UBDS Group focusing on designing and optimizing modern data platforms. Collaborating in a multidisciplinary team to develop reliable data assets for analytics and operational use cases.