Big Data Engineer building scalable data ingestion platforms at Allegro, working on advanced data science and AI applications in a dynamic, collaborative environment.
Responsibilities
Build a highly scalable, fault-tolerant data ingestion platform for millions of Allegro customers
Process 5 billion clickstream events every day from all Allegro sites and mobile applications
Engage in projects based on practical applications of data science and AI
Collaborate with experienced engineers organized into specialized teams
Requirements
Proficiency in programming languages such as Scala, Java, or Python
Strong understanding of distributed systems, data storage, and processing frameworks such as dbt, Spark, or Apache Beam
Knowledge of GCP (especially Dataflow and Composer) or other public cloud environments like Azure or AWS
Familiarity with good practices such as clean code, code review, TDD, and CI/CD
Ability to navigate efficiently within Unix/Linux systems
Positive attitude and teamwork skills
Eagerness for personal development and keeping knowledge up to date
English at B2 level
Benefits
Flexible working hours in an office-first model
Annual bonus based on your performance assessment and the company's results
Well-located offices with fully equipped kitchens and bicycle parking facilities
Excellent working tools including height-adjustable desks and interactive conference rooms
A wide selection of varied benefits in a cafeteria plan
Paid English classes related to the specific nature of your job
MacBook Pro / Air or Dell with Windows, depending on preference
High degree of autonomy in terms of organizing your team’s work
Team tourism, training budget, and an internal educational platform
Data Architect designing and implementing data architectures supporting analytics and ML for federal clients. Collaborating with teams to translate mission needs into robust data solutions.
IT Data Engineer developing data pipelines and integrations for Scanfil Group's global IT organization. Collaborating across teams to enhance data solutions and reporting capabilities.
Data Engineer developing Azure data solutions at PwC New Zealand. Responsibilities include data quality monitoring, pipeline development, and collaboration with stakeholders in a supportive environment.
Senior Data Engineer designing and implementing the Enterprise Data Platform at Stellix. Focusing on analytics and insights with a growth path to Principal Data Engineer or Data Architect.
R&D Data Engineer at DXC, transforming complex data into digital assets for global analytics and Smart Lab solutions. Collaborating on ELN and LIMS tools for enhanced data management.
Data Engineer role focusing on data pipelines and processing at 42dot, a mobility AI company. Responsibilities include data collection, schema management, and pipeline monitoring.
Senior Data Engineer at mobility AI company designing large-scale data processing pipelines. Leading technical decisions and mentoring junior engineers in data architecture.
Senior Data Engineer at Booz Allen building advanced tech solutions for mission-driven projects. Leveraging data engineering pipelines and platforms for impactful data insights.
Senior Software Engineer contributing to Workday's AI/MLOps cloud ops platform. Involves data ingestion, computation, and generation of curated data sets with modern technologies.