Data Engineering Developer at Intact working with cloud technologies and data pipelines. Contributing to data engineering practices and collaborating with cross-functional teams.
Responsibilities
Participate in the analysis, design, development, and delivery of efficient, effective, and secure data pipelines.
Have the opportunity to participate in the implementation of industry best practices.
Be an agent of change who will grow our data engineering practice to make our lab an industry benchmark.
Requirements
Bachelor’s degree in Computer Science, Software Engineering, Mathematics or any equivalent combination of training and experience.
3+ years of experience in Software Engineering / Data Engineering / ETL development / Data Warehousing.
Experience in Big Data and Cloud technologies for processing large volumes of data.
Understanding of data management, security and data classification.
Understanding of machine learning and artificial intelligence.
Strong business acumen and prioritization skills.
Strong proficiency in the Python programming language and the PySpark Framework.
For candidates located in Quebec, bilingualism is required, given the need to interact regularly with English-speaking colleagues across the country.
No Canadian work experience is required; however, candidates must be eligible to work in Canada.
Benefits
A financial rewards program that recognizes your success
An industry-leading Employee Share Purchase Plan; we match 50% of net shares purchased
An extensive flex pension and benefits package, with access to virtual healthcare
Flexible work arrangements
Possibility to purchase up to 5 extra days off per year
An annual wellness account that promotes an active and healthy lifestyle
Access to tools and resources to support physical and mental health, embracing change and connecting with colleagues
A dynamic workplace learning ecosystem complete with learning journeys, interactive online content, and inspiring programs
Inclusive employee-led networks to educate, inspire, amplify voices, build relationships and provide development opportunities
Inspiring leaders and colleagues who will lift you up and help you grow
A Community Impact program, because what you care about is a part of what makes you different.
Technical Lead for data engineering and reporting in healthcare technology at Dedalus. Shaping innovative software solutions and leading cross-functional technical teams in Australia.
Senior ML Data Engineer working on data pipeline curation for Mobileye's autonomous vehicle dataset. Collaborating across teams to enhance ML engineering and vision model applications.
Data Engineer managing customer datasets to enhance industrial research and development. Responsible for ETL pipelines and data ingestion for the Uncountable Web Platform.
Data Engineer designing and maintaining scalable data solutions on Databricks for clinical trials. Collaborating with teams to overcome data challenges and ensure the smooth logistics of clinical supplies.
Senior Manager leading a team of database engineers to manage CCC's data platform. Overseeing mission-critical applications and collaborating with cross-functional teams in a hybrid environment.
Principal Data Architect at Solstice leading the design and implementation of data architecture solutions. Ensuring data integrity, security, and accessibility to meet strategic organizational goals.
Data Platform Specialist overseeing data workflows and enhancing data quality for Stackgini's AI-driven IT solutions. Collaborating with teams to drive improvements and stakeholder support.
Data Engineer designing data pipelines in Python for a major railway industry client. Collaborating with Data Scientists and ensuring code quality through agile methodologies.
Senior Data Engineer responsible for building and optimizing data pipelines for banking analytics initiatives. Collaborating with data teams to ensure data quality and readiness for enterprise use.
Senior Data Engineer developing scalable data solutions on Databricks for analytics and operational workloads. Collaborating with cross-functional teams to modernize the data ecosystem.