Lead Data Engineer modernizing design standards for imagery models. Collaborate with stakeholders to build and optimize data pipelines and oversee integration efforts.
Responsibilities
Lead and partner with Data Science, DAVS, Architecture, and other business stakeholder teams to prioritize use cases, gather requirements, and implement solutions and pipelines for imagery models.
Contribute to the Data Products Strategy for Imagery Model Outputs.
Design and own the end-to-end data and ML pipeline architecture for imagery models (ingestion, preprocessing, feature engineering, training, evaluation, deployment).
Integrate models into production systems and APIs with appropriate monitoring and alerting.
Incorporate assurance processes into data solutions.
Guide team members as they build complex data solutions, correct problems, apply transformations, and recommend data cleansing/quality solutions.
Design complex data solutions, including incorporating new data sources and ensuring designs are consistent across projects and aligned to data strategies.
Define and build frameworks for data solutions that can be applied to multiple projects.
Perform analysis of complex sources to determine value and utilize your subject matter expertise to recommend data to include in analytical processes.
Incorporate core data management competencies including data governance, data security and data quality.
Collaborate and build consensus with leadership and diverse groups of stakeholders in defining, estimating, prioritizing and planning of projects.
Perform data and system analysis, assessment and resolution for defects and incidents of high complexity and correct as appropriate.
Define standards and frameworks for testing on data movement and transformation code and data components.
Perform other duties as assigned.
Requirements
Bachelor’s Degree in a STEM-related field or equivalent.
15 years of related experience.
2+ years of experience in imagery pipeline development and reusable frameworks.
Experience integrating imagery workflows into production systems and APIs.
3+ years of experience with AWS Cloud and Python.
2+ years of experience with Databricks.
Expert knowledge of tools, techniques, and data manipulation, including cloud platforms, programming languages, and modern software engineering practices.
Excellent delivery skills with the ability to examine and assess the effectiveness of software design strategies and methodologies, and to devise, apply, and share ways to ensure the quality of complex computer systems.
Demonstrated track record of domain expertise, including the ability to improve company-level capabilities within the domain, consult on business priorities, and optimize value by identifying business-aligned solutions.
Strong problem-solving skills with the ability to create architectures that are robust against single points of failure.
Excellent communication skills with the ability to describe technology concepts in ways the business can understand, document effectively, and collaborate across disparate groups.
Strong leadership skills with the ability to engage with other leaders and networks to solve problems as well as work to improve the entire engineering organization.
Benefits
Health Insurance: Employees and their eligible family members – including spouses, domestic partners, and children – are eligible for coverage from the first day of employment.
Retirement: Travelers matches your 401(k) contributions dollar-for-dollar up to your first 5% of eligible pay, subject to an annual maximum. If you have student loan debt, you can enroll in the Paying it Forward Savings Program. When you make a payment toward your student loan, Travelers will make an annual contribution into your 401(k) account. You are also eligible for a Pension Plan that is 100% funded by Travelers.
Paid Time Off: Start your career at Travelers with a minimum of 20 days Paid Time Off annually, plus nine paid company Holidays.
Wellness Program: The Travelers wellness program comprises tools, discounts and resources that empower you to achieve your wellness goals and caregiving needs. In addition, our mental health program provides access to free professional counseling services, health coaching and other resources to support your daily life needs.
Volunteer Encouragement: We have a deep commitment to the communities we serve and encourage our employees to get involved. Travelers has a Matching Gift and Volunteer Rewards program that enables you to give back to the charity of your choice.