Data Engineering Lead designing and optimizing scalable data pipelines for a property insurance company. Collaborating across teams, mentoring engineers, and driving data architecture initiatives.
Responsibilities
Build and optimize distributed data processing jobs using Apache Spark on Databricks.
Implement Delta Lake, DLT pipelines, dbt transformations, and the Medallion architecture for scalable, reliable data workflows.
Design and automate ETL pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
Integrate data from diverse sources, including Duck Creek, Intacct, Workday, and external APIs.
Develop dimensional models (Star/Snowflake schemas), stored procedures, and views for data warehouses.
Ensure efficient querying and transformation using SQL, T-SQL, and PySpark.
Leverage Azure DevOps, CI/CD pipelines, and GitHub for version control and deployment.
Utilize Azure Logic Apps and MLflow for workflow automation and model training.
Implement role-based access control (RBAC), data encryption, and auditing mechanisms.
Work closely with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
Mentor junior engineers and contribute to code reviews and architectural decisions.
Requirements
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering with at least 2 years on Databricks.
Proficiency in Python, Scala, SQL, and Spark.
Hands-on experience with Azure Data Services (ADF, ADLS, Synapse).
Strong understanding of ETL, data warehousing, and data modeling concepts.
Experience with MicroStrategy and Power BI, including DAX and advanced visualizations.
Familiarity with MLflow, LangChain, and LLM integration is a plus.
Knowledge of Duck Creek is a plus.
Insurance domain knowledge preferred.
Preferred Certifications
Databricks Data Engineering Professional
Azure/AWS Data Engineering Certifications
Benefits
Opportunities to stretch and grow: your professional and personal development matters to us.
Clarity and kindness: you can rely on us to be open, honest, and supportive, offering clarity on what success looks like.
Support in good times and bad: we believe in showing up for each other consistently, not only when it’s easy.
A community that cares: we are committed to sustaining a community in which each person feels cared for as an individual.
Principal Data Engineer designing and developing innovative data analytical solutions for the gaming industry. Leading and mentoring while engaging with clients to fulfill their data engineering needs.
Specialist, Data Engineering at CoverMyMeds enhancing and expanding data platforms for commercial data products. Collaborating with multiple teams to design scalable data solutions from various sources.
Team Lead in Data Engineering at Avanquest mentoring data engineering team and ensuring efficient data management across platforms. Collaborating with departments to align solutions and optimize workflows.
Data Architect at RSM leading AI-driven data migration initiatives within the Salesforce ecosystem. Implementing data governance and optimizing performance across complex datasets.
Senior Data Engineer at Capgemini designing and optimizing scalable data architectures on Databricks and GCP. Collaborating across teams to transform business needs into reliable technical solutions.
Data Engineer transforming legacy on-premises systems to cloud-native architectures for advanced data analytics. Collaborating with teams to build efficient data solutions using Python and AWS.
Data Engineering Academy focused on Snowflake and Databricks for professionals interested in expanding their technical capabilities. Fully remote with future office work in Monterrey or Saltillo after completion.
Senior Data Engineer at Intent HQ designing and scaling data platforms. Building high-impact intelligence from millions of customer insights with a focus on performance and reliability.
SAP Data Engineer supporting MERKUR GROUP's evolution into a data-driven company. Responsible for data integration, modeling, and collaboration with various departments in Group Finance.
Data Engineer at Booz Allen Hamilton organizing data and developing advanced technology solutions. Leading data engineering activities for mission-driven projects and mentoring multidisciplinary teams.