Data Engineer crafting data ingestion and transformation processes for an AI healthcare platform. Collaborating with teams to turn complex healthcare data into actionable insights.
Responsibilities
Own data ingestion, transformation, and curation across Bronze, Silver, and Gold layers of our Databricks-based Medallion Architecture.
Manage and optimize data pipelines using Airflow for orchestration and Airbyte (or similar tools) for multi-source ingestion.
Build and maintain connectors and workflows for APIs, EHR/EMR systems (FHIR), resident life systems, and IoT/monitoring data sources.
Implement batch and streaming pipelines supporting both analytics and near real-time use cases.
Develop and monitor data quality, validation, and profiling frameworks across ingestion points.
Support AI enablement efforts: preparing data for LLM-based insights, population health analytics, and predictive modeling use cases (e.g., fall risk, medication adherence, staffing optimization).
Collaborate closely with the data science team to deliver curated datasets and semantic layers for Superset and AI query interfaces.
Partner with our infrastructure team to maintain infrastructure as code (Terraform) for data services, ensuring scalability and reproducibility.
Partner with security and compliance officers to move towards HIPAA and SOC 2 alignment for all data storage and processing.
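To give a flavor of the Bronze-to-Silver curation and data-quality work described above, here is a minimal sketch in plain Python. In the actual stack this step would be a Spark job writing Delta tables on Databricks; the dict records and field names below (`resident_id`, `observed_at`) are hypothetical stand-ins for illustration only.

```python
# Sketch of a bronze -> silver promotion step (illustrative only).
# A production pipeline would use Spark/Delta Lake; plain Python dicts
# stand in here, and all field names are hypothetical.
from datetime import datetime

def is_valid(record: dict) -> bool:
    """Basic data-quality checks applied at the ingestion point."""
    if not record.get("resident_id"):                  # required key present and non-empty
        return False
    try:
        datetime.fromisoformat(record["observed_at"])  # timestamp must parse
    except (KeyError, ValueError):
        return False
    return True

def promote_to_silver(bronze_records: list[dict]) -> list[dict]:
    """Keep only records passing validation; normalize field casing."""
    return [
        {k.lower(): v for k, v in rec.items()}
        for rec in bronze_records
        if is_valid(rec)
    ]

bronze = [
    {"resident_id": "r-001", "observed_at": "2024-05-01T08:30:00", "HeartRate": 72},
    {"resident_id": "",      "observed_at": "2024-05-01T08:31:00", "HeartRate": 75},
    {"resident_id": "r-002", "observed_at": "not-a-date",          "HeartRate": 70},
]
silver = promote_to_silver(bronze)
print(len(silver))  # only the first record passes both checks
```

The same validation rules would typically be expressed declaratively (e.g., as Delta constraints or expectation suites) so they can be monitored across every ingestion point rather than hand-coded per pipeline.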
Requirements
4–8 years of hands-on data engineering experience.
Strong proficiency with Airflow and Databricks (Spark, Delta Lake, SQL, Python).
Experience building scalable ingestion pipelines with Airbyte, Fivetran, or custom API connectors.
Solid understanding of Azure data ecosystem (Data Lake, Blob Storage, Key Vault, Functions, FHIR Server, etc.).
Experience implementing and maintaining ETL/ELT pipelines in a HIPAA or regulated environment.
Comfort with both SQL and Python for transformations, orchestration, and testing.
Strong grasp of data modeling, schema evolution, and versioned datasets.
Ability to operate independently and deliver results in a small, fast-moving team.
Experience with FHIR and healthcare data structures and interoperability standards.
Familiarity with vector databases (e.g., pgvector, Pinecone) or embedding pipelines for AI/LLM applications.
Experience with GitHub best practices for maintaining and sharing code.
Familiarity with Superset or other analytics tools for internal visualization.
Understanding of security best practices, including encryption, RBAC, and least-privilege design.
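As context for the FHIR familiarity listed above: FHIR resources are JSON documents keyed by a `resourceType` discriminator. A minimal sketch of extracting demographics from the standard HL7 R4 example Patient resource, using only the standard library (a production pipeline would typically go through a dedicated FHIR client or the Azure FHIR Server APIs):

```python
import json

# Abbreviated version of the standard HL7 FHIR R4 example Patient.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"use": "official", "family": "Chalmers", "given": ["Peter", "James"]}],
  "gender": "male",
  "birthDate": "1974-12-25"
}
"""

def patient_display_name(resource: dict) -> str:
    """Prefer the 'official' name entry; fall back to the first one."""
    names = resource.get("name", [])
    official = next(
        (n for n in names if n.get("use") == "official"),
        names[0] if names else {},
    )
    given = " ".join(official.get("given", []))
    return f"{given} {official.get('family', '')}".strip()

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"  # FHIR discriminator field
print(patient_display_name(patient))   # Peter James Chalmers
print(patient["birthDate"])            # 1974-12-25
```

Real Patient resources carry many more repeating elements (identifiers, telecoms, addresses), which is why schema evolution and versioned datasets matter when flattening FHIR into analytic tables.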
Benefits
Christmas Bonus: 30 days' salary, paid in December.
Major Medical Expense Insurance: Coverage up to $20,000,000.00 MXN.
Minor Medical Insurance: VRIM membership with special discounts on doctor’s appointments and accident reimbursements.
Dental Insurance: Always smile with confidence!
Life Insurance: coverage for death and disability (MXN).
Vacation Days: 12 vacation days in accordance with Federal Labor Law, with prior approval from your manager.
Floating Holidays: 3 floating holidays in addition to the 7 official holidays in Mexico.