Data Engineer II designing and building data pipelines for analytics at Honeywell. Collaborating with data scientists and product owners in a hybrid work setting in Charlotte, NC.
Responsibilities
Design & build pipelines to ingest, transform, and publish structured/unstructured data from SFDC, EDW, ADLS, Event Hub, and APIs into Databricks/Snowflake, following Delta Lake and Unity Catalog standards
Model data (star/snowflake, CDC, SCD, dimensional views) to support analytics (e.g., commercial pipeline metrics, quote/discount modeling)
Operationalize ML/analytics pipelines including bronze→silver→gold processing, joins with model/market indicators, and serving outputs to applications/APIs
Harden platforms: CI/CD with Azure DevOps; monitor jobs/clusters; optimize PySpark/SQL performance; enforce data governance (quality, privacy, lineage, access)
Partner & document: collaborate with product owners and data science; write runbooks and technical specs; contribute to weekly updates and stewardship forums
Requirements
Minimum of 4 years of experience in data engineering, ETL, or database development/administration
Hands‑on Azure Databricks, CI/CD & DevOps, and Snowflake experience
Strong Python, SQL, PySpark; comfort with both structured and unstructured data
Experience with Agile delivery
Bachelor’s degree in a technical discipline such as science, technology, engineering, or mathematics
Experience with at least one NoSQL store (e.g., HBase/Cassandra/MongoDB)
Familiarity with Hadoop ecosystem (HDFS, Spark), and data integration/ETL tools
Exposure to ML ops tooling (MLflow), AKS‑backed API services, and integration patterns between Databricks, Snowflake, and application layers
Demonstrated contributions to data quality/stewardship initiatives (lineage, metadata, GDM frameworks)
Clear communication and ability to present technical trade‑offs to stakeholders
Working knowledge of SFDC data model and commercial processes (opportunities, quotes, quote line items)
Benefits
Comprehensive benefits package including employer-subsidized Medical, Dental, Vision, and Life Insurance
Short-Term and Long-Term Disability
401(k) match
Flexible Spending Accounts
Health Savings Accounts
EAP
Educational Assistance
Parental Leave
Paid Time Off (for vacation, personal business, sick time, and parental leave)
Staff Data Engineer at Headspace building privacy-first data platforms for mental health support. Leading data engineering strategies and mentoring team members to enhance data-driven decision making.
Senior Data Engineer building and implementing data pipelines at Headspace. Collaborating with analytics and data science teams to enhance personalized mental health support.
Data Engineering Intern working on data pipelines and infrastructure in a fast-growing fintech. Collaborating with data engineers, learning best practices and developing data solutions.
Senior Software Engineer building and maintaining data infrastructure for Gusto. Collaborating with Data Science and Business Intelligence teams to achieve their goals.
Data Engineer building and maintaining scalable data pipelines for AI Search Infrastructure at You.com. Collaborating across teams to ensure data quality and enable AI capabilities.
Data Engineer developing and managing technology-based data solutions for clients in different industries in Greece. Participating in the software development lifecycle within an Agile team setting.
Data Architect leading design and governance of high-quality data architectures for clients. Collaborating with engineering teams and stakeholders to transform business challenges into scalable data solutions.
Data Engineer supporting vehicle buying and selling solutions through integration pipelines. Collaborating with teams to build digital vehicle platforms and optimize data processes in São Paulo.
Senior Advanced Data Engineer designing and optimizing data architecture for Honeywell. Collaborating with cross-functional teams to drive data-driven decision-making and operational efficiency.
Senior Data Engineer building and operating data platforms at bsport for analytics and AI/ML. Collaborating with Data team to enrich data layers and maintain platform observability.