Data Engineer at Berkshire Hathaway GUARD Insurance Companies collaborating with business leaders to transform data into insights. Building and governing data products for underwriting and operations.
Responsibilities
Collaborate closely with cross-functional teams, including Underwriting, Product, Claims, Finance, and Distribution/Operations, to transform business questions into clearly documented requirements and actionable data solutions.
Model and publish curated, analysis-ready datasets and semantic models in Power BI and other analytics deliverables that illuminate small-commercial metrics.
Continuously learn and expand business knowledge to proactively produce automated analytic solutions to business challenges.
Design, develop, and maintain scalable ELT/ETL pipelines and lakehouse structures in Microsoft Fabric, following platform standards and optimizing for performance, reliability, and cost.
Follow and implement documentation standards, governance, and data-quality controls across datasets.
Use Azure DevOps (ADO) for CI/CD and version control (branching, pull requests, release pipelines) to harden delivery and improve repeatability.
Document technical architecture and workflows with clear diagrams and runbooks to accelerate onboarding and reduce operational risk.
Requirements
Bachelor’s degree in computer science, information technology, or a related field.
2-3 years of experience in data engineering or a related role.
Proficiency in Microsoft Fabric is a plus.
Exposure to Azure DevOps (ADO) and CI/CD practices.
Strong SQL, Python, and data-modeling skills.
Experience with cloud platforms, preferably Azure.
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Applicants must be authorized to work in the U.S. without current or future sponsorship.
Benefits
Competitive compensation
Healthcare benefits package that begins on first day of employment
401K retirement plan with company match
Generous paid time off to support work-life balance, plus 9½ paid holidays
Up to 6 weeks of parental and bonding leave
Hybrid work schedule (3 days in the office, 2 days from home)
Longevity awards (every 5 years of employment, receive a generous monetary award to be used toward a vacation)
Tuition reimbursement after 6 months of employment
Numerous opportunities for continued training and career advancement
Senior Data Engineer at Red Hat designing and optimizing data solutions supporting sales and forecasting. Collaborating with teams and applying modern data engineering practices to ensure data quality.
Senior Data Engineer leading the design and implementation of data pipelines for NVIDIA’s analytics and monitoring systems. Collaborating across teams to enhance data ingestion and analysis capabilities.
Associate Data Engineer at Boeing India supporting API Development and Data migration with a focus on engineering and technology solutions. Involves working independently to gather requirements and supporting architecture for API services and data analytics.
Senior Data Engineer building and maintaining robust data pipelines for various data products at Beep Saúde. Collaborating within the team and leading data governance practices.
Software Developer in Test working on a cloud-based data platform at Tecsys. Ensuring quality and reliability of data pipelines and transformations using automation frameworks.
Data Engineer responsible for designing, building, and optimizing data pipelines and architectures in a tech environment. Requires extensive experience with modern data warehousing and cloud platforms.
Lead Data Engineer role at Brillio focusing on AI & Data Engineering with expertise in Azure and MS Fabric. Collaborate within the Data Engineering team in Pune, Maharashtra, India.
Data Architect at Whiteshield designing scalable, secure data architectures for national and enterprise transformation programs. Architecting modern data platforms to support analytics, AI and operational use cases.
Data Engineer managing scalable data ecosystems for actionable business intelligence and cross-functional stakeholder collaboration. Optimizing ETL/ELT pipelines and ensuring data integrity and security.
Data Engineer specializing in data architecture and solutions for a banking environment, driving value for customers through innovative engineering practices and technologies in data management.