**Architect and Evolve Our Core Data Platform** You will own the technical vision and roadmap for our data platform, steering its evolution on our modern cloud stack and ensuring it meets the demands of a rapidly scaling business.
**Own the Architecture:** Design, implement, and refine a robust data lakehouse architecture (e.g., Medallion) using Databricks and Delta Lake to ensure data reliability and performance.
**Build Scalable Ingestion Frameworks:** Develop and maintain resilient, reusable patterns for ingesting data from a diverse set of sources, including our systems, transactional databases, event streams, and third-party SaaS APIs.
**Define Data Modelling Standards:** Lead the implementation of our core data modelling principles (e.g., Kimball dimensional modelling) to produce curated, intuitive datasets for business intelligence and product analytics.
**Implement Robust Governance:** Use tools like Unity Catalog to establish a comprehensive data governance framework, covering data lineage, fine-grained access controls, and a user-friendly data catalogue.
**Manage Platform Performance and Cost:** Develop and implement strategies for monitoring, optimising, and forecasting our Databricks and cloud expenditure, ensuring the platform is both powerful and cost-effective.
**Champion Engineering Excellence and Best Practice** You will be the driving force for maturing our data operations, embedding a culture of quality, automation, and reliability into everything we do.
**Automate Everything with CI/CD:** Implement and advocate for automated CI/CD pipelines (e.g., using GitHub Actions) for all data assets, including dbt models, infrastructure changes, and Databricks jobs.
**Embed Git-Based Workflows:** Champion a Git-first culture for all data transformation code, establishing clear processes for branching, code reviews, and version control.
**Embed Automated Data Quality:** Implement comprehensive, automated data quality testing at every stage of our pipelines using tools like dbt tests, ensuring data is accurate and trustworthy.
**Introduce Data Observability:** Establish thorough monitoring, logging, and alerting for all data pipelines to proactively detect, diagnose, and resolve issues before they impact the business.
**Requirements**
Mastery of data architecture principles, data modelling frameworks (e.g., dimensional modelling), and a strong understanding of data governance and security best practices.
A strong software engineering mindset, with significant experience implementing CI/CD for data, Git-based workflows, and automated data quality testing.
Exceptional communication and stakeholder management skills, with a proven ability to translate complex technical concepts for non-technical audiences and influence business decisions.
A genuine passion for leadership and mentorship, with a track record of elevating the technical skills of those around you.
**Tech Stack:**
dbt
Databricks, Unity Catalog
Terraform
AWS: Redshift, DynamoDB, API Gateway, CloudWatch, Lambda, streaming with Kinesis/Firehose, Glue, Bedrock
Stitch & Fivetran
Languages: advanced SQL and Python
**Benefits**
Enjoy a flexible remote-first work policy (with a work-from-home stipend to set you up for success!)
Own a piece of Deputy via our Employee Share Ownership Plan (ESOP)
Take paid parental leave to support you and your family
Stay protected with Group Salary Continuance Insurance
Access support through our Employee Assistance Program
Enjoy additional leave days — including study assistance, celebration days and volunteering
Join our global working groups focused on collaboration, belonging and connection
Get creative at our annual Hackathons
Take advantage of our novated leasing for electric vehicles, internet reimbursement and more!