Data Engineer developing scalable data solutions for clients across multi-cloud environments. Mentoring junior engineers while ensuring data quality and promoting best practices within the team.
Responsibilities
Design, build, and optimise scalable Databricks Lakehouse solutions across AWS, Azure, and GCP
Develop robust data ingestion, transformation, and orchestration pipelines using Databricks (Spark, Delta Lake, Workflows)
Build high-quality data models to support analytics, reporting, and AI/ML use cases
Implement medallion architectures (bronze, silver, gold) and modern data engineering patterns
Collaborate closely with clients to translate business requirements into well-architected, actionable data solutions
Support or implement dbt, CI/CD pipelines, Git-based workflows, and engineering best practices
Ensure strong data quality, governance, lineage, security, and performance optimisation within Databricks environments
Work alongside analytics, governance, and AI consultants to deliver cohesive, end-to-end solutions
Contribute to reusable assets, accelerators, and internal frameworks that strengthen Intelligen’s Databricks capability
Mentor junior engineers and positively influence client delivery and engineering standards
Requirements
4–6+ years’ experience in data engineering or analytics engineering
Strong hands-on experience with Databricks (Spark, Delta Lake, Workflows), ideally in production environments
Experience with at least one major cloud platform: AWS, Azure, or GCP
Strong SQL skills and experience building complex data transformations
Familiarity with modern data stacks — e.g. Databricks, Snowflake, dbt, cloud data lakes, orchestration tools
Experience working across the full data lifecycle: ingestion → transformation → modelling → consumption
Consulting, stakeholder-facing experience, or cross-functional delivery exposure
Knowledge of DevOps concepts, version control, and/or CI/CD in data environments
Excellent communication, problem-solving, and collaboration skills
Sydney-based, with ability to work on-site with clients as required
A mindset of curiosity, delivery excellence, and continuous learning
Benefits
Work from home with flexible hours
Training & Development
Free Food & Snacks
Many socials and community groups
Opportunity to drive projects that are of interest to you!