Develop scalable data pipelines and analytics solutions at Miami University. Collaborate with stakeholders to enhance data quality and maintainability.
Responsibilities
Develop, maintain, and enhance data pipelines using modern ELT tools (e.g., dbt, Fivetran, Airflow, or similar) in a cloud-based environment.
Write, optimize, and maintain SQL and/or Python-based transformations that support scalable analytics solutions.
Monitor, troubleshoot, and resolve issues in production data pipelines to ensure reliability, performance, and data integrity.
Implement data validation, testing, and quality checks to improve the consistency and trustworthiness of data assets.
Collaborate with stakeholders and technical partners to translate business needs into scalable data solutions.
Support and improve existing data integrations and workflows with a focus on maintainability and performance optimization.
Contribute to and follow best practices for version control, testing, and deployment (CI/CD).
Create and maintain documentation for data pipelines, models, and system processes.
Contribute to team practices that support consistent delivery and continuous improvement.
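To illustrate the kind of transformation and validation work these duties describe, here is a minimal sketch in plain Python. The table shape, column names, and checks are hypothetical illustrations, not part of the posting or Miami University's actual schema:

```python
# Sketch of an ELT-style transformation followed by data-quality checks.
# Column names ("student_id", "term", "credits") are hypothetical examples.

def transform_enrollments(rows):
    """Normalize raw enrollment records into an analytics-ready shape."""
    out = []
    for r in rows:
        out.append({
            "student_id": str(r["student_id"]).strip(),
            "term": r["term"].upper(),
            "credits": float(r["credits"]),
        })
    return out

def validate(rows):
    """Basic quality checks: key uniqueness and value-range constraints."""
    ids = [r["student_id"] for r in rows]
    assert len(ids) == len(set(ids)), "duplicate student_id"
    assert all(r["credits"] >= 0 for r in rows), "negative credits"
    return rows

raw = [{"student_id": 101, "term": "fall24", "credits": "3"},
       {"student_id": 102, "term": "spr25", "credits": "4"}]
clean = validate(transform_enrollments(raw))
```

In tools like dbt, the transformation would live in a SQL model and the checks in schema tests; the structure, transform then validate before publishing, is the same.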
Design, develop, and optimize scalable data pipelines using modern ELT tools (e.g., dbt, Fivetran, Airflow, or similar) in a cloud-based environment.
Lead the development of advanced SQL and/or Python-based transformations supporting enterprise analytics and reporting.
Own production data pipelines, including monitoring, performance tuning, troubleshooting, and ensuring reliability and data integrity.
Design and implement robust data validation, testing, and quality frameworks to ensure trusted data at scale.
Design scalable data models and transformation patterns to support enterprise reporting and analytics needs.
Partner with stakeholders to translate complex business requirements into sustainable, high-impact data solutions.
Drive improvements to existing data pipelines and processes to enhance performance, scalability, and maintainability.
Lead adoption of best practices for version control, testing, deployment, and operational support (CI/CD).
Develop and maintain comprehensive documentation for data pipelines, models, and architecture.
Guide team practices that support consistent delivery, operational excellence, and continuous improvement.
Mentor team members and contribute to the growth of technical standards and capabilities across the team.
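The senior-level duties above mention designing dimensional models for enterprise reporting. As a hedged sketch of that pattern, the following splits flat records into a dimension table and a fact table; all table and field names are hypothetical examples:

```python
# Sketch of a dimensional-modeling step: derive a dimension table and a
# fact table from flat source rows. Names are illustrative assumptions.

def build_star(records):
    """Derive a course dimension and an enrollment fact from flat rows."""
    dim_course = {}
    facts = []
    for r in records:
        key = r["course_code"]
        if key not in dim_course:
            dim_course[key] = {"course_key": len(dim_course) + 1,
                               "course_code": key,
                               "title": r["title"]}
        facts.append({"course_key": dim_course[key]["course_key"],
                      "student_id": r["student_id"],
                      "grade_points": r["grade_points"]})
    return list(dim_course.values()), facts

flat = [{"course_code": "CSE174", "title": "Fundamentals",
         "student_id": 1, "grade_points": 4.0},
        {"course_code": "CSE174", "title": "Fundamentals",
         "student_id": 2, "grade_points": 3.0}]
dims, facts = build_star(flat)
```

Deduplicating descriptive attributes into the dimension keeps the fact table narrow, which is the usual reason star schemas scale well for reporting workloads.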
Requirements
Bachelor’s degree in computer science, information technology, or a relevant field, earned by date of hire, with two to four or more years of relevant experience; OR an Associate’s degree in computer science, information technology, or a relevant field, earned by date of hire, with four to six or more years of relevant experience.
Ability to analyze complex data and develop practical, scalable solutions.
Ability to troubleshoot and resolve data pipeline and data quality issues in a timely manner.
Ability to translate business needs into effective technical data solutions.
Ability to communicate technical concepts clearly to both technical and non-technical audiences.
Ability to work collaboratively across teams and build effective working relationships.
Ability to manage multiple priorities and adapt to changing requirements in a dynamic environment.
Ability to document solutions and processes to support maintainability and knowledge sharing.
Experience developing and maintaining data pipelines using modern ELT tools (e.g., dbt, Fivetran, Airflow, or similar).
Experience working with cloud data platforms such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse.
Experience using Python for data processing, automation, or integration.
Experience implementing data validation, testing, or monitoring solutions.
Experience using version control systems (e.g., Git) and contributing to CI/CD workflows.
Experience building or supporting data models and analytics solutions.
Experience working in Agile or iterative development environments.
Experience supporting enterprise data systems in a higher education or similarly complex environment.
Experience designing and optimizing scalable ELT pipelines using tools such as dbt, Fivetran, Airflow, or similar.
Experience working with cloud data platforms such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse at scale.
Strong experience using Python for data processing, automation, and system integration.
Experience designing dimensional data models and large-scale data transformations.
Experience implementing and managing data quality, testing, and monitoring frameworks.
Experience leading or significantly contributing to CI/CD practices for data pipelines.
Experience optimizing data pipelines for performance, scalability, and cost efficiency.
Experience working in complex organizational environments (e.g., higher education, healthcare, or enterprise settings).
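Several requirements above reference data-quality, testing, and CI/CD experience. A minimal sketch of the kind of check runner a CI pipeline might execute before deployment (check names and the sample data are illustrative assumptions):

```python
# Sketch of a small data-quality check runner: each named check is a
# predicate over the dataset; failures are returned for CI to report.

def run_checks(rows, checks):
    """Run named checks against a dataset; return the names that failed."""
    return [name for name, fn in checks.items() if not fn(rows)]

checks = {
    "not_empty": lambda rows: len(rows) > 0,
    "no_null_id": lambda rows: all(r.get("id") is not None for r in rows),
}

ok = run_checks([{"id": 1}, {"id": 2}], checks)       # passes all checks
bad = run_checks([{"id": 1}, {"id": None}], checks)   # flags the null id
```

In practice this role would likely lean on established tooling (dbt tests, Great Expectations, or similar) rather than hand-rolled checks, but the shape is the same: declarative checks, run automatically, gating deployment.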