Data Engineer specializing in Snowflake at Cayuse Commercial Services, focusing on data pipelines and analytics capabilities. Collaborating with business intelligence teams to enhance decision-making.
Responsibilities
Utilize SQL window functions to perform advanced data calculations, such as running totals, moving averages, and rankings, enabling deeper insights into sales trends and performance metrics.
Prepare and automate detailed real-time reports using Python and SQL, providing actionable insights to stakeholders and enhancing the company’s decision-making processes.
Develop and optimize data pipelines using Snowpipe and SnowSQL for real-time and batch data ingestion, improving data processing efficiency by 20%.
Implement Snowpark to build scalable and high-performance data applications using Python, enhancing data manipulation, analysis, and processing capabilities.
Leverage Snowflake’s Time Travel features to enable auditing, data recovery, and version control, ensuring data integrity and regulatory compliance.
Manage Snowflake’s Zero Copy Cloning to create cost-effective development, testing, and staging environments without incurring additional storage costs.
Collaborate with product managers to develop dynamic dashboards on Tableau, enabling tracking of key supply chain metrics and effectively identifying Key Risk Indicators (KRIs) and Key Performance Indicators (KPIs).
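The window-function calculations listed above (running totals, moving averages, rankings) can be sketched in standard SQL. This is a minimal illustration, not the employer's actual code: the `sales` table and its data are hypothetical, and SQLite is used in place of Snowflake only so the sketch is self-contained — the window-function syntax shown is shared by both engines.

```python
import sqlite3

# Hypothetical sample data; in practice these queries would run against
# a Snowflake table rather than an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (sale_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("2024-01-01", 100.0), ("2024-01-02", 150.0),
     ("2024-01-03", 50.0), ("2024-01-04", 200.0)],
)

rows = conn.execute("""
    SELECT sale_date,
           amount,
           -- Running total of sales, ordered by date
           SUM(amount) OVER (ORDER BY sale_date) AS running_total,
           -- 3-day moving average: current row plus two preceding rows
           AVG(amount) OVER (
               ORDER BY sale_date
               ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
           ) AS moving_avg_3d,
           -- Rank each day by sales amount, highest first
           RANK() OVER (ORDER BY amount DESC) AS amount_rank
    FROM sales
    ORDER BY sale_date
""").fetchall()

for row in rows:
    print(row)
```

The `ROWS BETWEEN ... AND CURRENT ROW` frame is what turns a plain aggregate into a moving average; omitting the frame on `SUM` yields the cumulative running total.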
Requirements
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Proven experience working with Snowflake Data Cloud and its key functionalities, including Snowpark, Snowpipe, and Time Travel.
Proficiency in programming with Python for data processing and analytics.
Advanced SQL skills, including working with window functions for complex calculations and analysis.
Experience developing interactive dashboards with Tableau or similar visualization tools.
Strong understanding of data engineering best practices, including data integrity, compliance, and real-time data ingestion techniques.
Familiarity with cloud data architecture and concepts such as Zero Copy Cloning and scalable application development.
Certification in Snowflake or related cloud data platforms (e.g., AWS, Azure, GCP) - preferred.
Experience in supply chain analytics or supporting business systems in a large-scale environment - preferred.
Strong collaboration skills and ability to work cross-functionally with product managers, analysts, and other engineering teams - preferred.
Senior Data Engineer at Clorox designing and maintaining data pipelines and solutions on cloud platforms. Collaborating with cross-functional teams to support data-driven business decisions.
Data Engineering & Warehousing Manager leading the design and development of enterprise data pipelines. Collaborating on data governance standards and ensuring scalable data solutions for Hastings Insurance.
Senior Data Engineer at Air Methods leading data-driven solutions and mentoring team members. Responsible for designing and improving data architecture and analytics to create impactful business insights.
Data Engineer III developing high-performance data solutions for Walmart Global Tech. Collaborating with teams to build scalable data pipelines and ensure data governance.
Data Engineer optimizing and maintaining data architecture for fintech solutions in Latin America. Involved in data governance, pipeline development, and cross-team collaboration for tech innovation.
DataOps Engineer at Eeze focusing on data pipeline stability across multiple products. Collaborating with IT teams to maintain quality, observability, and operational efficiency.
Data Engineer developing and enhancing data pipelines and models at ERNI Schweiz. Required skills include SQL and Python with opportunities for remote work in Europe.
Senior Data Engineer developing ETL and data pipelines for Burlington’s digital transformation team. Collaborating with analytics and engineering teams to support insights from data analysis.
Data Engineer responsible for Azure SQL database development at a leading Norwegian damage-services company. Engaging in data quality, integration, and collaboration on analytical tools.
GCP Data Engineer responsible for building and optimizing scalable data pipelines using GCP services. Develop, maintain, and ensure data quality in ETL/ELT workflows with Python and SQL.