Data Engineer building and maintaining cloud-based data pipelines in the transportation sector to enhance analytics, reporting, and operational insights.
**About PrePass**
PrePass® is North America's most trusted weigh station bypass and toll management platform. We’re transforming how the transportation industry operates—creating solutions that keep trucks moving safely, efficiently, and compliantly. This means making bold decisions and building systems that support not only fleets but the broader economy. It all starts with enabling commercial vehicles to keep rolling with seamless toll management, weigh station bypass, and safety solutions. It’s what we do best, and we do it to meet the demands of the road every day.
That’s why people join us: our solutions are implemented in real-time, on highways and interstates across the nation, helping fleets go farther, faster. This work challenges and rewards, presenting complex problems that need ambitious answers. We hire bold thinkers with a heart for impact, a passion for progress, and the optimism to shape the future of transportation.
**About the Role**
We’re looking for a skilled Data Engineer to join our team in the transportation sector. In this role, you’ll work with modern cloud technologies to build and maintain data pipelines that support analytics, reporting, and operational insights. You’ll be part of a collaborative team focused on delivering reliable, scalable data solutions that help drive smarter decision-making across the organization.
This is a great opportunity for someone with solid experience in backend data systems who enjoys solving real-world problems and working with evolving data platforms. This is a hybrid position located at our office in downtown Phoenix.
**Key Responsibilities**
Design, develop, and maintain cloud-native data pipelines leveraging Databricks, Microsoft Azure Data Factory, and Microsoft Fabric to support robust data integration and analytics solutions.
Implement incremental and real-time data ingestion strategies using medallion architecture for data lake storage.
Write and optimize complex SQL queries to transform, integrate, and analyze data across enterprise systems.
Support and troubleshoot legacy data platforms built on SSIS and SQL Server, ensuring high availability and performance of critical data processes.
Develop features with a focus on scalability, maintainability, and testability.
Troubleshoot and resolve data integration and quality issues, ensuring reliable data delivery.
Participate in proof-of-concept projects, providing technical analysis and recommendations.
**Requirements**
**Required**
5+ years of experience designing and building data solutions.
Strong proficiency in SQL and Python for data analytics and transformation.
Experience with ETL pipeline development and automation.
Solid understanding of Data Lake architecture and design principles.
Excellent collaboration skills and the ability to adapt in a dynamic environment.
**Preferred**
Experience with Azure Cloud services and cloud-based ETL tools.
Familiarity with data visualization tools such as Power BI or Tableau.
Understanding of event-driven architectures, including queues, batch processing, and pub/sub models.
Exposure to NoSQL databases like MongoDB or Cassandra.
**Bonus Points For**
Experience in Data Science or Machine Learning, particularly in model deployment or feature engineering.
**Benefits**
**How We Will Take Care of You**
Robust benefits package that includes medical, dental, and vision coverage starting on your date of hire.
Paid Time Off, including vacation, sick time, holidays, and floating holidays.
401(k) plan with employer match.
Company-funded “lifestyle account” upon date of hire for you to apply toward your physical and mental well-being (e.g., ski passes, retreats, gym memberships).
Tuition Reimbursement Program.
Voluntary benefits, including but not limited to legal plans and pet discounts.
Employee Assistance Program (available at no cost to you).
Company-sponsored and funded “Culture Team” that focuses on the Physical, Mental, and Professional well-being of employees.
Community Give-Back initiatives.
Culture that focuses on employee development initiatives.