Senior Data Architect leading design of scalable systems using machine learning and data engineering practices. Collaborating with stakeholders while mentoring data engineers at Entrata, Inc.
Responsibilities
Lead the design of performant, scalable systems using machine learning and data engineering best practices; establish coding standards and mentor data engineers.
Collaborate with executive stakeholders to understand business priorities and drive delivery of complex data products aligned with company goals.
Define and execute the enterprise big data strategy, guiding teams to deliver impactful, data-driven solutions.
Oversee data warehouse strategy with BI and data systems teams, ensuring alignment and scalability.
Lead assessment of existing platforms for maintainability, reliability, scalability, and performance; identify improvement opportunities.
Drive enhancements for existing solutions, ensuring continuous alignment with business needs and technology trends.
Champion adoption of industry best practices and emerging technologies to accelerate business growth.
Foster collaboration and knowledge sharing across cross-functional teams to ensure effective data architecture.
Partial telecommuting permitted; on-site at 4205 Chapel Ridge Rd, Lehi, UT 84043 when not telecommuting.
Requirements
Bachelor’s degree or U.S. equivalent in Computer Engineering, Computer Science, Data Science, or a related field, plus 7 years of professional experience as a Software Engineer, Data Architect, or any occupation/position/job title involving data structuring for enterprise SaaS systems.
Must also have experience in the following:
7 years of professional experience coding in PHP or JavaScript;
5 years of professional experience coding and interpreting sensitive PII;
5 years of professional experience designing modular architectures for data ingestion, processing, storage, and model training;
3 years of professional experience recommending and driving technical initiatives;
3 years of professional experience designing or developing in an event-driven architecture, including Kafka or Confluent;
3 years of professional experience with UML standards for architectural communication; and
3 years of professional experience working in an AWS cloud environment.
Manager II leading data engineering projects at Navy Federal Credit Union. Overseeing data governance and quality initiatives while managing engineering teams in a hybrid work environment.
Senior Data Engineer designing and maintaining data pipelines for Qodea's global technology solutions. Collaborating with teams to ensure data quality and governance across platforms.
Senior Data Engineer at Qodea designing scalable data pipelines and infrastructure. Delivering solutions utilizing cutting-edge tools and collaborating closely with teams for impactful results.
Senior Data Engineer building and maintaining data pipelines for cloud and AI solutions at Qodea. Collaborating with ML engineers and focusing on reliability and performance in a cloud-native environment.
Principal Data Engineer responsible for architecting scalable data pipelines and building high-quality data foundations. Collaborating closely with experts to ensure data readiness for advanced analytics.
Product Director managing Target's Customer Data Platform. Leading strategy, financials, and team development to enhance guest experience through data-driven initiatives.
Senior AI Data Pipeline Engineer building scalable data pipelines and optimizing AI workflows at Trimble. Designing architectures that enhance digital construction technology across industries.
Data Engineering Intern at Efficy supporting data management and ETL pipeline development. Collaborating with teams and contributing to the enhancement of data architecture.
Senior Data Engineer building and optimizing data pipelines for Garner Health. Seeking a candidate with experience in AWS, SQL, and Python with a mission-driven mindset.
Data Engineer (GCP) designing and maintaining scalable data platforms at LUZA Group in Portugal. Collaborating and ensuring data integrity across multiple complex datasets.