Data Architect leading the design and governance of data architecture for FFF Enterprises. Collaborating across teams to optimize data solutions and support analytics and reporting initiatives.
Responsibilities
Lead the design and governance of our enterprise data architecture.
Define and implement end-to-end data architecture standards, including data modeling, data integration, metadata management, and governance across the enterprise.
Design scalable data models across the Bronze, Silver, and Gold layers of a Databricks Lakehouse using Delta Lake, Unity Catalog, and Delta Live Tables (DLT).
Collaborate with data engineers to optimize ingestion and transformation processes across streaming and batch pipelines.
Establish canonical models and semantic layers that power downstream BI tools and self-service analytics.
Define and enforce data quality, data lineage, and data security policies in coordination with governance teams.
Work with business stakeholders, product managers, and analysts to translate analytical use cases into high-quality data models and architecture patterns.
Provide architecture oversight and best practices guidance on data integration tools (e.g., Fivetran), data cataloging, and performance optimization.
Review and approve physical and logical data model changes across teams to ensure consistency and maintain architectural integrity.
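The Bronze/Silver/Gold layering mentioned in the responsibilities above can be sketched in miniature. The following is a hypothetical illustration only: plain Python structures stand in for Delta tables, and the record fields and layer logic are invented for the example, not FFF's actual schema.

```python
# Minimal sketch of a Bronze -> Silver -> Gold medallion flow.
# In a Databricks Lakehouse these would be Delta tables governed by
# Unity Catalog; plain dicts stand in for tables here.

from collections import defaultdict

# Bronze: raw ingested records, kept as-is (duplicates, bad rows and all).
bronze = [
    {"order_id": 1, "region": "west", "amount": "120.50"},
    {"order_id": 1, "region": "west", "amount": "120.50"},  # duplicate
    {"order_id": 2, "region": "east", "amount": None},       # invalid record
    {"order_id": 3, "region": "east", "amount": "75.00"},
]

def to_silver(rows):
    """Silver: deduplicate on the business key and drop invalid records."""
    seen, out = set(), []
    for r in rows:
        if r["amount"] is None or r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({**r, "amount": float(r["amount"])})
    return out

def to_gold(rows):
    """Gold: aggregate cleaned records into an analytics-ready summary."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'west': 120.5, 'east': 75.0}
```

The design point the sketch illustrates is that each layer has a distinct contract: Bronze preserves raw history, Silver enforces quality rules, and Gold serves aggregated, consumption-ready data to BI tools.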
Requirements
Bachelor’s Degree in a related field or four (4) years relevant experience in lieu of degree.
Deep understanding of SAP ERP data models, especially core financials, logistics, and materials domains.
Expertise in dimensional modeling, star/snowflake schemas, and modern data warehousing patterns.
Proficiency in SQL and Python, with the ability to guide data engineers on best practices.
Strong understanding of data governance, lineage, and security frameworks.
Ability to communicate architectural concepts clearly to both technical and non-technical audiences.
Must have seven (7) years [eleven (11) for non-degreed candidates] of experience in data architecture or data engineering, with a focus on enterprise-scale data.
Strong hands-on experience with Databricks, including Delta Lake, Unity Catalog, and Lakehouse architecture.
Proven experience in conceptual, logical, and physical data modeling using tools like ER/Studio, Erwin, Lucidchart, or dbt.
Experience working in cloud environments such as Azure, AWS, or GCP.
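The dimensional-modeling expertise called for above (star schemas in particular) can be shown with a toy example. This is a sketch with invented table and column names, using an in-memory SQLite database rather than any actual warehouse model: one fact table of sales measures joined to a product dimension.

```python
# Toy star schema: a fact table joined to a dimension table.
# All names are illustrative, not a real enterprise model.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension: descriptive attributes keyed by a surrogate key.
cur.execute(
    "CREATE TABLE dim_product ("
    "product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)"
)
# Fact: numeric measures plus a foreign key into each dimension.
cur.execute(
    "CREATE TABLE fact_sales ("
    "sale_id INTEGER PRIMARY KEY, "
    "product_key INTEGER REFERENCES dim_product(product_key), "
    "quantity INTEGER, amount REAL)"
)

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 2, 20.0), (2, 1, 1, 10.0), (3, 2, 5, 50.0)])

# A typical BI query: aggregate the facts, slice by dimension attributes.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Gadget', 50.0), ('Widget', 30.0)]
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets downstream BI tools slice the same facts by any dimension without remodeling.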
Senior Data Engineer at Red Hat designing and optimizing data solutions supporting sales and forecasting. Collaborating with teams and applying modern data engineering practices to ensure data quality.
Senior Data Engineer leading the design and implementation of data pipelines for NVIDIA’s analytics and monitoring systems. Collaborating across teams to enhance data ingestion and analysis capabilities.
Associate Data Engineer at Boeing India supporting API development and data migration with a focus on engineering and technology solutions. Involves working independently to gather requirements and support architecture for API services and data analytics.
Senior Data Engineer building and maintaining robust data pipelines for various data products at Beep Saúde. Collaborating within the team and leading data governance practices.
Software Developer in Test working on a cloud-based data platform at Tecsys. Ensuring the quality and reliability of data pipelines and transformations using automation frameworks.
Data Engineer responsible for designing, building, and optimizing data pipelines and architectures in a tech environment. Requires extensive experience with modern data warehousing and cloud platforms.
Lead Data Engineer role at Brillio focusing on AI and data engineering, with expertise in Azure and MS Fabric. Collaborating within the Data Engineering team in Pune, Maharashtra, India.
Data Architect at Whiteshield designing scalable, secure data architectures for national and enterprise transformation programs. Architecting modern data platforms to support analytics, AI, and operational use cases.
Data Engineer managing scalable data ecosystems for actionable business intelligence and cross-functional stakeholder collaboration. Optimizing ETL/ELT pipelines and ensuring data integrity and security.
Data Engineer specializing in data architecture and solutions for a banking environment, driving value for customers through innovative engineering practices and technologies in data management.