Junior Engineer creating GCP data-driven solutions for enterprise data architecture. Collaborating with stakeholders on data needs and optimizing performance and reliability.
Responsibilities
• Design and build GCP data-driven solutions for enterprise data warehouses and data lakes.
• Design and implement scalable data architectures on GCP, including data lakes, warehouses, and real-time processing systems.
• Utilize ML services such as Vertex AI.
• Develop and optimize ETL/ELT pipelines using Python, SQL, and streaming and orchestration technologies (e.g., Kafka, Apache Airflow).
• Architect historical and incremental loads, and refine the architecture on an ongoing basis.
• Manage and optimize data storage, partitioning, and clustering strategies for high performance and reliability, using services such as BigQuery, Spark, Pub/Sub, and object storage.
• Collaborate with data scientists, data engineers, and other stakeholders to understand data needs and deliver solutions aligned with business objectives, security, and data governance.
• Automate infrastructure and deployments with Infrastructure as Code (IaC) tools such as Terraform and CI/CD practices (e.g., Tekton) to ensure reliability and scalability.
• Operationalize machine learning models by building data infrastructure and managing structured and unstructured data, supporting AI/ML/LLM workflows, including data labeling, classification, and document parsing.
• Monitor and troubleshoot data pipelines and systems to identify and resolve issues related to performance, reliability, and cost-effectiveness.
• Document data processes, pipeline designs, and architecture, contributing to knowledge transfer and system maintenance.
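The ETL/ELT pipeline work described in the responsibilities above can be sketched as a minimal batch step. This is an illustrative extract/transform/load flow in pure Python; the record shape, function names, and target are hypothetical, not part of the role definition (in practice the load step would write to a warehouse table, e.g. in BigQuery):

```python
from datetime import date

# Hypothetical source records, standing in for rows pulled from an
# operational database or an object-storage export.
RAW_ORDERS = [
    {"order_id": 1, "amount": "19.99", "order_date": "2024-03-01"},
    {"order_id": 2, "amount": "5.00", "order_date": "2024-03-02"},
    {"order_id": 3, "amount": None, "order_date": "2024-03-02"},  # fails validation
]

def extract():
    """Extract: return raw records from the source system."""
    return RAW_ORDERS

def transform(rows):
    """Transform: drop invalid rows and cast fields to proper types."""
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # skip (or quarantine) rows that fail validation
        clean.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return clean

def load(rows, target):
    """Load: append cleaned rows to the target (a list here, a warehouse
    table in a real pipeline); return the number of rows loaded."""
    target.extend(rows)
    return len(rows)

warehouse_table = []
loaded = load(transform(extract()), warehouse_table)
print(f"loaded {loaded} rows")
```

In an orchestrated deployment, each of these functions would typically become a task in a tool such as Airflow, with monitoring and retries handled by the scheduler rather than inline.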
Requirements
**Qualifications and Skills**
Must have:
GCP Professional Data Engineer certification.
2+ years of coding experience in Java/Python and Terraform.
2+ years' relevant experience.
Experience working with Agile and Lean methodologies.
GCP Expertise: Strong proficiency in GCP services, including BigQuery, Dataflow, Dataproc, Data Fusion, Airflow, Pub/Sub, Cloud Storage, Vertex AI, Cloud Functions, and Cloud Composer, plus GCP-based big data deployments (batch/real-time) leveraging BigQuery and Bigtable.
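The partitioning and clustering strategies mentioned above are usually expressed as BigQuery DDL. As a hedged sketch, the helper below composes a `CREATE TABLE ... PARTITION BY ... CLUSTER BY` statement; the table and column names are illustrative, while the clause structure follows BigQuery's standard DDL:

```python
def partitioned_table_ddl(table, columns, partition_field, cluster_fields):
    """Compose BigQuery DDL for a date-partitioned, clustered table.

    `table`, `columns`, and the field names are hypothetical examples;
    the PARTITION BY / CLUSTER BY clauses follow BigQuery DDL syntax.
    """
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE {table} (\n  {cols}\n)\n"
        f"PARTITION BY {partition_field}\n"
        f"CLUSTER BY {', '.join(cluster_fields)}"
    )

# Partition by order date, cluster by customer for cheaper, faster scans.
ddl = partitioned_table_ddl(
    "analytics.orders",
    [("order_id", "INT64"), ("customer_id", "STRING"), ("order_date", "DATE")],
    "order_date",
    ["customer_id"],
)
print(ddl)
```

Partitioning prunes scanned data by date range, while clustering co-locates rows with the same key, which is why the two are typically chosen together around the dominant query filters.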
Programming & Scripting: Expert-level skills in Python and SQL are essential. Familiarity with languages like Scala or Java can also be beneficial, especially for working with tools like Apache Spark.
Data Engineering Fundamentals: Solid understanding of data modeling, data warehousing concepts, ETL/ELT processes, and big data architecture, including designing pipelines and architectures for data processing.
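One such fundamental, the distinction between a historical (full) load and an incremental load, can be sketched with a watermark pattern. This is an illustrative pure-Python example with hypothetical records; in practice the watermark would be persisted between runs and the rows would come from a source table or change stream:

```python
from datetime import datetime

# Hypothetical source rows carrying an updated_at timestamp.
SOURCE = [
    {"id": 1, "updated_at": datetime(2024, 3, 1, 12, 0)},
    {"id": 2, "updated_at": datetime(2024, 3, 2, 9, 30)},
    {"id": 3, "updated_at": datetime(2024, 3, 3, 8, 15)},
]

def incremental_load(rows, watermark):
    """Return rows changed since the last run, plus the new watermark."""
    fresh = [r for r in rows if watermark is None or r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in rows), default=watermark)
    return fresh, new_watermark

# First run (historical load): no watermark, so every row is picked up.
batch1, wm = incremental_load(SOURCE, None)
# Second run (incremental load): nothing changed, so nothing is re-extracted.
batch2, wm = incremental_load(SOURCE, wm)
```

The same pattern underlies change-data-capture and merge/upsert loads: only rows past the watermark are moved, keeping each run's cost proportional to the change volume rather than the table size.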
Big Data Technologies: Experience with technologies such as Apache Spark, Apache Beam, and Kafka.
DevOps & MLOps: Knowledge of DevOps methodologies, CI/CD pipelines, and MLOps practices, including integrating data pipelines with ML workflows.
Security & Compliance: Expertise in implementing Identity and Access Management (IAM) policies, ensuring data encryption, and adhering to data privacy regulations.
Analytical & Problem-Solving Skills: Demonstrated ability to analyze complex datasets, identify trends, debug issues, and optimize systems for performance and cost efficiency.
Communication & Collaboration: Excellent communication and teamwork skills, with the ability to collaborate effectively with technical and non-technical stakeholders in agile environments.
Visualization Tools: Experience with Qlik, Looker Studio, and Power BI.