Staff Data Engineer designing and improving ETL/ELT pipelines at Prosper. Collaborating with teams to use trusted data efficiently across internal and external systems.
Responsibilities
Work with engineers, DBAs, infrastructure and product teams, data engineers, and analysts to learn Prosper’s data ecosystem and keep it fast and secure.
Forge strong relationships with business stakeholders, analysts, and data scientists to grasp their needs and craft data solutions to meet them.
Design and run self-checking ETL/ELT pipelines with logging, alerting, and automated tests.
Develop pipelines on Google Cloud Platform (GCP) using Python, dbt, and Airflow (Composer), and additional tools as needed.
Evaluate new tools and approaches; bring forward practical ideas that improve speed, quality, or cost.
Bring curiosity, ownership, and clear thinking to tough engineering problems.
Requirements
Degree in Computer Science or related field, or equivalent experience.
8+ years of object-oriented programming in an enterprise setting. Deep experience in Python; experience with Java, C#, or Go is a plus.
Proficiency in a SQL dialect (e.g., BigQuery, T-SQL, Redshift, PostgreSQL), with an interest in dimensional modeling and data warehouses.
Solid Git/GitHub skills and familiarity with Agile and the SDLC.
Strong communication and collaboration skills across technical and non-technical teams.
DevOps experience with CI/CD, containers (Docker, Kubernetes), and infrastructure as code (Terraform or similar).
Proficient with LLM-assisted development in IDEs such as Cursor.
Commitment to an inclusive, learning-focused culture and continuous improvement.
Data Engineer supporting, developing, and maintaining a data analytics platform with agile delivery. Working with business and IT teams to leverage data technologies effectively.
Data Engineer at Domes Resorts responsible for designing data pipelines and supporting analytics. Joining the E-Commerce, Intelligence & Innovation team to drive business growth through data solutions.
VP Enterprise Data Architect managing scalable data solutions and architecture for Pacific Life. Leading teams and advancing an AI-first data ecosystem.
Senior Data Engineer at Red Hat designing and optimizing data solutions supporting sales and forecasting. Collaborating with teams and applying modern data engineering practices to ensure data quality.
Senior Data Engineer leading the design and implementation of data pipelines for NVIDIA’s analytics and monitoring systems. Collaborating across teams to enhance data ingestion and analysis capabilities.
Associate Data Engineer at Boeing India supporting API development and data migration with a focus on engineering and technology solutions. Involves working independently to gather requirements and support architecture for API services and data analytics.
Senior Data Engineer building and maintaining robust data pipelines for various data products at Beep Saúde. Collaborating within the team and leading data governance practices.
Software Developer in Test working on a cloud-based data platform at Tecsys. Ensuring quality and reliability of data pipelines and transformations using automation frameworks.
Data Engineer responsible for designing, building, and optimizing data pipelines and architectures in a tech environment. Requires extensive experience with modern data warehousing and cloud platforms.
Lead Data Engineer role at Brillio focusing on AI & Data Engineering with expertise in Azure and MS Fabric. Collaborate within the Data Engineering team in Pune, Maharashtra, India.