Responsibilities
The Data Engineer will join the Card team’s Data Engineering group to build pipelines that ingest card-domain data into Santander Brazil’s Corporate Data Lake and expose it for consumption.
The role sits within an agile team working on a strategic project and requires hands-on experience with Databricks and PySpark.
Requirements
Databricks proficiency: Experience working with Apache Spark on Databricks, including building and optimizing data pipelines.
PySpark, Python and Kedro experience: Strong programming skills in Python and PySpark, plus experience using Kedro to develop, debug and maintain data transformation code.
Batch and streaming data processing: Knowledge of batch and streaming (messaging) data processing, with the ability to design, implement and maintain data processing pipelines.
DevOps knowledge: Familiarity with Jenkins for continuous integration and continuous delivery (CI/CD), as well as automation of deployment tasks and pipeline management.
Git: Proficiency with Git for source code version control and effective team collaboration.
Agile methods: Understanding of agile principles and practices such as Kanban and Scrum for effective collaboration and project management.
Orchestration (e.g., Control‑M or similar): Knowledge of workflow orchestration tools for scheduling jobs, managing dependencies and monitoring pipeline runs.
Microsoft Azure knowledge: Experience with key Microsoft Azure data services, including Azure Databricks, Azure Data Factory and Azure Storage.
AWS knowledge: Experience with key AWS services such as Aurora PostgreSQL, CloudWatch, Lambda and S3.
On‑Premises environments (Cloudera) experience: Previous experience with the Cloudera platform or other on‑premises big data solutions, including Hadoop, HBase and Hive, is desirable.
Object‑oriented development knowledge: Familiarity with Java is helpful (not required to write code, but to interpret it).
Preferred certifications: AZ‑900 (Microsoft Azure Fundamentals) and DP‑900 (Microsoft Azure Data Fundamentals), which demonstrate solid knowledge of the Azure platform and data fundamentals.
Benefits
Bradesco Health Plan (30% co-payment)
Bradesco Dental Plan (no employee contribution)
Life Insurance
Wellhub (Gympass)
Childcare allowance
Allowance for children with special needs
Payroll‑deductible loan
Private pension
Pet plan
SESC benefits
Conexa telemedicine
Cost allowance
Meal / Food voucher
Multi‑benefits card
Medical plan upgrade
Differentials
We are a socially responsible employer: extended maternity and paternity leave
INMaterna Program: support program for pregnant employees
Newborn welcome kit and the book "It Happened When I Was Born"
Professional development: courses available through the internal university
Work model: 100% remote or hybrid, depending on the project.