Designs and maintains ETL/data pipelines for BASF Agricultural Solutions in Montevideo; ensures data quality, integrity, and analytics readiness while collaborating across teams.
Responsibilities
• Design, build, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into storage systems such as data warehouses or data lakes.
• Integrate data from different databases, applications, and external sources, ensuring data consistency, integrity, and accuracy.
• Create and maintain data models, schemas, and structures that facilitate efficient data storage and retrieval.
• Transform and clean raw data to prepare it for analysis, including handling missing values, data normalization, and aggregation.
• Implement data quality checks, validation processes, and error handling to maintain high data quality standards.
• Administer and optimize databases, including performance tuning, indexing, and data security.
• Automate ETL processes and routine tasks to improve efficiency and reduce manual intervention.
• Continuously monitor data pipelines and infrastructure for performance issues, errors, and data anomalies.
• Troubleshoot and resolve issues promptly.
• Stay up to date with the latest data engineering tools and technologies and evaluate their suitability for the organization's needs.
Requirements
• Education: Graduates or advanced students of Software Engineering, Computer Science, Telematics Engineering, Information Technology Analysis, or related degrees.
• Working experience: 3-5 years of experience in IT consulting or related fields.
• Language skills: Advanced English (spoken and written).
• Technical skills:
  – Advanced skills in understanding data requirements
  – Advanced skills in designing and implementing ETL/ELT data pipelines
  – Advanced skills in understanding data integrity requirements
  – Advanced skills in Databricks, SQL, and Azure Cloud (must)
• Soft skills:
  – Advanced skills in effective communication
  – Basic skills in project management
Benefits
• Hybrid and flexible working model
• Private health insurance for you and your family from your first day in the company, plus life insurance
• Flexible leave days in addition to your statutory leave
• The possibility of remote work from abroad for up to 6 weeks a year
• A free day on your birthday
• Wellness benefits, such as nutrition services and in-company massages
Senior Data Engineer at Capgemini designing and optimizing scalable data architectures on Databricks and GCP. Collaborating across teams to transform business needs into reliable technical solutions.
Data Engineer transforming legacy on-premises systems into cloud-native architectures for advanced data analytics. Collaborating with teams to build efficient data solutions using Python and AWS.
Data Engineering Academy focused on Snowflake and Databricks for professionals interested in expanding their technical capabilities. Fully remote with future office work in Monterrey or Saltillo after completion.
Senior Data Engineer at Intent HQ designing and scaling data platforms. Building high-impact intelligence from millions of customer insights with a focus on performance and reliability.
SAP Data Engineer supporting MERKUR GROUP's evolution into a data-driven company. Responsible for data integration, modeling, and collaboration with various departments in Group Finance.
Data Engineer at Booz Allen Hamilton organizing data and developing advanced technology solutions. Leading data engineering activities for mission-driven projects and mentoring multidisciplinary teams.
Senior Data Engineer at Bristol Myers Squibb developing scalable data pipelines for foundational products. Collaborating with data scientists and IT professionals to ensure data quality and accessibility.
Data Engineer II role focusing on developing and maintaining data pipelines for analytics. Collaborating with Data Science and Analytics teams to ensure data quality and reliability.
Senior Data Architecture Specialist designing and maintaining data integration solutions for Morgan Stanley. Involved in building data architecture and optimizing data storage using various technologies.