Junior Data Engineer working with healthcare data sources and collaborating with cross-functional teams to enhance data service capabilities. Developing and maintaining enterprise data management solutions for reporting and analysis.
Responsibilities
The Junior Data Engineer will work as part of the Data Management team to derive business value from enterprise data by implementing technical specifications provided by the Data Architect and Director of Data Operations & Engineering, including data storage, processing, transformation, ingestion, consumption, and automation.
This role will work with multiple healthcare data sources and cross-functional teams to establish integrated datasets across legacy and greenfield data systems/platforms.
Develop, implement, and maintain enterprise data management solutions to enable organizational business intelligence, reporting, visualization, and analysis.
Assist with the development, implementation, and maintenance of an overall organizational data strategy that aligns with business processes.
Design and build data processing flows to extract data from various sources, such as databases, API endpoints, and flat files.
Load data into data storage systems, specifically Microsoft SQL Server, MongoDB, and Snowflake.
Transform data using industry-standard techniques such as standardization, normalization, de-duplication, filtering, projection, and aggregation.
Build and maintain data processing environments, including hardware and software infrastructure.
Collaborate with data producers, consumers, and subject matter experts to ensure smooth dissemination and flow of data within the organization.
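The transformation techniques named above (standardization, de-duplication, filtering, projection, aggregation) can be sketched with the PyData stack listed in the requirements. A minimal, hypothetical pandas example; the column names and LOINC-style codes are illustrative, not taken from any actual Contexture dataset:

```python
import pandas as pd

# Hypothetical lab-result extract with inconsistent casing, a duplicate row,
# and a missing value.
raw = pd.DataFrame({
    "patient_id": [101, 101, 102, 103, 103],
    "code": ["loinc:4548-4", "LOINC:4548-4", "loinc:718-7", "loinc:718-7", "loinc:718-7"],
    "value": [5.4, 5.4, 13.2, 14.1, None],
})

# Standardization: normalize code casing so duplicates become detectable.
raw["code"] = raw["code"].str.lower()

# De-duplication and filtering: drop exact repeats and rows missing a value.
clean = raw.drop_duplicates().dropna(subset=["value"])

# Projection and aggregation: keep only the needed columns, then summarize per code.
summary = (
    clean[["code", "value"]]
    .groupby("code", as_index=False)
    .agg(n=("value", "size"), mean_value=("value", "mean"))
)
print(summary)
```

The same operations scale out largely unchanged when the DataFrame is swapped for a Dask DataFrame, which is one reason both appear in the required libraries.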
Requirements
2+ years of experience in data-related positions with increasing responsibility and scope of duties
2+ years working with relational databases
1+ years working with analytical data workloads
1+ years working with batch data processing technologies
Bachelor's Degree with a concentration in a data-related field such as Computer Science, Informatics, Mathematics, or Engineering is required; commensurate directly related work experience may substitute.
Demonstrated experience with relational and non-relational data storage models, schemas, and structures used in data lakes and warehouses for big data, business intelligence, reporting, visualization, and analytics
Hands-on experience with extract, transform, load (ETL) process design, data lifecycle management, metadata management, and data visualization/report generation
Practical experience with industry-accepted standards, best practices, and principles for implementing a well-designed enterprise data architecture.
Required Languages: Python and SQL
Required Libraries: PyData stack, Dask, and Prefect
Knowledge of healthcare interoperability standards such as HL7 (Health Level 7), FHIR (Fast Healthcare Interoperability Resources), CDA (Clinical Document Architecture), etc.
Knowledge of healthcare clinical code sets such as LOINC, SNOMED, CPT, ICD-10, etc.
Working knowledge of data flow orchestration tools such as Prefect and Airflow is preferred.
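Orchestration tools such as Prefect and Airflow run pipeline tasks in dependency order. As a toy, library-free sketch of that dependency-ordering idea (the task names are hypothetical, and this is not either tool's actual API), Python's standard-library `graphlib` can compute a valid execution order for a small DAG:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline graph: each task maps to the set of tasks it depends on.
pipeline = {
    "extract_claims": set(),
    "extract_labs": set(),
    "transform": {"extract_claims", "extract_labs"},
    "load_warehouse": {"transform"},
}

# graphlib (Python 3.9+) yields the tasks in an order that respects every edge:
# both extracts before transform, and transform before load_warehouse.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Prefect and Airflow add scheduling, retries, and observability on top of exactly this kind of dependency resolution.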
Benefits
Contexture provides a comprehensive benefits package. For details, please request a Benefit Summary from our Benefits Department.
Senior Data Engineer handling data engineering responsibilities in a hybrid setting for the banking industry. Collaborating with cross-functional teams and maintaining data quality in Azure environments.
Data Management professional at Kyndryl involved in creating innovative data solutions and ensuring the seamless operation of complex data systems. Collaborating with teams to transform requirements into scalable database solutions.
Software Engineer designing and developing scalable data processing applications on cloud infrastructure for Thomson Reuters. Collaborating with Data Analysts on AI-enabled solutions for data management and insight generation.
Manager of Data Platform overseeing AWS cloud infrastructure and Snowflake data warehouses for Thomson Reuters. Leading the design and implementation of data processing applications in a hybrid role located in Bengaluru.
Senior Data Engineer designing scalable data pipelines and solutions for Enterprise Data Lake at Thomson Reuters. Collaborating across teams to ensure efficient data ingestion and accessibility.
Senior Data Engineer at Technis developing scalable data pipelines and solutions for innovative connected spaces products. Collaborating within a cross-functional team to deliver high-quality, data-driven outcomes.
Data Architect designing and implementing data architectures supporting analytics and ML for federal clients. Collaborating with teams to translate mission needs into robust data solutions.
IT Data Engineer developing data pipelines and integrations for Scanfil Group's global IT organization. Collaborating across teams to enhance data solutions and reporting capabilities.
Data Engineer developing Azure data solutions at PwC New Zealand. Responsibilities include data quality monitoring, pipeline development, and collaboration with stakeholders in a supportive environment.