Data Engineer designing and maintaining scalable ETL pipelines at Satori Analytics. Collaborating with teams to deliver high-quality analytics solutions across various industries.
Responsibilities
**What Your Day Might Look Like:**
- Independently design and maintain scalable ETL pipelines within a collaborative project team, delivering clean, analytics- and AI-ready data.
- Work with SQL, Python, and PySpark, and tools like MS Fabric, Azure Data Factory, Databricks, and Snowflake to develop, optimize, and automate data processes.
- Design robust ETL processes, scalable data models, and optimized design patterns for analytics and BI workloads.
- Ensure data quality and data governance through automated checks, monitoring, source-to-target mapping, data lineage, and continuous improvement.
- Collaborate with cross-functional teams to understand data requirements and business semantics, and deliver high-quality solutions.
- Troubleshoot, optimize, and support production pipelines to keep data flowing smoothly.
- Use Git and Agile practices to work effectively in collaborative, iterative projects.
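The automated data-quality checks mentioned above can be sketched in plain Python; the `rows` batch, column names, and check names here are purely illustrative, not taken from any actual Satori pipeline:

```python
from datetime import date

# Hypothetical batch of rows, standing in for an extracted source table.
rows = [
    {"order_id": 1, "amount": 120.0, "order_date": date(2024, 5, 1)},
    {"order_id": 2, "amount": None,  "order_date": date(2024, 5, 2)},
    {"order_id": 3, "amount": 75.5,  "order_date": date(2024, 5, 2)},
]

def run_quality_checks(batch):
    """Return a dict of check name -> pass/fail for one batch."""
    return {
        "non_empty": len(batch) > 0,
        "unique_keys": len({r["order_id"] for r in batch}) == len(batch),
        "no_null_amounts": all(r["amount"] is not None for r in batch),
    }

results = run_quality_checks(rows)
failed = [name for name, ok in results.items() if not ok]
print(failed)  # the null amount on order 2 trips "no_null_amounts"
```

In a real pipeline, checks like these would run as a step inside the orchestrator (e.g. an Azure Data Factory or Databricks task) and fail the run or raise an alert on violation, rather than just printing.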
Requirements
**Your Superpowers 🚀**
- BSc or MSc in Computer Science, Engineering, or a similar field.
- Strong SQL, Python, and/or PySpark skills.
- Solid professional experience developing ETL pipelines (e.g. Azure Data Factory, Databricks) and modern data warehouses (e.g. MS Fabric or Databricks Delta Lakehouse, Snowflake).
- Solid professional experience with relational and NoSQL databases and systems (MS SQL Server, PostgreSQL, MongoDB, etc.).
- Strong understanding of data modelling and design patterns (star schema, data vault, SCD).
- Basic knowledge of cloud platforms (Azure, AWS, or GCP).
- Basic knowledge of visualization tools (Power BI, Tableau, Looker, etc.).
- Understanding of Agile practices and version control systems (GitHub, Azure DevOps).
- 3+ years’ experience in hands-on data engineering.
- Understanding of AI concepts and architectures.
- Experience with enterprise platforms like Salesforce, SAP, or Entersoft.
- Familiarity with ETL orchestration and transformation tools such as Airflow, dbt, Matillion, Fivetran, etc.
- Advanced knowledge of Power BI, Tableau, Qlik, etc.
- Exposure to Java or Scala, and OO/functional programming concepts.
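One of the design patterns listed above, the Type 2 slowly changing dimension (SCD), can be sketched in plain Python. The row layout and keys here are hypothetical; in a warehouse this logic would typically be a `MERGE` statement rather than hand-rolled code:

```python
from datetime import date

def scd2_apply(dimension, incoming, today):
    """Apply one source snapshot to an SCD Type 2 dimension.

    dimension: list of dicts with keys: key, attrs, valid_from, valid_to, current
    incoming:  dict mapping business key -> latest attrs from the source
    """
    by_key = {row["key"]: row for row in dimension if row["current"]}
    for key, attrs in incoming.items():
        current = by_key.get(key)
        if current is None:
            # New member: open a fresh current row.
            dimension.append({"key": key, "attrs": attrs,
                              "valid_from": today, "valid_to": None, "current": True})
        elif current["attrs"] != attrs:
            # Changed member: close the old row and version a new one.
            current["valid_to"] = today
            current["current"] = False
            dimension.append({"key": key, "attrs": attrs,
                              "valid_from": today, "valid_to": None, "current": True})
        # Unchanged member: no action; Type 2 only versions on change.
    return dimension

dim = [{"key": "C1", "attrs": {"city": "Athens"},
        "valid_from": date(2024, 1, 1), "valid_to": None, "current": True}]
dim = scd2_apply(dim, {"C1": {"city": "Patras"}}, date(2024, 6, 1))
# dim now holds two rows for C1: the closed Athens version and a current Patras one.
```

The key design choice Type 2 encodes is preserving history: instead of overwriting the changed attribute (Type 1), the old row is closed with a `valid_to` date and a new current row is appended.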
Benefits
**Perks on Perks:**
- Competitive salary and hybrid work model – come hang out in our Athens office or work remotely from anywhere in the European Economic Area (EU, Switzerland, etc.) or the UK (up to 6 weeks per year).
- Training budget to level up your skills with the top tech partners in the market (Microsoft, AWS, Salesforce, Databricks, etc.) – whether it’s certifications or courses, we’ve got you covered.
- Private insurance, top-tier tech gear, and the chance to work with a stellar crew.