Data Analytics Engineer responsible for designing scalable data pipelines at Ferryhopper. Focused on empowering data-driven decisions and optimizing data collection processes.
The role
As a Data Analytics Engineer, your main purpose is to leverage data to empower the organization to unlock value from multiple data sources.
You will contribute to the design of scalable data pipelines to support Ferryhopper’s growing data processing and analytics needs. Day to day, you will design and build data pipelines that collect, process, and store large volumes of data from various sources, producing clean, reliable datasets for reporting and analytics. You’ll need to be highly motivated, have a bias for action and a problem-solving mindset, and be curious and passionate about data and enabling data-driven decisions.
Responsibilities
Contribute to the design, build, and maintenance of data pipelines and workflows that deliver reliable, high-quality data across Ferryhopper’s data sources, helping to develop clean ingestion and transformation processes.
Assist in orchestration and monitoring of data flows to guarantee data integrity and freshness across the BI ecosystem.
Support development and maintenance of data quality checks, including audits, validation logic, and anomaly detection to ensure consistent and reliable data.
Continuously optimize data collection and reporting workflows, automating manual processes and improving efficiency wherever possible.
Enable analytics and reporting by delivering well-structured, foundational datasets that support dashboards, reporting, and modeling across the business.
Perform light data exploration and data serving activities to support analytical and business teams in understanding and utilizing available data.
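To give a flavor of the data quality and freshness work described above, here is a minimal, purely illustrative Python sketch of the kind of checks a pipeline might run before publishing a dataset. The field names (`booking_id`, `amount`) and the 24-hour freshness threshold are hypothetical examples, not Ferryhopper’s actual schema or rules.

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch only: simple freshness and validation checks of the
# kind a data quality step might run. Field names and thresholds below are
# hypothetical, not the actual production setup.

def is_fresh(last_loaded_at: datetime, max_lag_hours: int = 24) -> bool:
    """Return True if the latest load happened within the allowed lag window."""
    lag = datetime.now(timezone.utc) - last_loaded_at
    return lag <= timedelta(hours=max_lag_hours)

def validate_rows(rows: list[dict]) -> list[dict]:
    """Keep only rows that pass basic audit rules: a present key and a
    positive amount. Rejected rows would typically be logged for review."""
    return [r for r in rows if r.get("booking_id") and r.get("amount", 0) > 0]
```

In practice, checks like these are usually expressed declaratively (e.g., as dbt tests) rather than hand-rolled, but the underlying logic is the same: verify freshness and filter or flag records that violate audit rules.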
Requirements
2-3 years of experience as a Data Engineer, Analytics Engineer, or in a similar data-focused role.
Hands-on experience building analytical models and data pipelines that support scalable reporting and analytics, using modern transformation frameworks such as dbt.
Hands-on experience with data ingestion tools such as dlt, Airbyte, or Google Datastream.
Strong programming skills in SQL and Python.
Familiarity with modern cloud data platforms such as AWS or Google Cloud, with a focus on data storage and compute resources.
Strong communication and collaboration skills.
Excellent problem-solving and analytical skills.
Ability to work independently, manage multiple projects and priorities simultaneously, and meet deadlines with a high degree of accuracy and attention to detail.
A lifelong learner who is curious, passionate about solving hard, ill-defined problems, comfortable taking initiative, and continuously seeking to improve their skills and understanding. Enjoys working in a team environment as well as independently.
Nice to have:
Knowledge of coding best practices and software engineering principles, including version control, code review, testing, and documentation.
Familiarity with agile development methodologies, including scrum, Kanban, and agile project management tools.
Experience with orchestration and automation tools and frameworks (e.g., Dagster, Airflow) to schedule, monitor, build, and manage data workflows.
Tech stack:
SQL, Python, BigQuery, dbt, Google Cloud, AWS, S3, Lambda Functions, Power BI, Looker Studio
Benefits
The health of our company and the success of our products is directly related to the health of our team and the work environment we create for ourselves. With this in mind, we strive to provide an inclusive and positive working environment. In this respect, we offer:
A competitive compensation package
Equipment of your choice
Training and educational budget throughout the year
Joining a fast-growing ambitious international team
Fun team events and a vibrant company culture
Flexible working policy
***Remote policy: For teams located in Athens, the policy is to visit the office a minimum of once per week.***
***There are six weeks per year in which you can work from anywhere without visiting the office.***