Senior KDB Developer at GFT designing and deploying time-series systems across markets. Collaborating with Trading, Quant, and Platform teams in a fast-paced environment.
Responsibilities
Drive the design and development of real-time market data and analytics services on KDB+/q (GW/RDB/HDB/tickerplant, pub/sub, CEP)
Own technical delivery: requirements, solution design, coding, testing, deployment, and production support in low-latency environments
Collaborate with Traders, Quants, and Technology to deliver high-performance analytics (asof joins, windowed analytics, intraday aggregations)
Optimize for latency and throughput (IPC, memory layout, partitioning, attributes, OS tuning, NUMA, hugepages)
Implement and harden ETL pipelines for tick capture and reference data enrichment; integrate with Kafka and streaming services where applicable
Contribute to coding standards, code reviews, and test automation for q/KDB+ (including PyKX integration points)
Ensure observability (metrics, tracing, logging), production readiness, and on-call excellence
Risk & Compliance: appropriately assess risk in decisions, protect client reputation and assets, comply with applicable laws, regulations, and policies, apply sound ethical judgment, and escalate issues transparently
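For context on the asof joins referenced above: kdb+'s `aj` matches each row of one table to the most recent row of another at or before its timestamp (e.g. the prevailing quote for each trade). A minimal stdlib-Python sketch of those semantics, with illustrative names and data (not production code, and no kdb+ dependency assumed):

```python
import bisect

def asof_join(quotes, trades):
    """For each trade, attach the latest quote at or before the trade time.

    quotes and trades are (timestamp, payload) tuples sorted by timestamp,
    mirroring the behaviour of kdb+'s aj for a single sym.
    """
    times = [t for t, _ in quotes]
    joined = []
    for t, trade in trades:
        i = bisect.bisect_right(times, t) - 1     # index of last quote <= t
        quote = quotes[i][1] if i >= 0 else None  # None if no prior quote exists
        joined.append((t, trade, quote))
    return joined

quotes = [(1, 100.0), (5, 100.5), (9, 101.0)]
trades = [(4, "BUY"), (9, "SELL"), (12, "BUY")]
print(asof_join(quotes, trades))
# [(4, 'BUY', 100.0), (9, 'SELL', 101.0), (12, 'BUY', 101.0)]
```

In production q this is a single `aj[`sym`time; trades; quotes]`, which exploits the sorted/`` `p``-attributed on-disk layout rather than a per-row binary search.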
Requirements
Expertise in KDB+/q with production systems: GW/RDB/HDB design, sym/partition strategies, attributes, asof/aj/uj, IPC patterns
Strong skills in q/KDB+; working proficiency in Python/PyKX and ideally Java for integration services
Solid Linux/UNIX fundamentals (networking, OS tuning) and familiarity with TCP/IP, UDP, Multicast; knowledge of FIX/OUCH/ITCH preferred
Proven track record of profiling and optimizing for microsecond-level latency (e.g., vectorization, batching, zero-copy, mmap)
Strong debugging and production incident response; experience with agile delivery
Market knowledge is a plus: market microstructure, smart order routers (SORs), algorithmic trading systems
Nice to have: Containers/Kubernetes, CI/CD, cloud (AWS/Azure/GCP), secrets/entitlements, Terraform/Ansible
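The intraday aggregations mentioned in the role typically mean bucketing ticks into time bars (the `xbar` idiom in q). A stdlib-Python sketch of that pattern, with illustrative names and data:

```python
from collections import defaultdict

def time_bars(ticks, bucket=60):
    """Aggregate time-sorted (epoch_seconds, price, size) ticks into OHLCV bars.

    Analogous to the q idiom:
        select open:first price, high:max price, low:min price,
               close:last price, volume:sum size
        by bucket xbar time from trade
    """
    grouped = defaultdict(list)
    for ts, price, size in ticks:
        grouped[ts - ts % bucket].append((price, size))  # floor to bar start
    bars = {}
    for start, rows in sorted(grouped.items()):
        prices = [p for p, _ in rows]  # input order preserved within a bar
        bars[start] = {
            "open": prices[0],
            "high": max(prices),
            "low": min(prices),
            "close": prices[-1],
            "volume": sum(s for _, s in rows),
        }
    return bars

ticks = [(0, 10.0, 5), (30, 10.5, 2), (61, 9.9, 1)]
print(time_bars(ticks))
```

In kdb+ the same aggregation vectorizes over whole columns; the point of the sketch is only the bucketing and first/max/min/last/sum semantics.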
Benefits
Benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
Online training and certifications fitted to your career path
Access to e-learning platform Mindgram - a holistic mental health and wellbeing platform
Work From Anywhere (WFA) - the temporary option to work remotely outside of Poland for up to 140 days per year (including Italy, Spain, the UK, Germany, Portugal, and Bulgaria)