Senior Data Engineer
2/28/2026
The Senior Data Engineer will act as a technical leader, responsible for the architecture, scalability, and reliability of the high-throughput, real-time data ecosystem. This involves leading the design of fault-tolerant ETL/ELT pipelines and driving the strategy for real-time messaging using Kafka and RabbitMQ.
Working Hours
40 hours/week
Company Size
51-200 employees
Language
English
Visa Sponsorship
No
Founded in 2017, we are dedicated to fostering an ecosystem of seamless resource exchange, where efficiency and precision are paramount. With cutting-edge solutions, we empower businesses to thrive and individuals to unlock their full potential. Committed to high-tech innovation, we are actively reshaping the future, one byte at a time.
As a Senior Data Engineer, you will be a technical leader responsible for the architecture, scalability, and reliability of our high-throughput, real-time data ecosystem. You will oversee the evolution of our data infrastructure, leveraging Kafka, RabbitMQ, Airflow, and ClickHouse to power mission-critical financial analytics. Your role is to bridge the gap between complex business requirements and high-performance engineering, ensuring our data pipelines can handle the rigours of real-time financial data processing.
Responsibilities:
- Lead the design and evolution of highly scalable, fault-tolerant ETL/ELT pipelines.
- Drive the strategy for real-time messaging and stream processing using Kafka and RabbitMQ to ensure sub-second data availability.
- Act as the subject matter expert for ClickHouse, optimising complex schema designs, indexing strategies, and query performance for large-scale financial datasets.
- Oversee the deployment of data services within cloud environments, implementing advanced security protocols and data governance standards essential for the finance industry.
- Collaborate with senior leadership to align data strategy with business objectives, and mentor data engineers through code reviews and technical guidance.
- Implement advanced monitoring and automated recovery systems to ensure the integrity and quality of high-stakes financial data.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Proven experience in data engineering, with a strong background in designing and implementing ETL processes within cloud environments.
- Experience within the finance or trading technology sector, with a proven track record of handling real-time market or transactional data.
- Strong programming skills in Python, with experience in developing robust, maintainable, and scalable data processing pipelines.
- Extensive SQL knowledge and experience.
- Excellent problem-solving skills and the ability to work collaboratively in a team environment.
- Strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
Benefits:
- Hybrid working arrangement with 2 days of remote work per week
- Opportunities for career growth and development
- Complimentary snacks and beverages available in the office pantry
- Healthcare coverage (medical, dental, optical), gym benefits
- Flexibility in smart casual dress code
- Young, vibrant and open work culture
Please let Lifebyte Systems know you found this job on InterviewPal. This helps us grow!