
Senior Data Engineer

9/26/2025

The Senior Data Engineer is responsible for designing, developing, and maintaining data infrastructure and systems for efficient data processing. They collaborate with data scientists and analysts to create scalable and robust solutions.

Working Hours

40 hours/week

Company Size

10,001+ employees

Language

English

Visa Sponsorship

No

About The Company
HKT is a technology, media, and telecommunication leader with more than 150 years of history in Hong Kong. As the city’s true 5G provider, HKT connects businesses and people locally and globally. Our end-to-end enterprise solutions make us a market-leading digital transformation partner of choice for businesses, whereas our comprehensive connectivity and smart living offerings enrich people’s lives and cater for their diverse needs for work, entertainment, education, well-being, and even a sustainable low-carbon lifestyle. Together with our digital ventures which support digital economy development and help connect Hong Kong to the world as an international financial centre, HKT endeavours to contribute to smart city development and help our community tech forward. HKT is part of Pacific Century Group, named by Forbes as one of the World's Best Employers 2023. For more information, please visit www.hkt.com. LinkedIn: linkedin.com/company/hkt
About the Role

The Senior Data Engineer is responsible for designing, developing, and maintaining the data infrastructure and systems required for efficient and reliable data processing, working closely with data scientists, analysts, and other stakeholders to understand their requirements and translate them into scalable and robust solutions.

 

Key Responsibilities:

 

  1. Designing and implementing scalable data pipelines to collect, process, transform, and store large volumes of structured and unstructured data.

  2. Developing efficient ETL (Extract, Transform, Load) processes to ensure the availability of clean and accurate data for analysis (a brief illustrative sketch follows this list).

  3. Building and maintaining data warehouses or data lakes to enable easy access to structured datasets for reporting and analytics purposes.

  4. Collaborating with cross-functional teams to identify opportunities for improving data quality, reliability, performance, and efficiency.

  5. Implementing appropriate security measures to protect sensitive data from unauthorized access or breaches.

  6. Monitoring system performance, identifying bottlenecks or issues, and implementing optimizations to ensure high availability and scalability.

  7. Conducting thorough testing of developed solutions to ensure they meet functional requirements and performance expectations.

  8. Mentoring junior team members by providing guidance on best practices in software development, database design, and data engineering techniques.

  9. Staying up-to-date with emerging technologies in the field of big data processing, cloud computing, distributed systems, etc., and evaluating their potential applications in the organization's context.

  10. Collaborating with stakeholders to understand their business needs and translating them into technical requirements.
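
For illustration only, here is a minimal sketch of the kind of pipeline described in points 1 and 2, assuming PySpark (listed under the requirements below) and a cloud object store; the bucket paths, the events dataset, and its columns (event_id, event_ts) are hypothetical placeholders rather than details taken from this posting.

```python
# Minimal, hypothetical ETL sketch: extract raw CSV events, clean them,
# and load them as partitioned Parquet for downstream analytics.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw event data (placeholder path and schema).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/events/")

# Transform: deduplicate, parse timestamps, drop rows that fail parsing.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet into a data-lake layout (placeholder path).
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)

spark.stop()
```

In practice a pipeline like this would be scheduled and monitored, with the storage layout and schema driven by the reporting and analytics needs described in point 3.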

 

Requirements:

 

  1. Bachelor's or Master's degree in Computer Science, Engineering or a related field.

  2. Proven experience as a Data Engineer or similar role with a focus on building scalable data processing systems.

  3. Strong programming skills in languages such as Python, Java, or Scala, along with proficiency in SQL for querying databases (see the short example after this list).

  4. Experience working with big data technologies such as the Hadoop ecosystem (HDFS, MapReduce) and Apache Spark/PySpark for distributed computing.

  5. Proficiency in working with relational databases (e.g., MySQL) as well as NoSQL databases (e.g., MongoDB).

  6. Familiarity with cloud platforms such as GCP, AWS, or Azure for deploying scalable infrastructure using services like S3 and EC2 or their equivalents.

  7. Knowledge of containerization and orchestration technologies such as Docker and Kubernetes is desirable but not mandatory.

  8. Strong problem-solving skills with an ability to analyze complex datasets efficiently.

  9. Excellent communication skills to effectively collaborate with cross-functional teams.

  10. A strong understanding of data platforms, including their architecture, components, and functionality, together with proficiency in data integration, storage, processing, analytics, and visualization technologies.

  11. Knowledge of additional analytical languages such as R, beyond the SQL and Python skills noted above, is also beneficial.
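
As a small, self-contained illustration of the SQL-from-Python querying mentioned in point 3 above, the snippet below uses Python's standard-library sqlite3 module; the orders table and its columns are invented for the example and are not part of this posting.

```python
# Hypothetical example: run an aggregate SQL query from Python using the
# standard-library sqlite3 module (an in-memory database, for illustration).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("APAC", 120.0), ("APAC", 80.5), ("EMEA", 99.9)],
)

# The kind of aggregate query used for reporting on clean, structured data.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
):
    print(region, total)

conn.close()
```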

Key Skills
Data Engineering, ETL, Data Warehousing, Data Lakes, Big Data Technologies, Hadoop, Apache Spark, SQL, Python, Java, Scala, Cloud Platforms, NoSQL Databases, Docker, Kubernetes, Data Quality, Data Integration
Categories
Technology, Data & Analytics, Engineering, Software