Junior Data Engineer
5/11/2026
Maintain data pipelines and ETL processes to support reporting, analytics, and AI use cases. Collaborate with senior engineers to manage cloud data warehouses and ensure data quality through automated testing.
Working Hours
40 hours/week
Company Size
10,001+ employees
Language
English
Visa Sponsorship
No
Company Description
Entain India is the engineering and delivery powerhouse for Entain, one of the world's leading global sports and gaming groups. Established in Hyderabad in 2001, we've grown from a small tech hub into a dynamic force, delivering the latest software solutions and support services that power billions of transactions for millions of users worldwide.
Scale drives us to create technology that supports Entain's mission to lead the change in the global sports and gaming sector. At Entain India, we make the impossible possible, together.
This means that not only do you get to work for a dynamic organization delivering pioneering technology, gaming, and business solutions, but you can also have an exciting and entertaining career. At Entain India, Bright Minds Shine Brighter.
Job Description
Data intelligence and AI play a transformational role in how we create value today and how we shape our future. We have a bold ambition to be a world-class data- and AI-driven business for our customers across all our regions and departments.
You will work within our ASE Data & AI function. You will support teams by building and maintaining data infrastructure that enables efficient analytics as well as generative AI and ML applications. You will maintain data pipelines and systems that serve our diverse data consumers across the business, spanning our brands in the Americas and Southern Europe. By applying your data engineering skills, you'll enable data-driven decision-making throughout the organization, supporting business growth.
Important Responsibilities
- Help maintain data pipelines and Extract, Transform, and Load (ETL) processes to support reporting, analytics, engineering, and AI use cases.
- Work with internal and external data sources to ingest, process, and store datasets using existing connectors and APIs.
- Support the team in managing data within cloud data warehouse platforms such as Snowflake (or equivalent).
- Help develop data transformation workflows using dbt or similar tools.
- Help deploy data pipeline artifacts to production environments and support the team in managing releases.
- Help maintain CI/CD pipelines for data workflows, ensuring reliable and automated deployments.
- Write automated unit tests and data quality tests to support reliable deployments and ensure the correctness of data pipelines and transformations.
- Perform data validation checks and basic monitoring to help maintain data quality and reliability.
- Collaborate with senior data engineers and analysts to troubleshoot data pipeline issues and resolve data inconsistencies.
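To illustrate the kind of automated data quality test the responsibilities above describe, here is a minimal sketch in Python, assuming a pandas DataFrame as a pipeline output; the column names (`txn_id`, `amount`, `currency`) and helper functions are hypothetical, not part of the role's actual codebase:

```python
import pandas as pd

def check_no_nulls(df: pd.DataFrame, columns: list[str]) -> list[str]:
    """Return the subset of `columns` that contain null values."""
    return [c for c in columns if df[c].isnull().any()]

def check_unique_key(df: pd.DataFrame, key: str) -> bool:
    """True if `key` uniquely identifies every row (no duplicate keys)."""
    return df[key].is_unique

# Hypothetical transactions extract, validated before loading downstream.
df = pd.DataFrame({
    "txn_id": [1, 2, 3],
    "amount": [10.0, None, 7.5],
    "currency": ["EUR", "USD", "BRL"],
})

failed = check_no_nulls(df, ["txn_id", "amount", "currency"])
print(failed)  # ['amount'] — the only column with a null value
assert check_unique_key(df, "txn_id")
```

In practice, checks like these would run as part of a CI/CD pipeline or be expressed declaratively (for example, as dbt `not_null` and `unique` tests), so that a failing check blocks a bad deployment rather than surfacing later in a report.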
Qualifications
Essential:
- Bachelor's degree in Computer Science, Software Engineering, Data Science, or a related field, or equivalent practical experience.
- Working knowledge of SQL for querying and transforming data
- Basic to intermediate experience with Python for data processing and scripting
- Familiarity with cloud data warehouse platforms such as Snowflake, BigQuery, or similar
- Exposure to data transformation or orchestration tools such as dbt, Airflow, or similar
- Experience with data pipeline concepts
- Familiarity with Git or other version control systems
- Ability to work in a collaborative team environment
- Basic familiarity with data visualization tools such as Power BI or Tableau
- Awareness of data quality validation or monitoring tools
- Experience with data governance and data management principles
- Exposure to event-driven or streaming data systems (Kafka, Kinesis)
- Familiarity with containerization tools such as Docker
- Exposure to CI/CD workflows (GitHub Actions, GitLab CI, Jenkins)
- Interest in modern data engineering practices and AI/ML data workflows
- Basic familiarity with DevOps concepts, including CI/CD pipelines, deployment workflows, and automated testing practices
- Understanding of MLOps concepts, including how data pipelines support machine learning model training, versioning, and deployment workflows
- Familiarity with AI/LLM-based systems, such as RAG, embeddings, or vector databases
We're looking for someone who
- Communicates technical concepts clearly to partners of all backgrounds
- Demonstrates adaptability when working with the latest technologies and changing requirements
- Strives to improve system performance and identify solutions
- Balances quality with delivery to meet business needs
- Collaborates across diverse teams, especially with data engineers and AI SMEs
- Takes ownership of assigned projects from concept through implementation and maintenance
- Shows curiosity about AI advancements and applies relevant insights to solve business problems
- Sets goals and standards that motivate the team
- Collaborates and works well with others
Additional Information
At Entain India, we know that signing top players requires a great starting package, and plenty of support to inspire your best. Join us, and a great compensation package is just the beginning. Working for us you can expect to receive great benefits like:
- Safe home pickup and home drop (Hyderabad Office Only)
- Group Mediclaim policy
- Group Critical Illness policy
- Communication & Relocation allowance
- Annual Health check
And outside of this, you'll have the chance to turn recognition from leaders and colleagues into amazing prizes. Join a winning team of experienced people and be part of an inclusive and supportive community where everyone is celebrated for being themselves.
Should you need any adjustments or accommodations to the recruitment process, at either application or interview, please contact us.