Full-time
5-10

Senior DataOps (DevOps) Engineer

1/11/2026

The Senior DataOps Engineer will architect and maintain scalable cloud-based data infrastructure while collaborating closely with Data Engineering to operationalize new pipelines and frameworks. They will also implement monitoring and alerting systems, ensuring reliability and performance across production data workflows.

Working Hours

40 hours/week

Company Size

201-500 employees

Language

English

Visa Sponsorship

No

About the Company
Tango is a social live-streaming app with over 400 million downloads worldwide. The home for creators around the world, Tango helps people unlock their creativity and push the live-streaming experience to the limit. With an all-in-one platform and digital economy, we are the premier destination for meeting new people, exploring live-streaming content, and discovering new creators and entertainment in real time. Tango has global offices in Kyiv, Limassol, Warsaw, Dubai, and Tel Aviv.
About the Role

Tango is a successful market leader: a live-streaming platform with 450+ million registered users, in an industry projected to reach $240 billion within the next few years.

The B2C platform, built on best-in-class global video technology, allows millions of talented people around the world to create their own live content, engage with their fans, and monetize their talents.

Tango’s live-streaming platform launched in 2018 and is powered by 500+ global employees operating in a culture of growth, learning, and success!

The Tango team is a vigorous cocktail of hard workers, creative brains, energizers, geeks, overachievers, athletes, and more. We push the limits to take our app from “one of the top” to “the leader”.

The best way to describe Tango’s work style is that we don’t use the word “impossible”. We believe that success is a thorny path that runs through sleepless nights, corporate parties, tough releases, and, of course, our users’ smiles (and as we are a LIVE app, we truly get to see our users all around the world smiling right in front of us in real time!).

Do you want to join the party?


Responsibilities

  • Architect and maintain scalable cloud-based data infrastructure (compute, storage, orchestration, messaging, workflow management).
  • Collaborate closely with Data Engineering to operationalize new pipelines, frameworks, and data models.
  • Implement infrastructure-as-code (e.g., Terraform) to ensure consistent, automated environment provisioning.
  • Develop internal tooling to support deployment automation, testing frameworks, and pipeline lifecycle management.
  • Own reliability, uptime, and performance across all production data workflows.
  • Implement monitoring, alerting, logging, and traceability using modern observability platforms.
  • Champion data quality, lineage tracking, and automated validation frameworks.
  • Lead incident response, root-cause analysis, and postmortems for pipeline or platform issues.
  • Work daily with data engineers, analysts, platform engineers, and stakeholders to improve reliability and developer experience.
  • Lead architectural reviews and guide teams in adopting DataOps best practices.
  • Mentor junior engineers and contribute to long-term data platform strategy.
  • Maintain clear, consistent documentation of operational processes, infrastructure components, and standards.

Requirements

  • 3-5+ years in DataOps, DevOps, Platform Engineering, or similar roles.
  • Strong hands-on experience with modern cloud data ecosystems (GCP, AWS, Azure).
  • Deep understanding of:
      • Distributed systems and data pipelines
      • Orchestration frameworks (e.g., Airflow, Cloud Composer)
      • Streaming and messaging systems (e.g., Kafka, Pub/Sub)
      • Batch and stream processing frameworks (e.g., Apache Beam, Spark, Flink)
      • Infrastructure-as-code (Terraform), containers (Docker), and CI/CD tooling
      • Python and SQL for automation and data workflow integration
  • Experience operating production-grade data platforms with a strong focus on SLAs, reliability, and cost optimization.

Nice to Have

  • Google Cloud Platform experience (especially BigQuery, Dataflow, Pub/Sub, Dataplex, or Cloud Composer) is a significant plus.
  • Experience with BI platforms such as Looker.
  • Familiarity with ML Ops/model lifecycle management.
  • Real-time data processing experience with Kafka, Flink, or similar.
  • Expertise in cost optimization and performance tuning for cloud-based data warehouses.


Key Skills
DataOps, DevOps, Platform Engineering, Cloud Data Ecosystems, Distributed Systems, Data Pipelines, Orchestration Frameworks, Streaming Systems, Batch Processing, Infrastructure-as-Code, Containers, CI/CD Tooling, Python, SQL, Monitoring, Data Quality
Categories
Technology, Data & Analytics, Software Engineering