Employment Type

Full-time

Experience

5-10 years

DevOps Engineer (Data Platform Group)

1/5/2026

The DevOps Engineer will provide strategic direction for data architecture and manage high-performance data-driven projects. They will also implement DevOps practices to streamline development and operations.

Working Hours

40 hours/week

Company Size

201-500 employees

Language

English

Visa Sponsorship

No

About The Company
BioCatch prevents financial crime by recognizing patterns in human behavior. We continuously collect more than 3,000 anonymized data points – keystroke and mouse activity, touch screen behavior, physical device attributes, and more – as people interact with their digital banking platforms. With these inputs, our machine-learning models reveal patterns in user behavior and provide device intelligence that, together, distinguish the criminal from the legitimate. Today, more than 30 of the world's largest 100 banks and 287 total financial institutions deploy our solutions, analyzing 16 billion user sessions per month and protecting 532 million people around the world from fraud and financial crime. Fraud is incessant, pervasive, and ever-evolving. It’s relentless. And that's why, at BioCatch, we fight to make banking safer every day.
About the Role

BioCatch is the leader in Behavioral Biometrics, a technology that leverages machine learning to analyze an online user’s physical and cognitive digital behavior to protect individuals online. BioCatch’s mission is to unlock the power of behavior and deliver actionable insights to create a digital world where identity, trust, and ease coexist. Today, 32 of the world's largest 100 banks and 210 total financial institutions rely on BioCatch Connect™ to combat fraud, facilitate digital transformation, and grow customer relationships. BioCatch’s Client Innovation Board, an industry-led initiative including American Express, Barclays, Citi Ventures, and National Australia Bank, helps BioCatch identify creative and cutting-edge ways to leverage the unique attributes of behavior for fraud prevention. With over a decade of analyzing data, more than 80 registered patents, and unparalleled experience, BioCatch continues to innovate to solve tomorrow’s problems. For more information, please visit www.biocatch.com.

Main responsibilities: 

  • Data Architecture Direction: Provide strategic direction for our data architecture, selecting the appropriate components for various tasks. Collaborate on requirements and make final decisions on system design and implementation.
  • Project Management: Manage end-to-end execution of high-performance, large-scale data-driven projects, including design, implementation, and ongoing maintenance. 
  • Cost Optimization: Monitor and optimize cloud costs associated with data infrastructure and processes. 
  • Efficiency and Reliability: Design and build monitoring tools to ensure the efficiency, reliability, and performance of data processes and systems. 
  • DevOps Integration: Implement and manage DevOps practices to streamline development and operations, focusing on infrastructure automation, continuous integration/continuous deployment (CI/CD) pipelines, containerization, orchestration, and infrastructure as code. Ensure scalable, reliable, and efficient deployment processes. 
  • Our stack: Azure, GCP, Kubernetes, ArgoCD, Jenkins, Databricks, Snowflake, Airflow, RDBMS, Spark, Kafka, microservices, Bash, Python, SQL (see the illustrative sketch below).
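
For illustration only, here is a minimal sketch of the kind of pipeline orchestration this stack implies: an Airflow DAG (Python) that submits a Spark batch job and then loads its output into the warehouse. It assumes Airflow 2.4+; the DAG ID, file paths, and scripts (aggregate_sessions.py, load_to_warehouse.py) are hypothetical placeholders, not taken from the posting or BioCatch's actual codebase.

    # Minimal illustrative Airflow DAG (assumes Airflow 2.4+). All names and
    # paths below are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_sessions_aggregation",  # hypothetical pipeline name
        start_date=datetime(2026, 1, 1),
        schedule="@daily",                    # 'schedule' requires Airflow 2.4+
        catchup=False,
    ) as dag:
        # Submit a Spark batch job for the execution date ({{ ds }} is Airflow's
        # built-in date template variable).
        aggregate_sessions = BashOperator(
            task_id="aggregate_sessions",
            bash_command="spark-submit /opt/jobs/aggregate_sessions.py --date {{ ds }}",
        )

        # Load the aggregated output into the warehouse (placeholder script).
        load_to_warehouse = BashOperator(
            task_id="load_to_warehouse",
            bash_command="python /opt/jobs/load_to_warehouse.py --date {{ ds }}",
        )

        # Run the load step only after the Spark job succeeds.
        aggregate_sessions >> load_to_warehouse

In practice such a DAG would be deployed through the CI/CD and GitOps tooling named above (Jenkins, ArgoCD); that wiring is omitted here.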



Requirements

  • 5+ Years of Experience: Demonstrated experience as a DevOps professional with a strong focus on big data environments, or as a Data Engineer with strong DevOps skills.
  • Data Components Management: Experience managing and designing data infrastructure such as Snowflake, PostgreSQL, Kafka, Aerospike, and object stores.
  • DevOps Expertise: Proven experience creating, establishing, and managing big data tools, including automation tasks. Extensive knowledge of DevOps concepts and tools, including Docker, Kubernetes, Terraform, ArgoCD, Linux, networking, load balancing, Nginx, etc.
  • Programming Skills: Proficiency in Python and object-oriented programming (OOP), with an emphasis on big data processing (e.g., PySpark). Experience with shell scripting (e.g., Bash) for automation tasks.
  • Cloud Platforms: Hands-on experience with major cloud providers such as Azure, Google Cloud, or AWS. 

Preferred Qualifications: 

  • Performance Optimization: Experience in optimizing performance for big data tools and pipelines – Big Advantage.
  • Security Expertise: Experience in identifying and addressing security vulnerabilities within the data platform – Big Advantage.
  • CI/CD Pipelines: Experience designing, implementing, and maintaining Continuous Integration/Continuous Deployment (CI/CD) pipelines – Advantage.
  • Data Pipelines: Experience in building big data pipelines – Advantage.


Key Skills
DevOps, Data Architecture, Project Management, Cost Optimization, Efficiency, Reliability, Cloud Platforms, Programming, Automation, CI/CD, Big Data, Monitoring, Security, Containerization, Orchestration, Infrastructure
Categories
Technology, Data & Analytics, Software Engineering