Job Description

Role Overview

This role is critical in ensuring the reliability, scalability, and compliance of data pipelines that support surveillance systems across communications and trading activities, covering both structured and unstructured data.

The role bridges engineering and operations, enabling robust data ingestion, transformation, and monitoring to meet regulatory and internal compliance requirements. The Data Ops Engineer will collaborate with upstream teams to ensure that data completeness, accuracy, and timeliness meet expectations and that any data quality or completeness issues are made visible.

The role will also contribute to other Surveillance data initiatives, such as persisting Surveillance Alerts in the firm’s data lake for analytics purposes.

Role Responsibilities

  • Design, build, maintain and optimise end-to-end data pipelines and workflows between source data points and target destinations, working with the wider Surveillance Technology team to place automation, scalability and strategy at the heart of the design.
  • Implement automated data quality and completeness checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and timeliness of ingested data, and to surface any data that is not processed.
  • Identify Critical Data Elements and implement failover and recovery strategies for the respective Data Flows.
  • Build AWS infrastructure using Terraform or CDK.
  • Write unit, integration, and infrastructure tests.
  • Monitor, investigate and resolve data anomalies through collaboration with Business Analysts, Developers, and Testers across functions and verticals.
  • Implement data management and governance frameworks to ensure data is ingested and loaded per the requirements of the consuming platform: Scila for Trade Surveillance and Global Relay for Communications Surveillance.
  • Partner with the Data Strategy and Data Infrastructure teams to ensure data lineage, auditability and retention policies are enforced across all necessary pipelines.
  • Ensure that data consumed and processed complies with regulatory, legal, and security protocols.
  • Work closely with surveillance analysts, compliance officers, and engineering teams to translate business rules into technical specifications.
  • Partner closely with stakeholders and subject matter experts, such as the Cloud Infrastructure team, to optimise performance and costs.
  • Stay updated on industry trends and emerging tech to ensure continuous improvement.

Experience / Competences

Essential:

  • Strong experience building ETL/ELT data pipelines, from design through implementation to maintenance, for financial market messaging platforms and trade & order systems.
  • Solid understanding of CI/CD pipelines, ideally with a background in software engineering, product management or data analytics.
  • Experience with some of the following: EKS, Lambda, EventBridge, Step Functions, S3, DynamoDB, AWS Glue, Snowflake, Terraform and Transfer Family.
  • Strong proficiency in Python or Java, SQL, and data pipeline frameworks (e.g., Airflow, dbt, Spark), with solid experience with the AWS ecosystem.
  • Proven expertise with data governance frameworks and compliance regulations in financial services.
  • Knowledge of streaming technologies (Kafka, Kinesis) and API integrations, and hands-on experience with monitoring tools (e.g. Grafana) and observability practices.
  • Excellent problem-solving skills and ability to work in a fast-paced environment.
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or related field.
  • Previous experience in Data Ops and Data Engineering.
  • Strong communication and collaboration skills to engage with technical and non-technical stakeholders.
  • Strong experience with Agile software delivery.

Desired:

  • Experience with market data ingestion, metadata extraction, and event-driven architectures.
  • Proficient with Terraform or CDK (infrastructure-as-code).
  • Experience in Business Communications Technology e.g. Bloomberg, ICE, Symphony, Teams Chat, etc.
  • Familiarity with security best practices, IAM, and VPN configuration.
  • Experience with regulatory compliance and data security in financial services.
  • Knowledge of financial markets and trading platforms.
  • Experience with GitLab, Qlik Sense & Alation.
  • Certifications in DataOps, cloud platforms, or related areas.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: Philippines
City: Taguig, National Capital Region
Company Website: http://www.cloud-bridge.co.uk
Job Function: Data Science & AI
Company Industry/Sector: IT Services and IT Consulting

About the Company

Searching, interviewing and hiring are all part of professional life. The idea behind the TALENTMATE portal is to help professionals with each of these by bringing the requisites together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


