Job Description

Job Purpose

Design and build reusable Python frameworks that span front‑end integrations, backend/middle‑tier services, and cloud execution. You'll translate complex post‑trade data flows into robust data models, stored procedures, and regression automation, and own CI/CD, including scripts that run AutoSys/batch jobs and stored procedures across environments. The role also demands strong data‑quality ownership: verifying data accuracy, completeness, duplication handling, referential integrity, and conformance to business rules and transformation logic; reviewing and validating schema structures, data types, constraints, metadata, and lineage; designing and executing data‑validation test cases; and performing detailed source‑to‑target comparisons using SQL or automation frameworks, identifying data anomalies and documenting defects.

The ideal candidate brings pragmatic engineering discipline, capital‑markets data intuition, and a builder’s mindset.

Key Responsibilities

Python engineering & reusable frameworks

  • Build modular Python packages (data processing, API clients, orchestration adapters), publishable to internal artifact repositories; enforce code quality, testing, and documentation standards.
  • Develop backend services/APIs (e.g., FastAPI/Flask) and CLI tools to support front‑end, middle‑tier, and cloud workflows; implement resilient error handling, observability hooks, and secure secrets usage.

Data modeling, SQL & stored procedures

  • Design relational schemas and write/optimize complex SQL (windowing, CTEs, partitioning); author and refactor stored procedures (SQL Server/Oracle/Postgres) with attention to edge cases and performance.
  • Build data‑validation utilities that compare large datasets across environments and produce diffs for regression packs.

Post‑trade domain, test automation & regression

  • Map post‑trade flows (allocations, clearing/settlement, confirmations, reconciliations) into datasets, rules, and assertions for repeatable regression.
  • Read and translate stored‑proc logic into automated test scripts; build a central repository of reusable checks integrated into CI/CD.
  • Nice to have: familiarity with trade/blotter platforms (e.g., ION) or similar post‑trade systems.

CI/CD, DevOps & environment promotion

  • Design and operate multi‑stage CI/CD pipelines (Azure DevOps/Jenkins/GitLab) for code, data artifacts, and SQL deployables; implement approvals, rollbacks, environment strategy, and quality gates (lint, SAST/DAST, tests).
  • Containerize services where appropriate; integrate with AKS/Kubernetes or serverless jobs, and wire up metrics/alerts for runtime health.

Scheduling, batch & operationalization

  • Script the execution of AutoSys/batch jobs and stored procedures across dev/UAT/prod; add run‑books, logging, and guardrails; enable reliable, auditable promotions through environments.

Data validation

  • Perform comprehensive data validation across all stages of the data lifecycle.
  • Validate data accuracy, consistency, completeness, timeliness, and adherence to business rules and transformation logic.
  • Identify and analyze data anomalies, integrity issues, schema mismatches, duplicate records, null handling problems, and referential integrity violations.
  • Execute SQL queries to validate data rules, transformation outputs, and pipeline results; independently troubleshoot data discrepancies and log defects with clear impact analysis.
  • Design and maintain reusable test cases, validation scenarios, and automated data verification scripts where applicable.
  • Validate metadata, schema structures, data types, constraints, primary/foreign key relationships, and data lineage compliance.
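The source‑to‑target comparison and diff utilities described above can be sketched with pandas. This is a minimal illustration only; the `trade_id`/`qty` column names in the usage below are hypothetical, not a prescribed schema:

```python
"""Sketch of a source-to-target comparison utility for regression packs.

Assumes both sides fit in memory as pandas DataFrames keyed by a single
business key; `trade_id` and `qty` used in tests are hypothetical names.
"""
import pandas as pd


def compare_datasets(source: pd.DataFrame, target: pd.DataFrame, key: str) -> dict:
    """Return keys missing on either side plus cell-level mismatches."""
    merged = source.merge(
        target, on=key, how="outer", suffixes=("_src", "_tgt"), indicator=True
    )
    missing_in_target = merged.loc[merged["_merge"] == "left_only", key].tolist()
    missing_in_source = merged.loc[merged["_merge"] == "right_only", key].tolist()

    both = merged[merged["_merge"] == "both"]
    mismatches = []
    for col in source.columns:
        if col == key:
            continue
        # Note: NaN != NaN, so null-vs-null cells are also flagged by this sketch.
        diff = both[both[f"{col}_src"] != both[f"{col}_tgt"]]
        for _, row in diff.iterrows():
            mismatches.append((row[key], col, row[f"{col}_src"], row[f"{col}_tgt"]))

    return {
        "missing_in_target": missing_in_target,
        "missing_in_source": missing_in_source,
        "mismatches": mismatches,
    }
```

In a real regression pack, these diffs would feed defect logging with impact analysis rather than being returned to the caller directly.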

Key Competencies

5–7 years of hands‑on software engineering with Python (incl. packaging, virtual environments, unit/integration testing); strong use of libraries such as pandas, SQLAlchemy/pyodbc, and asyncio/Celery for pipelines and services.

Expert SQL skills and stored‑proc development (SQL Server/Oracle/Postgres), query tuning, and execution‑plan analysis for large datasets.
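As a concrete illustration of the windowing/CTE patterns mentioned above, a duplicate‑row check can be expressed with `ROW_NUMBER()`. The sketch below runs it against in‑memory SQLite (3.25+ supports window functions) purely so it is self‑contained; the `trades` table and its columns are hypothetical names:

```python
"""Illustrative window-function query for flagging duplicate rows.

Runs against in-memory SQLite so the sketch is self-contained; the
`trades` table and its columns are hypothetical, not a real schema.
"""
import sqlite3

DDL = "CREATE TABLE trades (trade_id INTEGER, account TEXT, qty INTEGER)"
ROWS = [(1, "ACC1", 100), (1, "ACC1", 100), (2, "ACC2", 50)]

# CTE + ROW_NUMBER(): any row numbered > 1 within its business key
# is a duplicate candidate for the regression report.
DUP_QUERY = """
WITH ranked AS (
    SELECT trade_id, account, qty,
           ROW_NUMBER() OVER (
               PARTITION BY trade_id, account, qty
               ORDER BY trade_id
           ) AS rn
    FROM trades
)
SELECT trade_id, account, qty FROM ranked WHERE rn > 1
"""


def find_duplicates(rows):
    """Load rows into a scratch table and return the duplicate candidates."""
    with sqlite3.connect(":memory:") as conn:
        conn.execute(DDL)
        conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
        return conn.execute(DUP_QUERY).fetchall()
```

The same `PARTITION BY` pattern carries over to SQL Server, Oracle, and Postgres; only the scratch-table plumbing differs.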

Proven experience designing CI/CD pipelines and automating promotion (code + data + DB objects) with Azure DevOps/Jenkins/GitLab; strong Git practices and code‑review hygiene.

Comfort with schedulers (AutoSys/Control‑M/Airflow) and shell/Python scripting for batch orchestration; familiarity with secrets management and environment configuration.
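The scheduler scripting described above might look like the sketch below. `sendevent`/`autorep` are standard AutoSys CLI tools, but the environment‑prefixed job‑naming convention and the allow‑list are illustrative assumptions, and the guardrail against accidental prod runs is one possible design:

```python
"""Sketch of a guardrailed wrapper for force-starting AutoSys jobs.

`sendevent` is the standard AutoSys CLI; the ENV-prefixed job-naming
convention and the environment allow-list below are assumptions.
"""
import subprocess

ALLOWED_ENVS = {"dev", "uat", "prod"}


def build_force_start(env: str, job: str, confirm_prod: bool = False) -> list:
    """Build the sendevent command; refuse prod unless explicitly confirmed."""
    if env not in ALLOWED_ENVS:
        raise ValueError(f"unknown environment: {env}")
    if env == "prod" and not confirm_prod:
        raise PermissionError("prod run requires confirm_prod=True")
    # Hypothetical convention: job names carry the environment as a prefix.
    return ["sendevent", "-E", "FORCE_STARTJOB", "-J", f"{env.upper()}_{job}"]


def run_job(env: str, job: str, confirm_prod: bool = False) -> int:
    """Execute the job start; a real run-book wrapper would also audit-log."""
    cmd = build_force_start(env, job, confirm_prod)
    return subprocess.run(cmd, check=True).returncode
```

Separating command construction from execution keeps the guardrails unit‑testable without a live scheduler.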

Domain understanding of post‑trade data flows and how to encode them into repeatable regression checks.

Perform end‑to‑end data validation across the lifecycle, ensuring accuracy, completeness, consistency, referential integrity, and adherence to business rules and transformation logic.

Execute SQL‑based source‑to‑target checks, validate schemas (data types, constraints, metadata, lineage), and identify anomalies such as duplicates, null issues, and mismatches; investigate and log defects with clear impact analysis.

Design and maintain reusable data‑validation test cases and automated verification scripts to support scalable, repeatable quality checks.
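Reusable validation test cases like those described above are often expressed as a registry of named check functions. This is a minimal sketch under that assumption; the check names, rules, and the `trade_id` key are all illustrative:

```python
"""Minimal sketch of a reusable data-validation check registry.

Each check takes a list of record dicts and returns a list of failure
messages; the check names and rules here are illustrative only.
"""

CHECKS = {}


def check(name):
    """Decorator that registers a validation check under a stable name."""
    def wrap(fn):
        CHECKS[name] = fn
        return fn
    return wrap


@check("no_null_keys")
def no_null_keys(rows, key="trade_id"):
    """Completeness: every record must carry the business key."""
    return [f"null {key} at row {i}" for i, r in enumerate(rows) if r.get(key) is None]


@check("unique_keys")
def unique_keys(rows, key="trade_id"):
    """Duplication handling: the business key must be unique."""
    seen, fails = set(), []
    for r in rows:
        k = r.get(key)
        if k in seen:
            fails.append(f"duplicate {key}: {k}")
        seen.add(k)
    return fails


def run_checks(rows):
    """Run every registered check; return {check_name: failure messages}."""
    return {name: fn(rows) for name, fn in CHECKS.items()}
```

New checks join the regression pack by decoration alone, which keeps the suite scalable and repeatable.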

Good analytical skills to understand post‑trade or financial datasets and translate them into repeatable validation scenarios (Capital Markets domain knowledge is good to have).

Experience working with test‑case management tools (e.g., Jira, Xray) and following agile development practices.

Ability to contribute to QA processes, including manual testing, data validation, and regression testing for critical applications.

Strong attention to detail, good communication skills, and a willingness to learn and grow within a data engineering/testing environment.

Good to have

Cloud exposure (Azure/AWS), containers/Kubernetes, infrastructure as code (Bicep/Terraform), and DevSecOps gates (Sonar/Mend/ZAP).

Experience with capital‑markets platforms (e.g., ION blotter integrations) and messaging/API patterns in trading data stacks.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Pune
Division: Maharashtra
Company Website: https://www.acuityanalytics.com/
Job Function: Information Technology (IT)
Company Industry/Sector: Banking And Real Estate

About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal brings these requisites together under one roof to help professionals with each of them. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend a helping hand.


Disclaimer: talentmate.com is only a platform to bring jobseekers & employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

