Job Description

Location Name: NR Trident Tech Park

Job Purpose

The Senior Database Developer will design, build, and optimize the data backbone of the Engineering CRM platform. This role focuses on architecting PostgreSQL databases, building robust data pipelines using Azure Data Factory and Databricks, and ensuring seamless two-way integration with HRMS and other systems. The purpose of this role is to ensure high-quality data availability, reliability, and performance across CRM modules, enabling smooth workflows for all internal users.

Duties And Responsibilities

  • Database Design & Optimization (PostgreSQL)

o Design and maintain scalable, normalized database schemas for the Engineering CRM.
o Develop efficient SQL queries, views, stored procedures, and functions.
o Optimize indexes, partitioning, and query execution plans for high performance.
o Manage versioning, migrations, and schema changes across environments.
o Ensure data security, access controls, and role-based permissions at the database level.

  • Data Pipeline Development (Azure Data Factory & Databricks)

o Create and manage ETL/ELT pipelines using Azure Data Factory (ADF).
o Develop transformation logic, notebooks, and processing workflows using Azure Databricks (PySpark/SQL).
o Set up data ingestion from APIs, files, databases, and internal systems.
o Ensure pipelines are resilient, fault-tolerant, and optimized for performance and cost.

  • System Integrations (Employee Master / HRMS Sync)

o Build and maintain two-way data sync processes between the CRM and HRMS/employee master systems.
o Implement delta sync logic to avoid redundant or duplicate data movement (see the sketch after this list).
o Ensure accurate mapping of employee attributes, hierarchies, and organizational structures.
o Collaborate with application, HR, and infrastructure teams to resolve discrepancies and maintain master data integrity.

  • Data Quality, Governance & Monitoring

o Implement validation rules, data quality checks, and cleansing procedures.
o Set up monitoring, alerts, and logging for pipelines using Azure tools.
o Conduct root-cause analysis for integration or data quality issues.
o Maintain data dictionaries, lineage documentation, and metadata repositories.

  • Collaboration & Delivery

o Work closely with API developers, PMO/BA, QA, UI/UX, and product teams.
o Translate feature requirements into database design and data architecture.
o Support testing cycles with test data creation, query validation, and data fixes.
o Participate in sprint ceremonies and assist with estimates and technical planning.
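
As referenced above, a minimal sketch of the kind of delta-sync upsert this role involves, written in Python with psycopg2. Every table and column name here (hrms_employee_staging, crm_employee, sync_watermark) is a hypothetical placeholder for illustration, not an object from the actual platform.

    # Hypothetical delta sync: move only rows changed since the last run
    # from an HRMS staging table into the CRM employee table.
    import psycopg2

    UPSERT_CHANGED_ROWS = """
    INSERT INTO crm_employee (employee_id, full_name, department, updated_at)
    SELECT s.employee_id, s.full_name, s.department, s.updated_at
    FROM hrms_employee_staging s
    WHERE s.updated_at > %(watermark)s  -- delta filter: skip unchanged rows
    ON CONFLICT (employee_id) DO UPDATE
        SET full_name  = EXCLUDED.full_name,
            department = EXCLUDED.department,
            updated_at = EXCLUDED.updated_at;
    """

    def run_delta_sync(conn):
        with conn, conn.cursor() as cur:
            # Read the high-water mark recorded by the previous run.
            cur.execute("SELECT last_synced_at FROM sync_watermark WHERE source = 'hrms'")
            (watermark,) = cur.fetchone()
            # Upsert only the delta, avoiding redundant data movement.
            cur.execute(UPSERT_CHANGED_ROWS, {"watermark": watermark})
            # Advance the watermark for the next incremental run.
            cur.execute("UPDATE sync_watermark SET last_synced_at = now() WHERE source = 'hrms'")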


Required Qualifications And Experience

  •  Qualifications: Bachelor’s degree in Computer Science, Information Technology, or a related field.
  •  Work Experience: 3+ years of hands-on experience as a Database Developer or Data Engineer.
  •  Skills Keywords: detailed in the sections below.


Work Experience Requirements

  •  3+ years of hands-on experience as a Database Developer or Data Engineer.
  •  Practical experience with PostgreSQL in production environments.
  •  Strong exposure to Azure Data Factory and Databricks.
  •  Experience integrating with HRMS, employee master systems, or similar enterprise data sources (preferred).
  •  Experience working in Agile/Scrum setups and CI/CD-based deployments.
  •  Knowledge of internal platforms, CRMs, or enterprise workflow tools (an added advantage).


Core Database Skills

  •  Strong hands-on experience with PostgreSQL (required).
  •  Expertise in:

o Advanced SQL optimization
o Window functions and CTEs (see the sketch below)
o Triggers, functions, and stored procedures
o Indexing strategies and vacuum/analyze tuning

  •  Knowledge of partitioning, performance troubleshooting, and DB health monitoring.
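
A brief illustration of the window-function and CTE work listed above, again in Python with psycopg2; the employee_history table and its columns are hypothetical examples, not real schema objects.

    # Hypothetical query: a CTE plus ROW_NUMBER() keeps only the most
    # recent record per employee.
    import psycopg2

    LATEST_PER_EMPLOYEE = """
    WITH ranked AS (
        SELECT employee_id,
               department,
               updated_at,
               ROW_NUMBER() OVER (PARTITION BY employee_id
                                  ORDER BY updated_at DESC) AS rn
        FROM employee_history
    )
    SELECT employee_id, department, updated_at
    FROM ranked
    WHERE rn = 1;
    """

    def fetch_latest(conn):
        with conn.cursor() as cur:
            cur.execute(LATEST_PER_EMPLOYEE)
            return cur.fetchall()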


Azure Data Engineering Skills

  •  Hands-on experience with Azure Data Factory (pipelines, datasets, linked services).
  •  Solid experience with Databricks (PySpark or Spark SQL); see the sketch after this list.
  •  Experience with ingestion from REST APIs, file systems, blob storage, and databases.
  •  Familiarity with Azure App Service, Key Vault, and Blob Storage as used on the platform.
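
To make the expectation concrete, a hedged sketch of a typical Databricks (PySpark) ingestion step: reading JSON files landed in blob storage and writing a cleaned staging table. The storage path, account name, and table/column names are placeholders, not the platform's actual configuration.

    # Hypothetical PySpark ingestion: JSON from blob storage into a staging table.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("crm-ingest-sketch").getOrCreate()

    # Placeholder ABFS path; <storage-account> stands in for a real account name.
    raw = spark.read.json("abfss://landing@<storage-account>.dfs.core.windows.net/hrms/")

    cleaned = (
        raw.dropDuplicates(["employee_id"])                  # basic data-quality step
           .withColumn("ingested_at", F.current_timestamp()) # audit/lineage column
    )

    cleaned.write.mode("append").saveAsTable("crm.employee_staging")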


Integration & ETL/ELT Understanding

  •  Data mapping, transformations, cleansing, and enrichment logic.
  •  Experience in enterprise master-data sync workflows (Employee, HRMS, Org Structure).
  •  Familiarity with JSON/XML, flat files, delta loads, and incremental pipeline design.


General Technical Skills

  •  Understanding of backend application behavior (preferably .NET Core APIs).
  •  Ability to debug issues across layers (API → ETL → DB → HRMS integration).
  •  Version control (Git), CI/CD exposure, and SQL profiling tools.


Job Details

Role Level: Associate
Work Type: Full-Time
Country: India
City: Bengaluru, Karnataka
Company Website: http://www.bajajfinserv.in
Job Function: Information Technology (IT)
Company Industry/Sector: Financial Services


About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal brings the requisites together under one roof to help professionals with each of these. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


Disclaimer: talentmate.com is only a platform that brings jobseekers and employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

