Job Description

Job Purpose

As a member of the Data and Technology practice, you will work on advanced AI & ML engagements tailored for the investment banking sector. This includes developing and maintaining data pipelines, ensuring data quality, and enabling data-driven insights. Your core responsibility will be to build and manage scalable data infrastructure that supports our proof-of-concept (POC) initiatives and full-scale solutions for our clients. You will work closely with data scientists, DevOps engineers, and clients to understand their data requirements, translate them into technical tasks, and develop robust data solutions.

Desired Skills And Experience

5+ years of relevant experience, including:

  • Experience in a data engineering role, preferably within the financial services industry.
  • Experience building data pipelines and ETL frameworks, with programming proficiency in Python, SQL, and Spark.
  • Proficiency in cloud platforms, particularly Azure and Databricks.
  • Experience with data integration from various sources, including APIs, SFTP servers, and databases.
  • Strong understanding of data warehousing concepts and practices.
  • Experience building CI/CD pipelines and automating unit and integration testing.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills, both written and oral, with a business and technical aptitude.

Additional desired skills:

  • Familiarity with big data technologies and frameworks.
  • Experience with financial datasets and understanding of investment banking metrics.
  • Knowledge of visualization tools (e.g., Power BI).

Key Responsibilities

  • Develop, optimize, and maintain scalable and reliable data pipelines using tools such as Python, SQL, and Spark.
  • Integrate data from various sources including APIs, databases, and cloud storage solutions such as Azure and Databricks.
  • Implement data quality checks and ensure the accuracy and consistency of data.
  • Manage and optimize data storage solutions, ensuring high performance and availability.
  • Work closely with data scientists and DevOps engineers to ensure seamless integration of data pipelines and support machine learning model deployment.
  • Monitor and optimize the performance of data workflows to handle large volumes of data efficiently.
  • Create detailed documentation of data processes.
  • Implement security best practices and ensure compliance with industry standards.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Bengaluru, Karnataka
Company Website: https://www.acuitykp.com
Job Function: Manufacturing & Production
Company Industry/Sector: Financial Services

About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal aims to help professionals with each of these by bringing the requisites together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


Disclaimer: talentmate.com is only a platform to bring jobseekers & employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

