Job Description

  • Analysis, design, development, support and maintenance of data warehouse (DW), Data Lake and Lakehouse infrastructures
  • Technical expertise in data pipelines and ETL processes
  • Solid understanding of SQL and data modelling
  • Hands-on experience with Databricks on AWS cloud solutions
  • In-depth knowledge of Databricks ETL and Orchestration, notebook development and job scheduling
  • Collaborate with cross-functional teams of business analysts, business users and IT stakeholders to gather, document, analyse and implement requirements
  • Collaborate with multiple teams to develop and maintain data pipelines, harvesting data from systems of record to create data products
  • Create and optimize relational database schemas, tables, indexes, views and stored procedures

Responsibilities

  • Perform data migration, transformation, and integration tasks between different database systems
  • Develop and maintain ETL processes to load and transform data from various sources
  • Work with other development groups to advise, guide and assist in their integrations
  • Use version control systems (e.g. Git) and Agile development methodologies
  • Interact with vendors for support and consultancy
  • Work directly with the reporting manager to ensure clear and accurate communication of current status, dependencies and estimated delivery timelines
  • Clearly communicate issues, risks and proposed solutions to relevant stakeholders
  • Coordinate the release process with business groups and operations in compliance with the ITIL Change Management discipline
  • Communicate detailed descriptions of functionality changes and provide follow-up support as required
  • Support QA testing
  • Provide direction & support to the Business Divisions for User Acceptance Testing
  • Utilize reporting tools to rapidly create, modify, refresh and update dashboards and reports

Qualifications

  • Experience with business intelligence, data analysis, data modelling and visualization solutions in cloud, on-premises and hybrid environments
  • Experience with the Databricks platform on the Amazon Web Services (AWS) cloud
  • Experience with Python, SQL, relational database objects (Oracle and/or SQL Server) and procedural languages (PL/SQL and/or T-SQL)
  • Experience with version control systems
  • Web services integration using REST/SOAP/JSON


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Hyderabad, Telangana
Company Website: http://www.citco.com
Job Function: Information Technology (IT)
Company Industry/Sector: Financial Services

About the Company

Searching, interviewing and hiring are all part of professional life. The idea behind the TALENTMATE portal is to help professionals with each of these by bringing the requisites together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.



