Job Description

Roles & Responsibilities

Key Responsibilities

  • Lead design and execution of the Dataproc to Databricks PySpark migration roadmap.
  • Define modernization strategy, including data ingestion, transformation, orchestration, and governance.
  • Architect scalable Delta Lake and Unity Catalog–based solutions.
  • Manage and guide teams on code conversion, dependency mapping, and data validation.
  • Collaborate with platform, infra, and DevOps teams to optimize compute costs and performance.
  • Own the automation & GenAI acceleration layer, integrating code parsers, lineage tools, and validation utilities.
  • Conduct performance benchmarking, cost optimization, and platform tuning (Photon, Auto-scaling, Delta Caching).
  • Mentor senior and mid-level developers, ensuring quality standards, documentation, and delivery timelines.

Technical Skills

  • Languages: Python, PySpark, SQL
  • Platforms: Databricks (Jobs, Workflows, Delta Live Tables, Unity Catalog), GCP Dataproc
  • Data Tools: Hadoop, Hive, Pig, Spark (RDD & DataFrame APIs), Delta Lake
  • Cloud & Integration: GCS, BigQuery, Pub/Sub, Cloud Composer, Airflow
  • Automation: GenAI-powered migration tools, custom Python utilities for code conversion
  • Version Control & DevOps: Git, Terraform, Jenkins, CI/CD pipelines
  • Other: Performance tuning, cost optimization, and lineage tracking with Unity Catalog

Preferred Experience

  • 10–14 years of data engineering experience with at least 3 years leading Databricks or Spark modernization programs.
  • Proven success in migration or replatforming projects from Hadoop or Dataproc to Databricks.
  • Exposure to AI/GenAI in code transformation or data engineering automation.
  • Strong stakeholder management and technical leadership skills.

Experience

  • 11–12 years

Skills

  • Primary Skill: Data Engineering
  • Sub Skill(s): Data Engineering
  • Additional Skill(s): Python, Apache Hadoop, Apache Hive, Apache Airflow, Synapse, Databricks, SQL, Apache Spark, Azure Data Factory, PySpark, GenAI Fundamentals, Cloud Pub/Sub, BigQuery

About The Company

Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP).

Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.


Job Details

  • Role Level: Not Applicable
  • Work Type: Full-Time
  • Country: India
  • City: Gurugram, Haryana
  • Company Website: http://www.infogain.com
  • Job Function: Information Technology (IT)
  • Company Industry/Sector: IT Services and IT Consulting, Insurance and Retail



