Job Description

This role is for one of Weekday's clients.

Min Experience: 3 years

Location: Bengaluru, Chennai, Hyderabad, Pune

Job Type: Full-time

This role goes beyond simply maintaining pipelines. You will be responsible for designing and managing the foundational infrastructure upon which everything else is built.

Requirements

What You'll Do

  • Take full ownership of the data lakehouse, including its architecture, ingestion from CDC sources (Postgres, DynamoDB), scalability, and reliability
  • Develop and manage real-time stream processing frameworks for applications such as anomaly detection, customer 360 views, and live supply chain signals—ensuring high throughput and low latency
  • Design and scale OLAP stores to support both real-time and batch processing for internal analytics and AI/ML pipelines
  • Create self-service ETL and query frameworks that enable data consumers to operate quickly without creating bottlenecks for the platform team
  • Implement cost observability measures that provide detailed insights into compute, storage, and query expenses by job, user, and source—and then take action to reduce these costs
  • Build data movement APIs and reverse-ETL pipelines to efficiently deliver data to downstream consumers at scale
  • Establish robust job orchestration layers that remain stable under scale (experience with YARN, Airflow, EMR is a plus)

Who You Are

  • Have 3-12 years of experience in data engineering, with at least 1-7 years focused on building or managing a data platform (beyond just pipelines)
  • Possess deep hands-on expertise with tools like Spark, Hudi/Delta Lake, Kafka, Airflow, Debezium, Presto/Trino, DBT, Airbyte
  • Are comfortable working with the AWS data ecosystem, including EMR, S3, Athena, Glue, and CloudWatch
  • Have managed daily processing of terabytes and billions of events—scale is part of your daily experience
  • Have demonstrably reduced infrastructure costs and can provide metrics showing your impact
  • Are proficient in Java, Python, or Scala—ideally experienced in all three
  • Preferably have experience as a pod lead or tech lead; you're the person others rely on when things break at 2 a.m.

Bonus

  • Experience with OLAP engines such as Pinot, Druid, or ClickHouse
  • Have built or contributed to data movement or reverse-ETL APIs
  • Familiarity with feature stores (Feast, Feathr) or data catalog tools like Datahub

What Makes This Different

Our data platform powers AI that drives supply chain decisions for Fortune 500 companies. You'll directly witness the real business impact of your work—not just through dashboards. Join a small team with high ownership and the challenge of working at true scale.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Hyderabad, Telangana
Company Website: http://jobs.weekday.works
Job Function: Others
Company Industry/Sector: IT Services and IT Consulting

About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal brings these requisites together under one roof to help professionals with each of them. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


Disclaimer: talentmate.com is only a platform to bring jobseekers & employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

