Lead Applications Developer GCP BigQuery Pub Sub Kafka GKE Java Python C
Talentmate
India
20th February 2026
2602-3589-1201
Job Description
Discover your next opportunity within an organization that ranks among the world's 500 largest companies. Explore innovative opportunities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, both today and tomorrow.
Job Summary
We are seeking a GCP-focused Data Engineer to build scalable, high‑quality data pipelines supporting our Data Maturity initiative for Logistics/Parcel Services. The ideal candidate has strong experience in GCP data services, data modeling, data quality frameworks, and understands logistics domain data such as shipment tracking, routing, and warehouse operations.
Key Responsibilities
Core Engineering (All Levels)
Pipeline Development: Design and develop scalable ETL/ELT pipelines using BigQuery, Pub/Sub, and Dataflow/Dataproc.
Microservices: Build and deploy APIs using Python/Java/C# to integrate enterprise and external logistics systems.
Orchestration: Orchestrate workloads via Composer (Airflow) or GKE using Docker and Kubernetes.
Data Quality: Implement validation checks, lineage tracking, and monitoring for pipeline SLAs (freshness, latency).
Modeling: Model logistics and supply chain data in BigQuery for analytics and operational insights.
DataOps: Apply CI/CD, automated testing, and versioning best practices.
Intermediate / Senior additions
System Design: Take ownership of end-to-end technical design for complex data modules.
Mentorship: Actively mentor junior engineers and conduct rigorous code reviews to ensure high engineering standards.
Best Practices: Establish and document DataOps standards and reusable patterns for the team.
Lead additions
POD Leadership: Act as the technical head of the data pod, ensuring sprint goals are met and unblocking the team.
Architecture: Define the high-level architecture and long-term technical roadmap for the logistics data platform.
Stakeholder Management: Partner with business leaders to translate complex logistics requirements into technical specifications.
Negotiation: Manage requirements scoping and prioritize backlogs by balancing technical debt with business value.
Coaching: Drive the professional growth of the entire engineering team through structured coaching and performance feedback.
Required Skills
Relevant experience: Lead – minimum 7+ years of relevant experience.
Strong hands‑on experience with GCP BigQuery, Pub/Sub, GCS, Dataflow/Dataproc.
Proficiency in Python/Java/C#, RESTful APIs, and microservice development.
Experience with Kafka for event-driven ingestion.
Strong SQL skills and experience with data modeling.
Expertise in Docker/Kubernetes (GKE) and CI/CD tools (Cloud Build, GitHub Actions, or ADO).
Experience implementing Data Quality, Metadata management, and Data Governance frameworks.
Preferred Qualifications
Experience with Terraform, Cloud Composer (Airflow)
Experience in Azure Databricks, Delta Lake, ADLS, and Azure Data Factory
Experience in Knowledge Graph Engineering using Neo4j and/or Stardog
Familiarity with Data Governance tools or Cataloging systems (AXON Informatica)
Logistics domain experience
Education
Bachelor’s degree in Computer Science, Information Systems, Engineering, or related field
Contract Type
Permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal aims to help professionals with each of these steps by bringing the essentials together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend a helping hand.
Disclaimer: talentmate.com is only a platform to bring jobseekers and employers together. Applicants are advised to independently research the bona fides of the prospective employer. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.