Job Description

Since launching in Kuwait in 2004, talabat, the leading on-demand food and Q-commerce app for everyday deliveries, has been offering convenience and reliability to its customers. talabat’s local roots run deep, offering a real understanding of the needs of the communities we serve in eight countries across the region.

We harness innovative technology and knowledge to simplify everyday life for our customers, optimize operations for our restaurants and local shops, and provide our riders with reliable earning opportunities daily.

Here at talabat, we are building a high-performance culture through an engaged workforce and growing talent density. We're all about keeping it real and making a difference. Our 6,000+ strong talabaty are on an awesome mission to spread positive vibes. We are proud to be a multiple Great Place to Work award winner.


About the Role

We’re looking for a Data Engineer who’s passionate about building reliable, scalable, and cost-efficient data systems. You’ll work with a modern stack, including Kafka, Google Cloud Platform (GCP), and AWS, to design and maintain the pipelines that power analytics, machine learning, and product insights.

This is an ideal role for someone with solid foundational skills in data engineering who’s ready to deepen their expertise, take ownership of workflows, and collaborate across teams.

If you don’t know every tool in our stack yet, that’s okay. We value curiosity, problem-solving, and a willingness to learn just as much as existing technical skills.

What’s On Your Plate?

  • Design, build, and maintain data pipelines and workflows for batch and streaming use cases.
  • Work with Kafka to manage real-time data ingestion and event-driven architectures.
  • Leverage GCP and AWS services for storage, processing, and orchestration (e.g., BigQuery, Dataflow, S3, Lambda).
  • Orchestrate workflows using tools like Airflow or similar schedulers.
  • Ensure data quality and reliability through monitoring, alerting, and automated validation.
  • Collaborate with analysts, data scientists, and product teams to understand requirements and deliver data solutions that drive business impact.
  • Optimize for cost and performance across cloud environments.
  • Participate in code reviews, documentation, and knowledge sharing to raise the bar for the team.

Our Tech Stack

  • Data Ingestion & Streaming: Apache Kafka, Kafka Connect
  • Cloud Platforms: Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage), AWS (S3, Lambda, Glue)
  • Workflow Orchestration: Apache Airflow
  • Programming Languages: Python, SQL (bonus: Java/Scala)
  • Infrastructure & DevOps: Terraform, CI/CD pipelines, Docker
  • Monitoring & Observability: Grafana, Prometheus, Cloud-native tools

Qualifications

What Did We Order?

  • Experience (1-3 years) in data engineering, software engineering, or a related field.
  • Proficiency in SQL and at least one programming language (Python preferred).
  • Understanding of data modeling, ETL/ELT concepts, and cloud-based data warehouses.
  • Familiarity with streaming platforms (Kafka, Kinesis, or similar).
  • Comfort working in cloud environments (GCP, AWS, or Azure).
  • Strong communication skills, able to explain technical concepts to non-technical audiences.
  • Growth mindset, eager to learn, adapt, and take on new challenges.

Nice-to-Have (Not Required, If You’re Willing to Learn)

  • Experience with infrastructure-as-code (Terraform, CloudFormation).
  • Exposure to containerization (Docker, Kubernetes).
  • Knowledge of data governance, security, and compliance best practices.

Additional Information

Why You’ll Love Working Here

  • Impact: Your work will directly influence how data powers decisions across the company.
  • Learning culture: We invest in your growth — from mentorship to training budgets.
  • Modern stack: Work with cutting-edge tools and cloud platforms.
  • Collaboration: Partner with talented engineers, analysts, and product managers.
  • Flexibility: We care about outcomes, not where you work from.

Our Hiring Philosophy

We know that a great data engineer isn’t defined by checking every box. If you’re excited about data engineering, have a solid foundation, and are eager to grow, we want to hear from you.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: United Arab Emirates
City: Dubai
Company Website: http://www.talabat.com
Job Function: Information Technology (IT)
Company Industry/Sector: IT Services and IT Consulting and Software Development

About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE Portal aims to help professionals with each of these by bringing the requisites together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


Disclaimer: talentmate.com is only a platform to bring jobseekers & employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

