Job Description

Job Title: Lead Data Engineer

Job Location: Dubai

Job Description:

Key Responsibilities:

Data Acquisition

  • Manage the existing data pipelines built for data ingestion.
  • Create and manage new data pipelines for data ingestion, following best practices.
  • Continuously monitor data ingestion through Change Data Capture (CDC) for incremental loads.
  • Analyse and fix any failed scheduled batch jobs so that the data is captured.
  • Maintain and continuously update the technical documentation of the ingested data and the centralized data dictionary, with the necessary data classifications.

Data Integration, Aggregation and Representation

  • Expose data views or data models to reporting and source systems using Hive, Impala, or similar tools provided by DM.
  • Expose cleansed data to the Artificial Intelligence team for building data science models.

Change Data Capture and incremental data load

  • Design and implement data ingestion and processing using Big Data technologies (Hadoop, Spark, Hive, Kafka, and NiFi).
  • Implement data transformation and modeling techniques to optimize data for analysis and reporting.
  • Identify and resolve performance bottlenecks in data processing and storage systems to ensure optimal performance and efficiency.
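
A minimal illustrative sketch of the kind of CDC-style incremental load described above, written in PySpark. The source system, file paths, and the "last_updated" watermark column are assumptions for illustration only, not details from this posting.

      # Sketch of an incremental (CDC-style) load in PySpark.
      # Table names, paths, and the "last_updated" watermark column are
      # illustrative assumptions, not requirements from this posting.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("cdc-incremental-load").getOrCreate()

      # Find the high-water mark from the previously ingested target, if any.
      try:
          target = spark.read.parquet("/data/curated/orders")
          watermark = target.agg(F.max("last_updated")).collect()[0][0]
      except Exception:
          watermark = None  # first run: take a full load

      # Read the source table (assumed JDBC source) and keep only changed rows.
      source = (spark.read.format("jdbc")
                .option("url", "jdbc:postgresql://source-db:5432/sales")
                .option("dbtable", "public.orders")
                .option("user", "etl_user")
                .option("password", "***")
                .load())
      delta = source if watermark is None else source.filter(F.col("last_updated") > F.lit(watermark))

      # Append the delta; deduplication/merge logic would follow in a real pipeline.
      delta.write.mode("append").parquet("/data/curated/orders")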

Deliverables/Outcomes:

  • Implementation of data pipelines (structured and semi-structured)

Skills:

  • Certified Big Data Engineer from Cloudera/AWS/Azure
  • Expertise with Big Data products (Cloudera stack).
  • Expertise in Big Data querying tools such as Hive, HBase, and Impala.
  • Expertise in SQL, including writing complex queries/views, partitioning, and bucketing.
  • Strong experience in Spark using Python/Scala.
  • Expertise in messaging systems, such as Kafka or RabbitMQ
  • Hands-on experience managing a Hadoop cluster with all included services.
  • Implementing ETL processes using Sqoop/Spark, including loading from disparate data sets and pre-processing using Hive (see the sketch after this list).
  • Ability to design solutions independently based on high-level architecture and collaborate with other development teams.
  • Expertise in building stream-processing systems using solutions such as Spark Streaming, Apache NiFi, and Kafka.
  • Expertise with NoSQL databases such as HBase.
  • Experience with Informatica Enterprise Data Catalog (EDC) implementation and administration.
  • Strong knowledge of data management, data governance, and metadata management concepts.
  • Proficiency in SQL and experience with various databases (e.g., Oracle, SQL Server, PostgreSQL) and data formats (e.g., XML, JSON, CSV).
  • Experience with data integration, ETL/ELT processes, and Informatica Data Integration.
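
As referenced in the ETL skill above, a minimal sketch of loading disparate data sets with Spark and pre-processing them via Hive SQL. The database, table, and path names here are illustrative assumptions, not details from this posting.

      # Sketch: load disparate raw inputs with Spark, pre-process with Hive SQL,
      # and persist the result as a Hive table. All names are assumptions only.
      from pyspark.sql import SparkSession

      spark = (SparkSession.builder
               .appName("etl-preprocess")
               .enableHiveSupport()
               .getOrCreate())

      # Stage two disparate sources (CSV and JSON) as temporary views.
      spark.read.option("header", True).csv("/landing/crm/customers.csv") \
           .createOrReplaceTempView("stg_customers")
      spark.read.json("/landing/web/events.json") \
           .createOrReplaceTempView("stg_events")

      # Conform and cleanse with Hive-compatible SQL.
      curated = spark.sql("""
          SELECT c.customer_id,
                 lower(trim(c.email))     AS email,
                 e.event_type,
                 to_timestamp(e.event_ts) AS event_ts
          FROM stg_customers c
          JOIN stg_events e ON e.customer_id = c.customer_id
      """)

      # Write to a managed Hive table for downstream reporting.
      curated.write.mode("overwrite").saveAsTable("curated.customer_events")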


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: United Arab Emirates
City: Dubai
Company Website: https://paraminfo.com
Job Function: Information Technology (IT)
Company Industry/Sector: IT Services and IT Consulting

About the Company

Searching, interviewing, and hiring are all part of professional life. The idea behind the TALENTMATE portal is to help professionals with each of these by bringing the requisites together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


Disclaimer: talentmate.com is only a platform to bring jobseekers & employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

