Job Description

Job Summary

We are seeking an AWS Databricks Developer with 6 to 10 years of experience to join our team. The ideal candidate will have expertise in Spark in Scala, Delta Sharing, and Databricks Unity Catalog. This role involves working with cutting-edge technologies such as the Databricks CLI, Delta Live Pipelines, and Structured Streaming. The candidate will play a crucial role in managing risk and ensuring data integrity using tools such as Apache Airflow, Amazon S3, and Python. The position is hybrid with no travel required.

Responsibilities

  • Develop and maintain scalable data pipelines using Spark in Scala to ensure efficient data processing and analysis.
  • Implement Delta Sharing and Databricks Unity Catalog to manage and secure data access across the organization.
  • Utilize Databricks CLI and Delta Live Pipelines to automate data workflows and improve operational efficiency.
  • Design and execute Structured Streaming processes to handle real-time data ingestion and processing.
  • Apply risk management strategies to identify and mitigate potential data-related risks.
  • Integrate Apache Airflow for orchestrating complex data workflows and ensuring seamless data operations.
  • Leverage Amazon S3 for data storage solutions, ensuring high availability and durability of data assets.
  • Utilize Python for scripting and automation tasks to enhance productivity and streamline processes.
  • Develop and optimize Databricks SQL queries to extract meaningful insights from large datasets.
  • Implement Databricks Delta Lake to ensure data reliability and consistency across various data sources.
  • Manage Databricks Workflows to automate and schedule data tasks, improving overall data management efficiency.
  • Collaborate with cross-functional teams to ensure alignment on data strategies and objectives.
  • Contribute to the continuous improvement of data practices and methodologies to support the company's mission.

Qualifications

  • Possess strong expertise in Spark in Scala and Databricks technologies.
  • Demonstrate proficiency in Delta Sharing and Databricks Unity Catalog.
  • Have experience with Databricks CLI and Delta Live Pipelines.
  • Show capability in Structured Streaming and risk management.
  • Be skilled in Apache Airflow and Amazon S3.
  • Have a strong command of Python for data-related tasks.
  • Be familiar with Databricks SQL and Delta Lake.
  • Understand Databricks Workflows and their applications.
  • Exhibit problem-solving skills and attention to detail.
  • Be able to work in a hybrid model with a focus on day shifts.
  • Have excellent communication and collaboration skills.
  • Be committed to continuous learning and professional development.
  • Be adaptable to changing technologies and business needs.

Certifications Required

Databricks Certified Associate Developer for Apache Spark


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Bangalore Urban, Karnataka
Company Website: https://www.cognizant.com
Job Function: Others
Company Industry/Sector: IT Services and IT Consulting and Business Consulting and Services

About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal aims to help professionals with each of these by bringing the requisites together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.

Disclaimer: talentmate.com is only a platform to bring jobseekers and employers together. Applicants are advised to independently research the bona fides of the prospective employer. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit the Security Advice page for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

