Design, develop, and manage data solutions using ETL technologies such as Azure Databricks (ADB) and Azure Data Factory (ADF); work with NoSQL databases such as Cosmos DB; and apply object-oriented programming (OOP) principles in C#/.NET to build scalable data-integration and backend services.
Duties and Responsibilities
Databricks Development: Build and manage Databricks notebooks using SQL and PySpark within a reusable framework.
ETL Development: Design and maintain data integration pipelines in Azure Data Factory.
Database Proficiency: Strong knowledge of SQL and experience with relational databases such as SQL Server and MySQL.
API/Data Exchange Layer: Develop and manage backend services in C#/.NET, hosted on Azure App Service.
NoSQL Expertise: Work with Cosmos DB or MongoDB (documents & collections), including concepts like indexing, partitioning, change feed, throttling, etc.
CI/CD & Release Management: Utilize DevOps pipelines for code versioning, automated testing, and release.
Time-Series Data Handling: Leverage Azure Data Explorer (ADX) for time-series data, with knowledge of materialized views, sharding, and caching policies.
Real-Time Streaming (Nice to Have): Basic understanding of Azure Event Hub, including batching, offsets/checkpoints, payload handling, and throttling.
Cloud Platform Familiarity (Preferred): Exposure to Azure cloud services and architecture best practices.
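The ETL duties above can be sketched as a minimal, reusable transform step. This pure-Python illustration (the column names and cleaning rules are hypothetical assumptions, not the team's framework) shows the extract–transform–load pattern that a Databricks notebook would typically express with PySpark DataFrames:

```python
# Minimal ETL sketch of the kind a Databricks notebook would implement with
# PySpark DataFrames. Pure Python here; column names and rules are
# illustrative assumptions only.

def extract(rows):
    """Extract: keep only records that have the required keys populated."""
    return [r for r in rows if r.get("id") is not None and r.get("amount") is not None]

def transform(rows):
    """Transform: normalize types and add a derived column."""
    out = []
    for r in rows:
        amount = float(r["amount"])
        out.append({"id": int(r["id"]), "amount": amount, "is_large": amount >= 100.0})
    return out

def load(rows, sink):
    """Load: append to the target store (a plain list standing in for a table)."""
    sink.extend(rows)
    return len(rows)

raw = [{"id": "1", "amount": "250.0"}, {"id": None, "amount": "10"}, {"id": "2", "amount": "40"}]
table = []
loaded = load(transform(extract(raw)), table)
print(loaded)  # 2 rows loaded; the row missing an id was filtered out
```

In a real Databricks pipeline each step would be a DataFrame operation (filter, withColumn, write), but the separation into small, reusable functions is what makes the framework testable.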
Key Responsibilities/Major Challenges
Translate business requirements into technical solutions in collaboration with the PMO team.
Own end-to-end delivery of data projects, ensuring on-time execution and adherence to quality standards.
Design technical architecture and guide development efforts for enhancements and new projects.
Develop and maintain robust ETL pipelines and data integration modules across systems.
Ensure high data quality, platform stability, and resolution of critical process issues.
Monitor and resolve performance bottlenecks in data workflows and programs.
Establish best practices, standard operating procedures, and drive their implementation across teams.
Act as a liaison with business users and product managers to support daily data needs and strategic initiatives.
Coordinate with internal and external development teams to troubleshoot and resolve issues efficiently.
Manage workload through effective planning, prioritization, and progress tracking.
Balancing development quality with on-time delivery in a dynamic and evolving technical environment.
Coordinating with multiple internal and external stakeholders to align priorities, resolve dependencies, and ensure smooth execution.
Ensuring code quality, data integrity, and performance while scaling data solutions across diverse systems and platforms.
Key Decisions / Dimensions
Making critical decisions during production issues, including root cause analysis, quick fixes, and long-term resolutions.
Prioritizing and escalating support tasks effectively to minimize downtime and business impact.
Driving decisions around technical design, architecture, and optimization to ensure performance, scalability, and maintainability of solutions.
Educational Qualifications
Required Qualifications and Experience
Graduate or Post-Graduate in Computer Science, Information Technology, or Data Science/Technologies.
Work Experience
0.5–1 year of hands-on data engineering experience.
Technical Expertise / Skills Keywords:
Azure Databricks – PySpark, SQL – Must Have
Azure Data Factory – For ETL & Data Integrations – Must Have
OOP Concept Implementation in C#/.NET
Cosmos DB for NoSQL
Event Hub & Kafka for Change Feed & Real-Time Streaming – Good to Have
Azure Data Explorer as a Time-Series Database with Kusto Query Language (KQL) – Good to Have
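As a conceptual sketch of the "Good to Have" streaming items, the following pure-Python example simulates batch consumption with offset checkpointing, the pattern Event Hub consumers use to resume after a restart. The in-memory checkpoint store and batch size are illustrative assumptions, not the Azure SDK:

```python
# Conceptual offset/checkpoint handling for a streaming consumer, in the
# spirit of Azure Event Hub: process events in batches and record the
# last-processed offset so a restarted consumer resumes where it left off.
# The dict-based checkpoint store and batch size are illustrative assumptions.

def consume(events, checkpoint_store, partition="0", batch_size=3):
    """Process events after the checkpointed offset; checkpoint once per batch."""
    start = checkpoint_store.get(partition, -1) + 1
    processed = []
    for i in range(start, len(events), batch_size):
        batch = events[i:i + batch_size]
        processed.extend(batch)                            # stand-in for payload handling
        checkpoint_store[partition] = i + len(batch) - 1   # last offset in this batch
    return processed

store = {}                                 # stand-in for durable checkpoint storage
events = [f"evt-{n}" for n in range(7)]

first_run = consume(events, store)         # processes all 7 events
resumed = consume(events, store)           # nothing new: checkpoint already at offset 6
print(len(first_run), store["0"], len(resumed))  # 7 6 0
```

In production the checkpoint would live in durable storage (e.g. a blob container), which is what lets a replacement consumer pick up a partition without reprocessing or losing events.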
Disclaimer: talentmate.com is only a platform to bring jobseekers and employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.