Job Description

Position Summary

The Staff Data Engineer serves as a primary development resource for designing, coding, testing, implementing, documenting, and maintaining NextGen solutions for GCP Cloud enterprise data initiatives. This role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team. It requires self-starters who are proficient problem solvers, capable of bringing clarity to complex situations. The culture of the organization places an emphasis on teamwork, so social and interpersonal skills are as important as technical capability. Because GCP technology and practice are emerging and fast-evolving, the position requires staying well-informed of technological advancements and putting new innovations into effective practice.

In addition, this candidate will have a history of increasing responsibility in a small, multi-role team. This position requires a candidate who can analyze business requirements, perform design tasks, and construct, test, and implement solutions with minimal supervision. This candidate will have a track record of participation in successful projects in a fast-paced, mixed team (consultant and employee) environment. The applicant must also be willing to mentor other developers to prepare them for assuming new responsibilities.

As a Staff Data Engineer, you will collaborate closely with all team members to create a modular, scalable solution that addresses current needs but also serves as a foundation for future success. The position is critical in building the team's engineering practices in test-driven development, continuous integration, and automated deployment, and is a hands-on team member who actively coaches the team to solve complex problems.

Major Responsibilities:

Core Competencies

The following are key competencies and core expectations for this role:

  • Communication and interpersonal skills
  • Problem-solving and critical thinking skills
  • Understanding of strategic imperatives
  • Technology and business knowledge

At HCA DT&I, your deliverables will influence patient care. Every process, technology, and decision matters. This role provides application development for specific business environments. The focus is on setting technical direction for groups of applications and similar technologies, as well as taking responsibility for technically robust solutions that satisfy all business, architecture, and technology constraints.

  • Work with data engineers, data architects, data scientists, and other internal stakeholders to understand product requirements, then design, build, and monitor data platforms and pipelines that meet today's requirements and can scale gracefully.
  • Implement automated workflows that lower manual/operational costs, define and uphold SLAs for timely delivery of data, and move the company closer to democratizing data.
  • Enable self-service data architecture supporting query exploration, dashboards, data catalog, and rich data discovery.
  • Promote a collaborative team environment that prioritizes effective communication, team member growth, and success of the team over success of the individual.
  • Design and create real-time data pipelines that accelerate the time from idea to insight.
  • Adhere to and support data engineering best practices, processes, and standards.
  • Produce high-quality, modular, reusable code that incorporates best practices and serves as an example for less experienced engineers.
  • Help promote and support data security best practices that align with industry standards and regulatory and legal requirements.
  • Help mentor team members on complex data projects and on following the Agile process.
  • Help lead data analysis efforts and solution proposals for data-related and data-architecture problems.
  • Help lead the implementation of unit and integration tests, and promote and conduct performance testing where appropriate.
  • Be a leader in the HCA data community. Evangelize data engineering best practices and standards, participate in or present at community events, and encourage the continual growth and development of others.
  • Be curious. Be growth minded. Encourage and enable this in others.
  • Demonstrate professional and personal maturity through self-leadership.
  • Build productive and healthy relationships within the department and other teams to foster growth of our culture, our people, and our platforms.
  • Practices and adheres to the “Code of Conduct” philosophy and “Mission and Value Statement.”
  • Perform other duties as assigned.
  • Responsible for building and supporting a GCP-based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data.
  • Work independently and complete tasks on schedule by exercising strong judgment and problem-solving skills.
  • Analyze requirements, design AI/ML-based solutions, and integrate those solutions into customer environments.
  • Effectively prioritize workload to meet deadlines and work objectives.
  • Work in an environment with rapidly changing business requirements and priorities.
  • Share knowledge and experience to contribute to the growth of overall team capabilities.
  • Actively participate in technical group discussions and adopt modern technologies to improve development and operations.

A Successful Candidate Will Have:

  • Strong understanding of best practices and standards for GCP data process design and implementation.
  • 3+ years' hands-on experience with the GCP platform, including many of the following components:
      • Cloud Run, GKE, Cloud Functions
      • Pub/Sub
      • Bigtable, Cloud SQL, Cloud Spanner
      • BigQuery, Dataflow, Data Fusion
      • Cloud Composer, Dataproc, CI/CD, Cloud Logging
      • Vertex AI, NLP, GitHub
  • 4+ years of hands-on experience with many of the following:
      • Spark Streaming, Kafka
      • SQL, JSON, Avro, Parquet
      • Java, Python, or Scala

Certifications (a Plus, But Not Required):

  • Google Cloud Professional Data Engineer

PHYSICAL DEMANDS/WORKING CONDITIONS:

  • Prolonged sitting or standing at computer workstations including use of mouse, keyboard, and monitor.
  • Requires the ability to provide after-hours support.

Education & Experience:

  • Bachelor's degree in computer science, a related technical field, or equivalent experience (Required)
  • Master's degree in computer science or a related field (Preferred)
  • 5+ years of experience in data engineering (Required)
  • 1+ year(s) of experience in healthcare (Preferred)
  • 7+ years of experience in Information Technology (Required)
  • Travel required

Knowledge, Skills, Abilities, Behaviors:

  • Demonstrates an empathetic and growth mindset with a willingness to learn new skills, technologies, and methodologies. (Required)
  • Strong knowledge of public cloud best practices and design patterns used in creating, automating, and supporting data pipelines. (Required)
  • Strong ability to assemble large, complex sets of data that meet functional and non-functional product requirements. (Required)
  • Strong ability to identify, design, and implement internal process improvements, including redesigning data platforms for greater scalability, optimizing data delivery, and automating manual processes. (Required)
  • Strong ability to create and use analytical tools to monitor data pipeline metrics and provide actionable intelligence that increases operational efficiency and improves data outcomes. (Required)
  • Strong ability to present and facilitate technical ideas. (Required)
  • Expert ability using source control management tools such as Git/GitHub. (Required)
  • Expert ability using CI/CD automation tools. (Required)
  • Strong understanding of SQL and analytical data warehouses. (Required)
  • Strong understanding of Agile methodologies and how to apply Agile within the team. (Required)
  • Helps coach and mentor junior team members and others external to the team. (Required)
  • Proven ability to complete work, make sound decisions, and plan and accomplish goals without explicit direction/guidance from leadership. (Required)
  • Builds and nurtures healthy relationships with all colleagues. (Required)
  • Stays abreast of public cloud technologies, capabilities, and industry use of public cloud to help guide HCA’s strategy and adoption. (Required)
  • Demonstrates productive and inclusive communication skills with all colleagues. (Required)
  • Excellent problem-solving and analytical skills. (Required)
  • Ability to work both independently and on a team. (Preferred)
  • Ability to sit for long periods of time.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Hyderabad, Telangana
Company Website: https://hcahealthcare.in/
Job Function: Information Technology (IT)
Company Industry/Sector: Professional Services

About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal brings the requisites for each of them together under one roof to help professionals. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


Disclaimer: talentmate.com is only a platform to bring jobseekers & employers together. Applicants are advised to research the bona fides of the prospective employer independently. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

