Job Purpose:
The Sr. Technology Engineer (STE) works to move the Data Platform toward full automation, security, and high availability. Works closely with other Data Leads, Delivery Leads, and the Chapter Head, and helps drive the Data Management and Analytics vision and strategy to deliver common, consistent data capabilities across the Group.
The primary task is to drive and transform the data capabilities by enabling fully automated deployments and containerization of the platform. Works with other architects and platform teams to ensure data is managed as an asset in a centralized, standardized, and consistent manner, maintaining consistency and quality through mature technologies and emerging data practices.
A highly skilled Senior Technology Engineer with extensive experience in DevOps, CI/CD, Kubernetes, OpenShift, Jenkins, and GitOps. The ideal candidate should also have a strong understanding of Big Data and distributed data architecture. This role involves designing, implementing, and maintaining scalable, efficient infrastructure and data solutions to support our business objectives.
Key Responsibilities:
- DevOps and CI/CD: Design, implement, and manage CI/CD pipelines using tools like Jenkins and GitOps to automate and streamline the software development lifecycle.
- Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes and OpenShift, ensuring high availability and scalability.
- Infrastructure Management: Develop and maintain infrastructure as code (IaC) using tools like Terraform or Ansible.
- Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop, Spark, and Kafka.
- Distributed Systems: Design and manage distributed data architectures to ensure efficient data processing and storage.
- Collaboration: Work closely with development, operations, and data teams to understand requirements and deliver robust solutions.
- Monitoring and Optimization: Implement monitoring solutions and optimize system performance, reliability, and scalability.
- Security and Compliance: Ensure infrastructure and data solutions adhere to security best practices and regulatory requirements.
Qualifications:
- Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Experience: Minimum of 5 years of experience in big data engineering or a related role.
- Technical Skills:
- Proficiency in CI/CD tools and practices such as Jenkins and GitOps.
- Strong experience with containerization and orchestration tools like Kubernetes and OpenShift.
- Knowledge of big data technologies such as Hadoop, Spark, and ETL frameworks.
- Proficiency in scripting languages such as Python, Bash, or Groovy.
- Familiarity with infrastructure as code (IaC) tools like Terraform or Ansible.
- Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work in a fast-paced, dynamic environment.
Preferred Qualifications:
- Certifications in DevOps, cloud platforms, or big data technologies.
- Experience with monitoring and logging tools such as Prometheus, Grafana, or ELK Stack.
- Knowledge of security best practices in DevOps and data engineering.
- Familiarity with agile methodologies and continuous integration/continuous deployment (CI/CD) practices.
Education (Essential)
- Master’s or Bachelor’s degree in Computer Science, Information Systems Management, or a related field.
Experience (Years & Type)
- 8+ years of experience in information technology, with 3+ years in data engineering, architecture, and technology solution definition and implementation.
- Extensive experience in the banking and financial services domain.
Knowledge & Skills (Technical and Managerial)
Up-to-date knowledge of CDP versions, including releases 7.1.7 through 7.1.9
In-depth knowledge of Cloudera Manager for deploying, configuring, and monitoring CDP services
Strong understanding of security mechanisms like Kerberos, LDAP/AD integration, and Transport Layer Security (TLS)
Ability to collaborate with supporting teams (database, network, security, and systems teams), conduct root-cause analysis of production issues, and provide corrective actions
Strong command over Linux CLI, as it’s the foundation for managing CDP environments
Skills in automating repetitive tasks using scripting languages like Bash or Python
Ability to set up and manage monitoring solutions for CDP clusters to increase observability
Expertise in Data Architecture, Data Strategy, and Roadmap definition for large, complex organizations and systems, with experience implementing large-scale end-to-end Data Management & Analytics solutions
Experience transforming traditional Data Warehousing approaches into Big Data-based approaches, with a proven track record of managing risk and data security
Expertise in DW dimensional modeling techniques, including Star and Snowflake schemas, slowly changing dimensions, role-playing dimensions, dimensional hierarchies, and data classification
Experience in cloud-native principles, designs, and deployments
Extensive experience working with and enhancing Continuous Integration (CI) and Continuous Delivery (CD) environments
Expertise in Data Quality, Data Profiling, Data Governance, Data Security, Metadata Management, and Data Archival
Define workload migration strategies using appropriate tools
Drive delivery in a matrixed environment working with various internal IT partners
Demonstrated ability to work in a fast-paced, changing environment with short deadlines, interruptions, and multiple simultaneous tasks/projects
Must be able to work independently, with skills in planning, strategy, estimation, and scheduling
Strong problem-solving, influencing, communication, and presentation skills; a self-starter
Experience with data processing frameworks and platforms (Informatica, Hadoop, Presto, Tez, Hive, Spark etc.)
Hands-on experience with related/complementary open-source software platforms and languages (e.g., Java, Linux, Python, Git, Jenkins)
Exposure to BI tools and reporting software (e.g., Microsoft Power BI and Tableau)
- Behavioral Competencies
- Business acumen
- Organization leadership skills
- Coaching and mentoring
- Excellent analytical skills
- Demonstrated critical and systems-thinking ability
- Ability to negotiate and influence
- Disciplined, organized
- Flexibility / Adaptability
- Visionary
- Performance driven
SYNECHRON’S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Candidate Application Notice