Job Description

Company: IT Services Organization

Key Skills: Amazon Web Services (AWS), Azure, Octopus, IAM, S3, EC2, VPC, CloudWatch, Nix, ANT, Maven, TortoiseSVN, GitHub, Chef, Puppet, Ansible, Terraform, Docker, WebLogic, Kubernetes, SonarQube, TFS, Git, Java, Agile, Apache HTTPD, Apache Tomcat, JBoss, Amazon Redshift, AWS Glue, EMR, Airflow

Roles and Responsibilities:

  • Design and implement ETL pipelines using AWS Glue, Python, and SQL for data ingestion, transformation, and loading.
  • Develop and maintain data lakes and data warehouses using Amazon S3 and Amazon Redshift.
  • Optimize data storage, query performance, and cost efficiency using Redshift Spectrum and tuning techniques.
  • Build reusable and scalable data ingestion and transformation frameworks following security and governance standards.
  • Troubleshoot performance bottlenecks and ensure high availability and reliability of data pipelines.
  • Follow AWS Well-Architected Framework and cloud security best practices.
  • Collaborate with cross-functional teams to define data models, governance rules, and solution architecture aligned with business objectives.
  • Implement data quality checks, monitoring, logging, and metadata management across ETL workflows.
  • Coordinate with SBU users, development teams, centers of excellence, and regional infrastructure teams during project execution.
  • Participate in requirements gathering, solution design, development, and end-to-end implementation within data analytics initiatives.
  • Prepare functional specifications, technical documentation, system configuration guides, and user manuals.
  • Support user training and knowledge transfer activities as required.
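The extract–transform–load responsibilities above can be sketched in miniature. The example below is a simplified, self-contained Python illustration of the ETL pattern with in-memory data and the standard library only; a production pipeline for this role would instead read from S3 and load into Redshift via AWS Glue. All record fields and function names here are hypothetical, chosen only for illustration:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV records (stand-in for reading objects from S3)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: enforce types and drop invalid rows (a basic data-quality check)."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip malformed records rather than fail the whole batch
        out.append({"order_id": row["order_id"].strip(),
                    "amount": round(amount, 2)})
    return out

def load(rows: list[dict], warehouse: list[dict]) -> int:
    """Load: append to the target store (stand-in for a Redshift COPY/INSERT)."""
    warehouse.extend(rows)
    return len(rows)

raw = "order_id,amount\nA-1,19.99\nA-2,not-a-number\nA-3,5\n"
warehouse: list[dict] = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 valid rows loaded; the malformed record is skipped
```

The same three-stage structure carries over to Glue jobs: the transform stage is where the data-quality checks, logging, and metadata capture mentioned above would live.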

Skills Required:

  • Strong hands-on experience with Amazon Web Services including AWS Glue, S3, EC2, VPC, IAM, and CloudWatch.
  • Experience working with Azure cloud services and hybrid cloud environments.
  • Proficiency in ETL development using AWS Glue, Python, and SQL.
  • Hands-on experience with data warehousing solutions such as Amazon Redshift and Redshift Spectrum.
  • Experience with big data and workflow orchestration tools such as EMR and Apache Airflow.
  • Strong knowledge of infrastructure-as-code and configuration management tools including Terraform, Chef, Puppet, and Ansible.
  • Experience with containerization and orchestration tools such as Docker and Kubernetes.
  • Proficiency with CI/CD tools and build systems including Octopus, ANT, Maven, Git, GitHub, TFS, and Tortoise SVN.
  • Experience with application servers and web technologies such as WebLogic, Apache HTTPD, Apache Tomcat, and JBoss.
  • Working knowledge of Java and Agile development methodologies.
  • Familiarity with monitoring, code quality, and governance tools such as SonarQube.

Education: A degree in Computer Science, Information Technology, or a related discipline is preferred.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Bengaluru, Karnataka
Company Website: https://mycareernet.in/
Job Function: Engineering
Company Industry/Sector: Information Services

About the Company

Searching, interviewing, and hiring are all part of professional life. The TALENTMATE portal brings the requisites for each of them together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend a helping hand.

Disclaimer: talentmate.com is only a platform to bring jobseekers and employers together. Applicants are advised to independently research the bona fides of the prospective employer. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.
