Job Description

Job Title: Senior Data Engineer with Python and Snowflake - Pune


About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients in the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.


WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key industry players: projects that will transform the financial services industry.


MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.


#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.


CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.


DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.


Role Description:

Key Skills: Data Engineering, Python, Snowflake, AWS, Git/Bitbucket

Experience: 9+ years

Location: Hinjewadi, Pune

Shift timings: 12:30 PM to 9:30 PM

3 days work from office (Tue, Wed, Thu)

Technical Requirements

Job Description: Python & Snowflake Engineer with AI/Cortex Development


  1. 7+ years of experience developing data engineering and data science projects on the Snowflake AI Data Cloud platform on AWS; Snowpark experience preferred. Experience with different data modeling techniques is required.
  2. 7+ years of Python development experience, using tools such as VS Code or Anaconda, version control with Git or Bitbucket, and Python unit-testing frameworks.
  3. 1+ years of experience building Snowflake applications on the Snowflake AI/Cortex platform (specifically Cortex Agents, Cortex Search, and Cortex LLM functions, with an understanding of context enrichment using prompts or Retrieval-Augmented Generation methods).
  4. Deep understanding of object-oriented programming in Python and of data structures such as Pandas DataFrames, with the ability to write clean, maintainable engineering code.
  5. Understanding of multi-threading concepts and concurrency implementation in server-side Python custom modules.
  6. Experience implementing object-relational mapping (ORM) in Python using frameworks such as SQLAlchemy or equivalent.
  7. Proficient in developing and deploying Python applications, such as Lambda functions, on the AWS Cloud platform.
  8. Proficient in deploying web applications on AWS using Docker containers or Kubernetes, with experience using CI/CD pipelines.
  9. Proficient in developing applications with Snowpipe and Snowpark, moving data from cloud sources such as AWS S3, and handling unstructured data from data lakes.
  10. Strong knowledge of Snowflake account hierarchy models and account-role-permission strategy.
  11. Experienced in data sharing, preferably using the internal Data Marketplace and Data Exchanges for various listings.
  12. Strong grasp of data governance and security concepts within Snowflake, including row- and column-level dynamic data masking using Snowflake tags.
  13. Good understanding of input query enrichment using Snowflake YAML files and of integration with LLMs within Snowflake.
  14. Strong understanding of relevance search and of building custom interactive applications with LLMs.
  15. Nice to have: experience building Snowflake native applications using Streamlit and deploying them onto AWS Cloud instances (EC2 or Docker containers).
  16. Continuously improves functionality through experimentation, performance tuning, and customer feedback.
  17. Nice to have: application cache implementation experience within Python web applications; DuckDB with Apache Arrow experience.
  18. Nice to have: experience implementing CI/CD pipelines for Snowflake applications.
  19. Strong analytical and problem-solving skills, with the ability to communicate technical concepts clearly.
  20. Experience with Agile and Scrum methodologies, preferably with JIRA.


If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.


Job Details

Role Level: Mid-Level
Work Type: Full-Time
Country: India
City: Pune, Maharashtra
Company Website: http://www.capco.com
Job Function: Information Technology (IT)
Company Industry/Sector: Financial Services

About the Company

Searching, interviewing, and hiring are all part of professional life. The idea behind the TALENTMATE portal is to help professionals with each of these by bringing the requisites together under one roof. Whether you're hunting for your next job opportunity or looking for potential employers, we're here to lend you a helping hand.


Disclaimer: talentmate.com is only a platform that brings jobseekers and employers together. Applicants are advised to independently research the bona fides of any prospective employer. We do NOT endorse any requests for money and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@talentmate.com.

