With 75 years of experience, our focus is on helping the most vulnerable children overcome poverty and experience fullness of life. We help children of all backgrounds, even in the most dangerous places, inspired by our Christian faith.
Come join our 33,000+ staff working in nearly 100 countries and share the joy of transforming vulnerable children’s life stories!
Position Summary:
This position is mainly responsible for ensuring the integrity, timeliness, and optimal performance of the Finance Data Warehouse/Data Lake platform. The role involves monitoring data ingestion (ETL/ELT), troubleshooting production issues and applying code fixes/changes, and maintaining documentation to support business operations. Additionally, the position manages the environment to secure the data platform from unauthorized access and to maintain platform scalability within budget thresholds.
Work Set-up:
The role requires an on-site presence and involves regular collaboration with global teams across various time zones.
This position is eligible for Hybrid-Work based on GFS Manila Hybrid Work Position and Implementing Rules.
The position requires the ability and willingness to travel domestically and internationally as needed.
Key Responsibilities:
Pipeline Monitoring and Data Validation
This Includes, But Is Not Limited To The Following:
Monitor and maintain ETL/ELT pipelines to ensure data warehouse/lakehouse platforms are up to date and optimized for performance.
Manage and perform the loading of diverse data sources/source files into the data warehouse/data lake platform.
Monitor and automate data validation by comparing source-file data against the data loaded into the data warehouse/data lake platform (see the validation sketch after this list).
Assist in analyzing data quality issues and collaborate on resolution to be applied.
Maintain the production operations manual.
Communicate changes, corrections and updates to users and stakeholders.
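For illustration only, here is a minimal sketch of the kind of source-vs-target validation described above, comparing row counts and an order-insensitive content checksum. The file name, pyodbc DSN, and table name are hypothetical placeholders, not references to the actual platform.

```python
# Illustrative sketch: compare a source CSV against the corresponding
# warehouse table by row count and an order-insensitive content checksum.
# All names below are hypothetical placeholders.
import hashlib

import pandas as pd
import pyodbc  # assumes a SQL Server ODBC driver is installed

SOURCE_FILE = "gl_extract.csv"                     # hypothetical source file
TARGET_TABLE = "staging.gl_extract"                # hypothetical warehouse table
CONN_STR = "DSN=FinanceDW;Trusted_Connection=yes"  # hypothetical connection


def frame_checksum(df: pd.DataFrame) -> str:
    """Hash a canonical (all-string, row-sorted) CSV rendering of the frame."""
    canonical = df.astype(str).sort_values(list(df.columns)).to_csv(index=False)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


source = pd.read_csv(SOURCE_FILE, dtype=str)
with pyodbc.connect(CONN_STR) as conn:
    loaded = pd.read_sql(f"SELECT * FROM {TARGET_TABLE}", conn)

if len(source) != len(loaded):
    print(f"Row-count mismatch: source={len(source)}, loaded={len(loaded)}")
# Restrict the target to the source's columns so warehouse-side audit
# columns (if any) do not affect the comparison.
elif frame_checksum(source) != frame_checksum(loaded[source.columns]):
    print("Content mismatch: row counts match but checksums differ")
else:
    print("Validation passed: row counts and checksums match")
```

In practice, pushing the comparison into the database (row counts plus per-column aggregates or hashes) scales better than pulling full tables into memory; this sketch just shows the shape of the check.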
Design, Develop, Maintain and Deploy ETL/ELT Pipelines
Provide root cause analysis and fixes/code changes for production issues.
Design, develop, and deploy data pipeline processes, data loaders, and data quality code to accommodate evolving business requirements (see the loader sketch after this list).
Document root cause analyses, resolutions and any code fixes/changes applied.
Collaborate with developers during stabilization of deployed code.
Communicate changes, corrections and updates to users and stakeholders.
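As a hedged illustration of the data loader item above: a minimal, idempotent truncate-and-load staging loader. The file, DSN, and table names are again hypothetical, it assumes the CSV header matches the table's column names, and a real loader would add logging, typing, and error handling.

```python
# Illustrative sketch: stage a CSV file into a SQL Server table using a
# truncate-and-load pattern so reruns are idempotent.
# All names below are hypothetical placeholders.
import csv

import pyodbc

SOURCE_FILE = "gl_extract.csv"
TARGET_TABLE = "staging.gl_extract"
CONN_STR = "DSN=FinanceDW;Trusted_Connection=yes"

with open(SOURCE_FILE, newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)   # assumed to match the target table's column names
    rows = list(reader)

placeholders = ", ".join("?" * len(header))
insert_sql = (
    f"INSERT INTO {TARGET_TABLE} ({', '.join(header)}) VALUES ({placeholders})"
)

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    cur.execute(f"TRUNCATE TABLE {TARGET_TABLE}")  # clear any prior load
    cur.fast_executemany = True                    # pyodbc bulk-insert option
    cur.executemany(insert_sql, rows)
    conn.commit()

print(f"Loaded {len(rows)} rows into {TARGET_TABLE}")
```

Truncate-and-load keeps reruns idempotent for full refreshes; an incremental load would instead use a merge/upsert keyed on a business key.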
Manage the Data Warehouse/Lakehouse Environments
This Includes, But Is Not Limited To The Following:
Administer the different data layers of the data warehouse/lakehouse to ensure compliance with organizational data governance and standard data architecture.
Establish production code deployment processes that enable Continuous Integration/Continuous Delivery (CI/CD).
Monitor server performance and cost utilization daily.
Coordinate with IT counterparts on server maintenance and downtime schedules.
Test servers/cloud data services for availability.
Escalate server problems encountered to IT counterparts.
Coordinate with data engineers, solution architects, and IT counterparts on storage and environment assessments and future requirements.
Training and Others
Build knowledge base and share best practices in the area of data engineering/administration.
Perform other additional tasks that may be assigned.
Attend and participate in meetings, conferences, workshops, chapel services, devotions, etc.
QUALIFICATIONS:
Candidate must possess a Bachelor's degree in IT, Computer Science, or a related field.
At least 2 years of work experience in SQL Server administration.
1 to 3 years of work experience in ETL development or support.
Experience in SQL Server administration, RDBMS, and Microsoft SSIS/OLAP cube administration.
Strong understanding of data pipeline (ETL/ELT) processes.
Proficiency in SQL, SSIS, or other ETL tools.
GOOD-TO-HAVE:
Knowledge of Python and Excel (advanced data manipulation).
Knowledge of data warehouse/data lake architecture.
Preferably knowledgeable in cloud data warehouse deployment on Azure/AWS (e.g., Microsoft Fabric, Microsoft DW technology stack) and data governance frameworks.
Knowledge of accounting/ERP, budget, and planning software is an advantage.
Experience with large data sets in a multinational organization.