At Globe, our goal is to create a wonderful world for our people, business, and nation. By uniting people of passion who believe they can make a difference, we are confident that we can achieve this goal.
Job Description
The Data Engineering Manager is responsible for designing, building, and maintaining automated data pipelines across multi-platform environments. This role ensures the integration, storage, and cleansing of data to support the organization’s data strategy. The Data Engineering Manager drives compliance with data governance standards and best practices while ensuring the development and optimization of data platforms.
DUTIES AND RESPONSIBILITIES:
1. Data Pipeline Design & Development
Lead the design, development, and optimization of automated data pipelines based on defined solution architectures.
Ensure seamless data ingestion, transformation, and loading processes that meet scalability, security, and business objectives.
Manage integration, storage, and cleansing of data to ensure readiness for downstream systems, including gold layer spokes and third-party outputs.
Implement end-to-end data flows using modern data engineering tools (e.g., Spark, Airflow, dbt, Snowflake, Databricks).
2. Data Engineering Strategy & Governance
Define and champion data engineering standards, frameworks, and coding practices to support scalable and sustainable product builds.
Ensure alignment with enterprise data governance policies and DevSecOps practices—including secure, auditable, and compliant processes.
Drive operational excellence by embedding data integrity, lineage, and auditability into all engineering workflows.
3. L3 Support, Maintenance & Optimization
Serve as the escalation point for L3 support, leading the resolution of complex pipeline and platform issues in coordination with QA and DevOps.
Conduct root cause analysis (RCA) for incidents, propose preventive actions, and implement long-term solutions.
Oversee system testing, performance tuning, and infrastructure optimization to maintain high availability and reliability.
4. Cross-Functional Collaboration & Stakeholder Engagement
Work closely with Solution Architects, Data Architects, Product Owners, and Infrastructure teams to ensure coherent execution of data products.
Engage with external partners and vendors to evaluate tools, platforms, and services that can enhance Globe’s data capabilities.
REQUIREMENTS:
3–7 years of progressive experience in ETL/ELT development, data pipeline design, and enterprise data engineering.
Proven track record in managing and optimizing automated data pipeline systems within Big Data and cloud-native environments.
Hands-on experience with distributed computing, data integration frameworks, and real-time streaming architectures.
Demonstrated experience in incident resolution, root cause analysis, and support for production-grade systems.
Experience in the telecom, fintech, or enterprise tech sector is a plus.
Level of Knowledge:
Advanced proficiency in data engineering tools and frameworks such as Airflow, dbt, and Kafka. Knowledge of Apache Spark, Talend, and NiFi is an advantage.
Understanding of data governance, data quality, metadata management, and enterprise security practices.
Strong working knowledge of cloud platforms (AWS preferred; GCP and Azure are a plus), including services like S3, Glue, EMR, or equivalent.
Strong command of SQL and PL/SQL for large-scale data manipulation and pipeline integration.
Familiarity with DevSecOps principles, including the use of CI/CD tools and automation pipelines, is an advantage.
Soft Skills:
Strong collaboration and interpersonal skills in cross-functional environments
Analytical mindset with structured problem-solving abilities
Excellent oral and written communication skills (English & Filipino)
Strategic thinking with an innovation-driven approach
Attention to detail and ability to manage multiple priorities in parallel
Technical Skills:
Big Data Tools: Airflow, dbt, Snowflake, Kafka. Apache Spark, Talend, NiFi, and Hadoop are an advantage.
Cloud Platforms: AWS (preferred). GCP and Azure are an advantage.
Languages & Tools: SQL, PL/SQL, Python (for scripting). Git and Terraform are an advantage.
Strong business acumen in data innovation and monetization
Knowledge of the telecommunications industry is an advantage.
DevSecOps: CI/CD pipelines, Infrastructure-as-Code, and secure data pipeline practices are an advantage
Compliance: Familiarity with DPA, GDPR, ISO 27001, and enterprise-level data governance frameworks is an advantage
Equal Opportunity Employer
Globe’s hiring process promotes equal opportunity for all applicants. No form of discrimination is tolerated at any stage of the employee lifecycle, including the hiring process: posting vacancies, selecting candidates, and interviewing applicants.
Globe’s Diversity, Equity and Inclusion Policy Commitment can be accessed here
Make Your Passion Part of Your Profession. Attracting the best and brightest Talents is pivotal to our success. If you are ready to share our purpose of Creating a Globe of Good, explore opportunities with us.