This role demands 6-8 years of hands-on experience in designing, building, and optimizing high-performance data pipelines using Snowflake, Azure Data Factory (ADF), Key Vault, Blob Storage, and Azure Databricks, with a strong focus on ETL, data integration, and real-time/batch processing.
Expertise in optimizing Snowflake on Azure, including query performance tuning, cost management, and scalable architecture design, is critical.
In this fast-paced, high-growth environment, the ideal candidate must have expert-level Python, SQL, and PySpark skills, coupled with deep proficiency in data analysis, validation, and integration to ensure data accuracy, consistency, and usability across large-scale telecom datasets.
A strong understanding of data governance, security, privacy, RBAC, encryption, and compliance frameworks is a must.
Success in this role requires exceptional agility in working with cross-functional teams, rapidly iterating on data solutions, and delivering insight-driven analytics to optimize customer acquisition, operational efficiency, and business performance.
Experience with data catalog management, metadata management, and lineage tracking is a strong plus, ensuring data discoverability and compliance.
The role requires a continuous focus on automating and refining ETL workflows, streamlining data operations, and applying current best practices in data engineering.
Additionally, exposure to customer and growth analytics is highly desirable, enabling deeper business insights and data-driven decision-making.
Power BI skills are a strong plus, enhancing data storytelling and decision-making through advanced visualizations.
Experience with CI/CD, DevOps, and agile delivery is required.
The ideal candidate will lead from the front, mentoring junior engineers, fostering a culture of innovation and continuous learning, and ensuring that data solutions are scalable, cost-effective, and future-ready.