Mumbai, Maharashtra, India
GCP Engineer (Pooling)
Job Requirements

Overview of the job      

Data Engineer–Data Platforms

This role reports to the Director, India Data Platforms, P&G

 

About Data Platforms Team

We take pride in managing the company's most valuable digital asset: data. Our vision is to deliver data as a competitive advantage for the Asia regional business by building unified data platforms, delivering customized BI tools for managers, and empowering insightful business decisions through AI. As a data solutions specialist, you'll work closely with business stakeholders to understand their needs and develop solutions to problems in supply chain, sales and distribution, consumer insights, and market performance.


In this role, you'll be constantly learning, staying up to date with industry trends and emerging technologies in data solutions. You'll have the chance to work with a variety of tools and technologies, including big data platforms, machine learning frameworks, and data visualization tools, to build innovative and effective solutions.

So, if you're excited about the possibilities of data, and eager to make a real impact in the world of business, a career in data solutions might be just what you're looking for. Join us and become a part of the future of digital transformation.

 

 

About P&G IT

Digital is at the core of P&G's accelerated growth strategy. With this vision, IT at P&G is deeply embedded in every critical process across business organizations comprising 11+ category units globally, creating impactful value through transformation, simplification, and innovation. IT at P&G is sub-divided into teams that work closely together to revolutionize business processes and deliver exceptional value and growth: Digital GTM, Digital Manufacturing, Marketing Technologist, Ecommerce, Data Sciences & Analytics, Data Solutions & Engineering, and Product Supply.


Responsibilities

Develop the cloud-based data and analytics platform, including integrating systems and implementing ELT/ETL jobs to fulfil business deliverables. Perform sophisticated data operations such as orchestration, transformation, and visualization on large datasets. Work with product managers to ensure superior product delivery that drives business value and transformation. Follow standard coding practices to ensure delivery excellence and reusability.

 

- Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments.
- Data Transformation: Implement data transformation processes, including data cleansing, normalization, and aggregation, to ensure data quality and consistency.
- Data Modeling: Develop and maintain data models and schemas to support efficient data storage and retrieval in Google Cloud platforms.
- Data Warehousing: Build data warehouses or data lakes using Google Cloud services such as BigQuery.
- Data Integration: Integrate data from multiple sources, both on-premises and cloud-based, using Cloud Composer or other relevant tools.
- Data Governance: Implement data governance practices, including data security, privacy, and compliance, to ensure data integrity and regulatory compliance.
- Performance Optimization: Optimize data pipelines and queries for improved performance and scalability in Google Cloud environments.
- Monitoring and Troubleshooting: Monitor data pipelines, identify and resolve performance issues, and troubleshoot data-related problems in collaboration with other teams.
- Data Visualization: Build BI reports to enable faster decision making.
- Collaboration: Work with product managers to ensure superior product delivery that drives business value and transformation.
- Documentation: Document data engineering processes, data flows, and system configurations for future reference and knowledge sharing.
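By way of illustration, the cleansing and aggregation steps described above can be sketched in plain Python. This is a minimal, service-agnostic example; the field names and sample records are hypothetical and not part of the role:

```python
from collections import defaultdict

def cleanse(rows):
    """Drop records missing required fields and strip whitespace (cleansing)."""
    return [
        {"region": r["region"].strip(), "sales": float(r["sales"])}
        for r in rows
        if r.get("region") and r.get("sales") not in (None, "")
    ]

def aggregate(rows):
    """Sum sales per region (aggregation), as a pipeline might do before
    loading results into a warehouse table."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["sales"]
    return dict(totals)

raw = [
    {"region": " APAC ", "sales": "120.5"},
    {"region": "APAC", "sales": "79.5"},
    {"region": "EMEA", "sales": "50"},
    {"region": "", "sales": "10"},  # dropped by cleanse(): missing region
]
print(aggregate(cleanse(raw)))  # {'APAC': 200.0, 'EMEA': 50.0}
```

In a real pipeline these steps would typically run as Spark jobs or SQL transformations orchestrated by Cloud Composer, but the cleanse-then-aggregate shape is the same.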

Work Experience

Qualifications:

- Experience: Bachelor's or master's degree in computer science, data engineering, or a related field, along with 2+ years of work experience in data engineering and cloud platforms.
- Google Cloud Development: Strong proficiency in Google Cloud services such as Spanner, Cloud Composer, and Looker Studio.
- ETL Tools: Experience with ETL (Extract, Transform, Load) tools and frameworks, such as Spark and Cloud Composer/Airflow, for data integration and transformation.
- Programming: Proficiency in programming languages such as PySpark, Python, and SQL for data manipulation, scripting, and automation.
- Data Modeling: Knowledge of data modeling techniques and experience with data modeling tools.
- Database Technologies: Familiarity with relational databases (e.g., Cloud SQL) for data storage and retrieval.
- Data Warehousing: Understanding of data warehousing concepts and dimensional modeling, and experience with data warehousing technologies such as BigQuery.
- Data Governance: Knowledge of data governance principles, data security, and privacy regulations (e.g., GDPR, CCPA), and experience implementing data governance practices.
- Data Visualization: Experience working with Looker Studio to build semantic data models and BI reports/dashboards.
- Cloud Computing: Familiarity with cloud computing concepts and experience working with cloud platforms, particularly Google Cloud Platform.
- Problem-Solving: Strong analytical and problem-solving skills to identify and resolve data-related issues.
- DevOps: Proficiency in DevOps and CI/CD tools (e.g., Terraform, GitHub).
- Familiarity with Azure, Databricks, and their relevant tech stacks would be an advantage in this role.
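The Cloud Composer/Airflow experience asked for above centers on declaring pipelines as DAGs of tasks that run in dependency order. A toy sketch of that ordering idea, using only the Python standard library (the task names here are hypothetical, not from the posting):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG in Cloud Composer sequences work.
dag = {
    "extract": set(),
    "cleanse": {"extract"},
    "load_bigquery": {"cleanse"},
    "build_report": {"load_bigquery"},
}

# static_order() yields tasks so every task runs after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'cleanse', 'load_bigquery', 'build_report']
```

Airflow expresses the same dependencies with operators and the `>>` syntax, and Cloud Composer runs the scheduler and workers as a managed service; the underlying model is this topological ordering.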
