Technical Skills:
1. 8+ years of experience in data engineering, ETL, and data warehousing.
2. Strong expertise in ETL tools (IBM Datastage).
3. Experience in SQL, PL/SQL, Oracle, and Teradata.
4. Experience with data modelling, data warehousing, and data governance.
5. Strong programming skills in Python and Linux/Unix shell scripting.
6. Familiarity with Agile development methodologies.
Nice to have:
1. Experience with cloud platforms (e.g., GCP, AWS, Azure).
2. Certification in GCP (e.g., Google Cloud Certified - Professional Data Engineer).
3. Experience with DevOps tools (e.g., Jenkins, GitLab CI/CD).
4. Experience with Scheduling tools (e.g., Autosys, Astronomer).
Education & Experience:
1. B.E/B.Tech/MCA/MSc in CS, IT, or a related branch.
2. 8+ years of experience in data engineering, ETL, and data warehousing.
3. GCP certification will be an added preference.

Key Responsibilities:
1. Design, develop, and deploy data pipelines using ETL tools (e.g., IBM Datastage).
2. Develop data warehousing solutions using Teradata and other related services.
3. Collaborate with data analysts to integrate data sources and create data models.
4. Ensure data quality, security, and compliance with organizational standards.
5. Optimize data pipeline performance, scalability, and reliability.
6. Implement data governance and metadata management best practices.
7. Participate in on-call rotations for production support.