BANGALORE, IND
1 day ago
Data Engineer-Data Platforms-AWS
**Introduction**

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation of success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

**Your role and responsibilities**

* Design, develop, and manage our data infrastructure on AWS, with a focus on data warehousing solutions.
* Write efficient, complex SQL queries for data extraction, transformation, and loading.
* Utilize DBT for data modelling and transformation.
* Use Python for data engineering tasks, demonstrating strong work experience in this area.
* Implement scheduling tools like Airflow, Control M, or shell scripting to automate data processes and workflows.
* Participate in an Agile environment, adapting quickly to changing priorities and requirements.
**Required technical and professional expertise**

* Proven expertise in AWS technologies, with a strong understanding of AWS services; experience with Redshift is optional.
* Experience in data warehousing with a solid grasp of SQL, including the ability to write complex queries.
* Proficiency in Python, with good work experience in data engineering tasks.
* Familiarity with scheduling tools like Airflow, Control M, or shell scripting.
* Excellent communication skills and a willingness to learn.

**Preferred technical and professional experience**

* Knowledge of DBT for data modelling and transformation is a plus.
* Experience with PySpark or Spark is highly desirable.
* Familiarity with DevOps, CI/CD, and Airflow is beneficial.
* Experience in Agile environments is a nice-to-have.