Kraków, PL
Senior Data Engineer

Digital & Technology Team (D&T) is an integral division of HEINEKEN Global Shared Services Center. We are committed to making Heineken the most connected brewery. That includes digitalizing and integrating our processes, ensuring best-in-class technology, and embedding a data-driven culture. By joining us, you will work in one of the most dynamic and innovative teams and have a direct impact on building the future of Heineken!

Would you like to meet the Team, see our office and much more? Visit our website: Heineken (heineken-dt.pl)

 

The Commerce DevOps Hub is being established. The newly created organization, an integral part of the Global Digital & Technology Function, is tasked with maintaining, and above all developing (functionally and technologically), the IT solutions supporting the Commerce area at Heineken. The Commerce DevOps Hub is located in Kraków and will bring together highly qualified IT professionals who contribute directly to the technological development of both Heineken Commerce and the Hub itself.
 

As a Senior Data Engineer in the Data Foundation scope, you have in-depth knowledge of the technical details of how the products work, and you will act as an expert in the team responsible for the technical configuration, development, and integration of products and platforms built on our tech stack, used by specific business processes across OpCos to better serve their customers.

You will provide expert guidance, ensure alignment with architecture and business objectives, and ensure on-time delivery. You will drive the end-to-end operations of the platforms, keeping them streamlined, structured, and within service level agreements.

You will collaborate closely with other Data and Analytics teams to build efficient development pipelines. You will work directly with the Product Owner(s) and Product Architect(s), understanding the business needs and translating them into specifications and services in line with overall engineering standards and roadmaps. You will be expected to implement new features, deliver high-quality code, follow agile methodologies, and be a team player. Most important is being part of the team effort towards value-driven outcomes and the successful completion of tasks. Your proactive approach will be key in maintaining comprehensive documentation and collaborating with your team members by offering help, raising questions, and actively taking part in all activities. You will serve as a key contributor in refining and driving excellence in solution engineering practices to deliver high-quality solutions throughout the software development lifecycle in our Data landscape.

The role reports directly to the Data Engineering Lead or Chapter Lead. 
 

Your responsibilities would include:

coach and mentor a team of experienced Data Engineers in designing, developing, and delivering scalable, reliable, and high-performing big data solutions

guide the design, development, and maintenance of scalable data pipelines and ETL processes; monitor and optimize data infrastructure performance, identifying and resolving bottlenecks and issues

coach and mentor the team from a technical standpoint, and drive operational excellence, including code reviews, design reviews, testing, and deployment processes

be an individual contributor (~60%), engineering the software products/solutions jointly with the team

ensure that the team adheres to coding standards, best practices, and architectural guidelines, oversee team spirit and team performance, guide and mentor team members

oversee the implementation of the technical architecture and solve immediate technical challenges

implement good practices, coding standards, and modern architecture for DataOps; be the “go-to” person for technical decisions and problem-solving within the team

ensure that DevSecOps practices are in place in the team's daily work

inspire, advise, and drive the selection of the development approach

coordinate software development and address technical debt in the team

hire, onboard, mentor, and develop top engineering talent, fostering a culture of learning, collaboration, and continuous improvement

take the lead when needed in technical discussions with other teams/departments and oversee state-of-the-art quality of the stack

be involved, when needed, in cross-functional discussions, representing the domain in broader technical conversations across domains

design and improve processes that enhance efficiency and quality

communicate with the Engineering Manager, Product Owner, Business Analyst, and Scrum Master to align on project/sprint goals, timelines, and resource allocation

 

You are a good match if you have:  

8+ years of experience in Data Engineering, with a strong understanding of data integration, ETL processes, and data warehousing

hands-on experience and in-depth knowledge of the technologies listed as mandatory in the Technology Stack section 

strong understanding and implementation of software development principles, coding standards, and modern architecture  

familiarity with data governance and compliance standards

hands-on experience in implementing and managing End-to-End DataOps / Data Engineering projects in a team  

proven ability to lead software development teams of engineers with varying experience and adapt to team sizes from small to large 

experience working on diverse projects with varying technologies, products, and systems

strong problem-solving skills and ability to make critical technical decisions  

ability to guide / mentor other team members 

effective communication and interpersonal skills, with the ability to collaborate with technical and non-technical stakeholders

proven ability to work independently as a self-starter

pragmatic and collaborative team player

 

You are a good match if you know: 

Must have (all levels):  

proficiency in programming languages such as Python and SQL, and experience with big data technologies like Hadoop, Spark, and Kafka

experience with cloud platforms (e.g., AWS, Azure, GCP) and data storage solutions (e.g., Databricks, BigQuery, Snowflake)

experience with CI/CD processes and tools, including Azure DevOps, Jenkins, and Git, to ensure smooth and efficient deployment of data solutions

familiarity with APIs to push and pull data between data systems and platforms

familiarity with reading software architecture high-level design documents and translating them into development tasks

Nice to have: 

Familiarity with the Microsoft data stack, such as Azure Data Factory, Azure Synapse, Databricks, Azure DevOps, and Fabric / Power BI

Experience with machine learning and AI technologies 

Data Modelling & Architecture  

ETL pipeline design  

Expert in Python, PySpark, and SQL

Azure Data Factory  

Azure DevOps  

Logging and Monitoring using Azure / Databricks services  

Apache Kafka 

 

At HEINEKEN Kraków, we take integrity and ethical conduct seriously. If someone has concerns about a possible violation of the legal regulations indicated in the Polish Whistleblowing Act or of our Code of Business Conduct, we encourage them to speak up. Cases can be reported to the global team or locally (in line with the local HGSS Whistleblowing procedure) by selecting the proper option in the reporting tool or via the hotline.
