Milano, USA
Data Engineer

WHO WE ARE

We are EssilorLuxottica, a global leader in the design, manufacture and distribution of ophthalmic lenses, frames and sunglasses. The Company brings together the complementary expertise of two industry pioneers, one in advanced lens technologies and the other in the craftsmanship of iconic eyewear, to create a vertically integrated business that is uniquely positioned to address the world’s evolving vision needs and the global demand of a growing eyewear industry.

With over 180,000 dedicated employees in 150 countries driving our iconic brands, our people are creative, entrepreneurial and celebrated for their unique perspectives and individuality. Committed to vision, we enable people to “see more and be more” thanks to our innovative designs and lens technologies, exceptional quality and cutting-edge processing methods. Every day we impact the lives of millions by changing the way people see the world.

JOB SCOPE AND MAIN RESPONSIBILITIES:

Along with having a passion and drive for data and analytics solutions, the Data Engineer in the Advanced Analytics Platform team will play a key role in designing, building, automating, and deploying data pipelines that serve the growing needs of the data science community at EssilorLuxottica, using cloud-native technologies on a modern data architecture platform. You will be responsible for developing and maintaining efficient, cost-effective big data solutions and for assisting in the industrialization of advanced analytics solutions.

AREAS OF RESPONSIBILITIES AND RELATED ACTIVITIES:

Collecting, storing, and processing large volumes of data from various sources.

Developing and maintaining data pipelines to ensure efficient and reliable data delivery.

Designing and implementing data models and data warehouses to support data analysis.

Working knowledge of public cloud technologies, preferably Azure.

Building and maintaining data infrastructure, including databases, data lakes, and data warehouses.

Monitoring and optimizing data pipeline performance, including identifying and resolving performance issues and bottlenecks.

The ability to work with CI/CD and DevOps processes is essential.

Keeping up to date with emerging technologies and trends in data engineering and statistical data analysis.

Collaborating with data scientists and analysts to develop data-driven solutions and insights.

Developing and implementing data quality standards and processes to ensure data accuracy and consistency.

Ensuring data security and compliance with regulatory requirements.

Troubleshooting and resolving data-related issues.

Conducting statistical data analysis to identify trends, patterns, and insights.

NETWORK OF INTERACTION:

INTERNAL: Business Units

EXTERNAL: Technology vendors such as Microsoft, Google, SAP, etc.

TECHNICAL SKILLS - PORTRAIT OF A PERFECT CANDIDATE

PySpark – Azure Databricks with Unity Catalog

ETL/ELT using Azure Data Factory

Knowledge of the Lakehouse architecture and Azure Data Lake services

Ability to work with several file formats such as Delta, Parquet, JSON, CSV, etc.

Knowledge of SAP CDC is a plus

Synapse workspaces and SQL pools

Orchestration and dependency management tools such as Data Factory/Airflow

SQL / scripting / Python / Scala / SFTP etc.

Working knowledge of React (frontend development) would be a plus

Technical Soft Skills:

Strong customer engagement skills to understand customer needs for analytics solutions.

Experience working with small and large teams to deliver analytics solutions for customers.

Demonstrated ability to define, develop, and implement data models and supporting policies, standards, and guidelines.

Strong data analysis and analytical skills.

Demonstrated ability to guide the development of data requirements for projects and to drive a user experience that is easy to use and delivers the right information at the right time.

Experience working in a fast-paced agile environment.

Strong problem-solving and troubleshooting skills.