Paris, France
Senior Data Engineer

We’re on a mission to make Algolia a data-driven organization, and we’re looking for a Senior Data Engineer to join our Data team and help us get there.

Important: This position is only open to candidates located in France, the UK, Romania, and the US, even for fully remote roles. Applications from other locations will not be considered.

Note: A minimum of 4 years of experience as a Data Engineer is required. Experience with Python, Airflow, and Spark is mandatory.

Our Data team aims to help other teams make better decisions by providing relevant data while ensuring its integrity and consistency. The team has a significant impact, as we work closely with business analysts on our Operations, Marketing, Product, and Infrastructure teams to help them discover meaningful trends.

Our Data team is composed of a Data Engineering team (building pipelines and scaling the data lake and warehouse), an Analytics Engineering team (building standardized data models), and Data Analysts.

As a member of the Data Engineering team, your daily responsibilities will include:

- Developing data pipelines and ETL workflows, mainly with Python, Airflow (AWS MWAA), Spark (AWS Glue, EMR), and other AWS services managed with Terraform (see the sketch after this list).
- Improving and maintaining our data warehouse (AWS Redshift).
- Supporting the Analytics Engineering team in building dbt data models used by Analysts in business reports.
- Interacting with Engineering and Business teams to understand requirements.
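
To give a flavor of this stack, here is a minimal, illustrative Airflow DAG of the kind you would build; the DAG name, task names, and placeholder logic are hypothetical, not our production code.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: pull raw event files from a source system into the data lake.
    ...


def load_to_warehouse(**context):
    # Placeholder: load the transformed data into the warehouse (e.g. Redshift).
    ...


with DAG(
    dag_id="example_events_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Airflow 2.x style scheduling
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load

In practice such a DAG would more likely trigger Spark jobs on AWS Glue or EMR through the Amazon provider operators rather than do the work in Python tasks, but the orchestration pattern is the same.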

We keep an open mind and eagerly experiment with new technologies to achieve our goals, continuously challenging our choices along the way. You will have the opportunity to shape some of those decisions.

As a senior engineer joining the team, you will also mentor junior engineers and help them grow and succeed at Algolia.

With Algolia’s rapid growth, there will be plenty of data challenges to tackle. Are you ready?

Key responsibilities:

- Design, build, enrich, and scale up our data pipelines.
- Work with engineers, data analysts, and business analysts to capture and model data.
- Monitor data integrity and growth (a minimal example check is sketched after this list).
- Help improve our data lake and data warehouse architecture to increase performance, simplicity, and user autonomy.
- Help analysts industrialize reports and dashboards to improve company productivity.
- Ensure every product is released with a data-driven approach.
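
As one hedged illustration of the data-integrity point above, here is a minimal row-count growth check against a Postgres-compatible warehouse such as Redshift; the table, column, and connection details are hypothetical:

import psycopg2  # Redshift speaks the Postgres wire protocol


def check_daily_row_growth(conn_params: dict, table: str, min_new_rows: int) -> bool:
    """Return True if `table` gained at least `min_new_rows` rows in the last day."""
    # Assumes a `loaded_at` timestamp column (hypothetical); DATEADD/GETDATE are Redshift SQL.
    query = f"SELECT COUNT(*) FROM {table} WHERE loaded_at >= DATEADD(day, -1, GETDATE())"
    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            (new_rows,) = cur.fetchone()
    return new_rows >= min_new_rows


# Example usage with placeholder credentials:
# check_daily_row_growth(
#     {"host": "cluster.example.com", "dbname": "analytics", "user": "monitor", "password": "..."},
#     table="events",
#     min_new_rows=100_000,
# )

Real checks would more likely live in dbt tests or a dedicated data-quality tool, but the idea is the same: assert expectations about the data on every run.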

You might be a good fit if you have:

- Strong experience designing and building data pipelines.
- Experience orchestrating pipelines with Airflow.
- Professional knowledge of Python.
- Strong experience working with cloud platforms and architecting solutions on them.
- Strong experience with data warehouses.
- Experience with infrastructure topics specific to data engineering.
- Interest in big data challenges (~600 TB of data, +14 TB per week; ~700M files, +1.2M per week; ~300 dbt models; ~70 Airflow DAGs).
- Interest in understanding the data and business requirements.
- Excellent spoken and written English skills.
- A humble, curious, proactive mindset, and a balance of creativity, resourcefulness, and pragmatism.

Nice to have:

- Experience using and managing AWS Redshift.
- Experience with a parallel data processing framework such as Apache Spark (see the sketch after this list).
- Experience analyzing data quality with dbt.
- Experience at a company at our current stage and beyond ($50-200M ARR range, high growth, lots of change, and building internal infrastructure).
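
For a flavor of the parallel processing involved, here is a minimal PySpark sketch of a daily aggregation job; the S3 paths and column names are hypothetical placeholders:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_daily_rollup").getOrCreate()

# Read raw event files from the lake, aggregate per day, write back partitioned.
events = spark.read.parquet("s3://example-bucket/raw/events/")  # placeholder path
daily = (
    events
    .withColumn("day", F.to_date("event_ts"))   # assumes an `event_ts` column
    .groupBy("day", "app_id")                   # assumes an `app_id` column
    .agg(F.count("*").alias("event_count"))
)
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/curated/daily_event_counts/"  # placeholder path
)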