Job Title: Senior Data Engineer
Location: Fully Remote
Employment Type: Contract (initial 6 months, with the potential for extension)
About the Role:
Are you a seasoned Data Engineer with a passion for designing and delivering scalable, high-performing data solutions? As a Senior Data Engineer, you will work closely with engineering, product, and data teams to build and maintain cutting-edge data platforms and pipelines while mentoring the next generation of engineers.
Key Responsibilities:
Partner with global clients to deliver scalable, robust data solutions.
Design, build, and maintain data platforms with a focus on performance, reliability, and security.
Integrate diverse data sources and optimize data processing workflows.
Address challenges proactively, resolve blockers, and foster effective communication across distributed teams.
Continuously enhance data systems to align with evolving business needs.
Onboard and mentor new engineers, fostering growth for clients and internal teams.
Required Qualifications:
5+ years in a senior developer role with hands-on experience building data processing pipelines.
Proficiency with Google Cloud Platform (GCP) services, particularly BigQuery and its SQL dialect, for large-scale data processing.
Extensive experience with Apache Airflow, including authoring DAGs, configuring triggers, and optimizing workflows (see the illustrative sketch after this list).
Expertise in data partitioning, batch configuration, and performance tuning for terabyte-scale processing.
Strong proficiency in Python and/or Scala, plus a solid grasp of SQL and modern data frameworks and platforms (e.g., Spark, Databricks, Snowflake).
Experience with unit testing, pre-commit checks, and strict type enforcement for data pipelines.
In-depth understanding of relational and NoSQL databases, data modeling, and data warehousing concepts.
Excellent oral and written English communication skills.
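To give candidates a concrete sense of the Airflow and BigQuery work described above, here is a minimal, illustrative sketch of a daily, partition-aware pipeline task. The DAG id, project, and table names (example_project.analytics.events and daily_counts) are hypothetical placeholders, not an actual client pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_daily_partition(partition_date: str) -> None:
    """Aggregate one day of events into a rollup table (all table names are placeholders)."""
    client = bigquery.Client()
    query = """
        INSERT INTO `example_project.analytics.daily_counts` (event_date, user_id, event_count)
        SELECT @partition_date, user_id, COUNT(*) AS event_count
        FROM `example_project.analytics.events`
        WHERE event_date = @partition_date  -- assumes partitioning on event_date, so this prunes to one partition
        GROUP BY user_id
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("partition_date", "DATE", partition_date),
        ]
    )
    # Blocks until the query job finishes; raises if the job errored.
    client.query(query, job_config=job_config).result()


with DAG(
    dag_id="daily_events_rollup",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_daily_partition",
        python_callable=load_daily_partition,
        op_kwargs={"partition_date": "{{ ds }}"},  # Airflow renders the run's logical date as YYYY-MM-DD
    )
```

Note that a plain INSERT is not rerun-safe; in production one would typically MERGE or overwrite the target partition so retries stay idempotent, which is exactly the kind of reliability trade-off this role owns.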
Preferred Skills:
Expertise in optimizing BigQuery performance, using tools such as Query Profiler, and in resolving compute resource bottlenecks.
Experience developing or testing custom operators in Apache Airflow (a minimal sketch follows this list).
Familiarity with Docker, Kubernetes, Helm, Terraform, Kafka, and CI/CD pipelines for data environments.
English proficiency at B2+ level or above.
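For the custom-operator and unit-testing items, the sketch below shows one common pattern: a small operator subclassing BaseOperator plus a pytest-style test that calls execute() directly with a stub, no scheduler required. The class name, arguments, and threshold are illustrative assumptions, not an existing internal operator.

```python
from airflow.models.baseoperator import BaseOperator


class RowCountCheckOperator(BaseOperator):
    """Hypothetical data-quality operator: fails the task if a day has too few rows."""

    template_fields = ("partition_date",)  # lets Airflow render "{{ ds }}" at runtime

    def __init__(self, *, row_counter, partition_date: str, min_rows: int = 1, **kwargs):
        super().__init__(**kwargs)
        self.row_counter = row_counter      # callable: (partition_date: str) -> int
        self.partition_date = partition_date
        self.min_rows = min_rows

    def execute(self, context):
        count = self.row_counter(self.partition_date)
        if count < self.min_rows:
            raise ValueError(
                f"Expected at least {self.min_rows} rows for {self.partition_date}, got {count}"
            )
        return count


def test_row_count_check_passes():
    # Operators can be unit tested without a scheduler: instantiate and call execute() directly.
    op = RowCountCheckOperator(
        task_id="check_rows",
        row_counter=lambda _date: 42,  # stub in place of a real BigQuery lookup
        partition_date="2024-01-01",
        min_rows=10,
    )
    assert op.execute(context={}) == 42
```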
Apply Now!
Reference: CR/123083_1736316449