The Big Data Engineer will be responsible for building and maintaining data pipelines and workflows that support ML, BI, analytics, and software products within the Fleet domain. This individual will work closely with data scientists, data engineers, analysts, and SMEs within the business to deliver new and exciting products and services. The main objective is to develop data pipelines and fully automated workflows that drive operational efficiency and effectiveness by enabling data-driven decisions across the organization. This includes fostering collaboration, building partnerships, co-developing products, sharing knowledge, and providing insights and valuable predictive information that help business teams and leaders spot potential risks and opportunities and drive change.
The starting salary for this role is $90K, commensurate with experience.
What you'll do:
- Build, maintain, optimize, and automate data pipelines in both cloud and on-premise infrastructure.
- Perform data acquisition, ingestion, cleansing, transformation, and migration.
- Design, implement, and test data integrations and ETL processes, and provide detailed documentation.
- Improve data integrity and quality, and implement modern solutions and frameworks to streamline existing processes.
- Identify trends in datasets, develop workflows, and support streaming analytics to take advantage of raw data flows.
- Identify opportunities for data acquisition from internal and external sources.
- Curate rich datasets and prepare data for predictive and prescriptive modeling.
- Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
- Ensure data sets are thoroughly tested before release to production environments.

What we're looking for:
- A bachelor's degree or higher in Computer Science, Information Systems, Information Technology, or a related field.
- Minimum of 5 years of hands-on experience in a data engineering role.
- Minimum of 2 years of intensive experience with cloud-based Big Data, BI, and analytics technologies, preferably AWS.
- Strong expertise with Big Data tools and related technologies such as Databricks/Spark/PySpark, Airflow, Glue, and Athena, and with ETL tools such as Informatica or StreamSets.
- Experience with event-driven architectures and data streaming pub/sub technologies such as Kafka and Amazon Kinesis.
- Strong capabilities in scripting languages such as Python, R, and Scala.
- Experience with data virtualization tools such as Denodo or Dremio.
- Strong interpersonal and communication skills, with Agile/Scrum experience.
- Strong problem-solving and critical-thinking skills, with a proven record of identifying and diagnosing problems and solving complex problems with simple, logical solutions.
- Outstanding verbal, written, presentation, facilitation, and interaction skills, including the ability to effectively communicate technical and non-technical issues and concepts at multiple levels of the organization.

What you'll get:
- Up to 40% off the base rate of any standard Hertz rental
- Paid Time Off
- Medical, Dental & Vision plan options
- Retirement programs, including 401(k) employer matching
- Paid Parental Leave & Adoption Assistance
- Employee Assistance Program for employees & family
- Educational Reimbursement & Discounts
- Voluntary Insurance Programs: Pet, Legal/Identity Theft, Critical Illness
- Perks & Discounts: theme park tickets, gym discounts & more