Remote, Massachusetts, United States of America
Data Engineer III, Development

Work Schedule

Standard (Mon-Fri)

Environmental Conditions

Office

Job Description

COMPANY: Thermo Fisher Scientific Inc.

LOCATION: 168 Third Avenue, Waltham, MA 02451

TITLE: Data Engineer III, Development

HOURS: Monday to Friday, 8:00 am to 5:00 pm

DUTIES:

· Design, develop, test, deploy, support, and enhance data integration solutions to connect and integrate Thermo Fisher enterprise systems and data pipelines in our Data Science and Enterprise Data Platform.
· Innovate for data integration in an Apache Spark-based platform to ensure the technology solutions leverage cutting-edge integration capabilities.
· Facilitate requirements-gathering and process-mapping workshops; review business and functional requirement documents; author technical design documents, testing plans, and scripts.
· Implement standard operating procedures, facilitate review sessions with functional owners and end-user representatives, and leverage technical knowledge and expertise to drive improvements.
· Define, design, and document reference architecture.
· Implement BI and analytical solutions.
· Follow agile development methodologies to deliver solutions and product features by following DevOps practices.
· Can work remotely or telecommute.

TRAVEL: None.

REQUIREMENTS:

MINIMUM Education Requirement: Master’s degree or foreign degree equivalent in Computer Science, Information Systems, or a related field of study.

MINIMUM Experience Requirement: None.

Alternative Education and Experience Requirement: Bachelor’s degree or foreign degree equivalent in Computer Science, Information Systems, or a related field of study and 3 years of experience as a Database Administrator, Data Engineer, or a related occupation.
Required knowledge or experience with:

• Databricks, Data Lake, Delta Lake, Oracle, and Redshift;
• Data engineering pipeline development;
• Apache Spark, Glue, Kafka, and Elastic Search;
• Lambda, S3, Redshift, RDS, MongoDB, and DynamoDB ecosystems;
• PySpark in AWS Cloud;
• Python and common Python libraries;
• Database queries, query optimization, debugging, user-defined functions, views, and indexes;
• Source control systems such as Git, and continuous integration tools such as Jenkins;
• Databricks and Informatica; and
• Development methodology and functional and technical design specifications.

#LI-DNI
