Westlake, TX, US
Senior Systems Engineer
Job Description:

Position Description:

Builds Extract, Transform, and Load (ETL) workflows and data-driven solutions using Python. Evaluates new technologies and market trends through research and Proof of Concept (POC) work. Improves and innovates on current data protection and security technology offerings. Owns and continuously optimizes tools, processes, and capabilities to support operational activities.
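
By way of illustration, the sketch below shows a minimal ETL workflow of the kind described, written in Python. The CSV source, the SQLite target, and all file, table, and column names are assumptions invented for the example, not details from the role.

```python
# Minimal ETL sketch: extract rows from a CSV file, transform them,
# and load them into a local SQLite table. All names are illustrative.
import csv
import sqlite3

def extract(path):
    """Yield raw rows from a CSV source file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Normalize fields and drop incomplete records."""
    for row in rows:
        if row.get("user_id") and row.get("amount"):
            yield {"user_id": row["user_id"].strip(),
                   "amount": float(row["amount"])}

def load(records, db_path="warehouse.db"):
    """Write transformed records into the target table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS payments (user_id TEXT, amount REAL)")
    con.executemany("INSERT INTO payments VALUES (:user_id, :amount)", list(records))
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("payments.csv")))
```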

Primary Responsibilities:

Delivers impactful, scalable, flexible, and efficient data solutions that conform to architectural designs and Fidelity's technology strategy. Builds and owns a portfolio of policies, procedures, and best practices to provide operational and engineering discipline and to evolve data protection and security technologies. Employs design patterns and generalizes code to address common use cases. Authors high-quality, reusable code to contribute to broader repositories. Analyzes information to determine, recommend, and plan computer software specifications on major projects. Proposes modifications and improvements based on user needs. Develops software system tests and validation procedures, programs, and documentation.
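
As a hedged sketch of what employing design patterns and generalizing code can look like in Python, the snippet below uses a simple registry pattern so that common transforms are written once and reused across pipelines. The registry, the transform names, and the masking rule are all hypothetical.

```python
# Hypothetical registry-of-transforms pattern: shared, named transform
# functions that many pipelines can reuse instead of duplicating logic.
from typing import Callable, Dict, List

TRANSFORMS: Dict[str, Callable[[str], str]] = {}

def register(name: str):
    """Decorator that adds a transform function to the shared registry."""
    def wrap(fn: Callable[[str], str]) -> Callable[[str], str]:
        TRANSFORMS[name] = fn
        return fn
    return wrap

@register("strip")
def strip_whitespace(value: str) -> str:
    return value.strip()

@register("mask")
def mask_value(value: str) -> str:
    # Crude masking for demonstration only; real data-protection rules
    # would be policy-driven, not hard-coded.
    return value[:2] + "*" * max(len(value) - 2, 0)

def apply_pipeline(value: str, steps: List[str]) -> str:
    """Run a value through a sequence of registered transforms."""
    for step in steps:
        value = TRANSFORMS[step](value)
    return value

print(apply_pipeline("  4111111111111111  ", ["strip", "mask"]))
```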

Education and Experience:

Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and three (3) years of experience as a Senior Systems Engineer (or closely related occupation) developing applications using Apache Spark, Hadoop, or Snowflake within a distributed cloud data warehouse or data lake environment.

Or, alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and one (1) year of experience as a Senior Systems Engineer (or closely related occupation) developing applications using Apache Spark, Hadoop, or Snowflake within a distributed cloud data warehouse or data lake environment.

Skills and Knowledge:

Candidate must also possess:

- Demonstrated Expertise (“DE”) developing, designing, and implementing comprehensive data processing solutions using PySpark, Scala, or Apache Spark; executing large-scale transformations and analytics within a cloud environment on Amazon Web Services (AWS) using AWS S3, EMR, Glue, AWS Lambda, or AWS Step Functions; and conducting performance tuning and optimization of Spark jobs and ETL processes using the Spark UI or Spark DAG to ensure optimal efficiency and throughput for seamless data processing (see the first sketch following this list).

- DE designing and implementing comprehensive solutions; developing full-stack web applications, establishing automation frameworks, and executing proof-of-concept initiatives using OOP (Java), Python, or functional programming (Clojure or Scala) to explore and integrate cutting-edge cloud technologies; and developing RESTful APIs using React JS, OOP (Java), Python, functional programming (Clojure or Scala), Docker, Kubernetes, or serverless technologies to stay at the forefront of advancements in cloud computing (see the second sketch).

- DE developing Continuous Integration/Continuous Delivery (CI/CD) pipelines and Infrastructure as Code (IaC) templates in Terraform or AWS CloudFormation; configuring and integrating DevOps tools and build and release management; and automating self-service jobs, deploying applications, and orchestrating disaster recovery using Jenkins, Python, shell scripts, or Source Code Management (SCM) tools (Bitbucket or GitHub) in a hybrid on-premises and cloud environment (Amazon Web Services (AWS) or Google Cloud Platform (GCP)) (see the third sketch).

- DE installing, configuring, and managing data security solutions using Guardium; navigating and integrating between relational databases (Oracle, MySQL, or PostgreSQL), NoSQL databases (MongoDB, Cassandra, or graph databases), cloud databases (Aurora, DynamoDB, or ElastiCache), and object/block storage technologies; and executing data modeling principles and delving into database internals to ensure expert design and optimization across a broad spectrum of platforms using Apache NiFi or the AWS Glue Data Catalog (see the fourth sketch).
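
For the first item above, a hedged PySpark sketch of a large-scale transformation that reads from and writes back to S3. The bucket paths, the event schema, and the aggregation are placeholder assumptions.

```python
# Illustrative Spark job: filter purchase events, aggregate revenue by
# day, and write a partitioned Parquet dataset back to S3.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/events/")

daily = (events
         .where(F.col("event_type") == "purchase")
         .groupBy(F.to_date("event_ts").alias("day"))
         .agg(F.sum("amount").alias("revenue")))

# Partitioning by day keeps downstream scans cheap; shuffle behavior
# would be tuned by inspecting the Spark UI and the job's DAG.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/curated/daily_revenue/")
```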
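
For the second item, a minimal sketch of a RESTful endpoint built on serverless technology: an AWS Lambda handler behind an API Gateway proxy integration. The routes and payload shape are assumptions for illustration only.

```python
# Hypothetical Lambda handler serving a tiny REST API: GET returns a
# health status, POST echoes the received JSON item back to the caller.
import json

def lambda_handler(event, context):
    method = event.get("httpMethod", "GET")
    if method == "GET":
        return {"statusCode": 200, "body": json.dumps({"status": "ok"})}
    if method == "POST":
        item = json.loads(event.get("body") or "{}")
        # A real handler would validate and persist the item,
        # for example into DynamoDB.
        return {"statusCode": 201, "body": json.dumps({"received": item})}
    return {"statusCode": 405, "body": json.dumps({"error": "method not allowed"})}
```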
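
For the third item, a sketch of self-service deployment automation in Python, driving an AWS CloudFormation stack through boto3. The stack name and template path are hypothetical, and a production pipeline would add handling for no-op updates.

```python
# Create the stack if it does not exist, otherwise update it, then
# block until CloudFormation reports completion.
import boto3

def deploy(stack_name="example-stack", template_path="template.yaml"):
    cf = boto3.client("cloudformation")
    with open(template_path) as f:
        body = f.read()
    try:
        cf.create_stack(StackName=stack_name, TemplateBody=body,
                        Capabilities=["CAPABILITY_NAMED_IAM"])
        waiter = cf.get_waiter("stack_create_complete")
    except cf.exceptions.AlreadyExistsException:
        cf.update_stack(StackName=stack_name, TemplateBody=body,
                        Capabilities=["CAPABILITY_NAMED_IAM"])
        waiter = cf.get_waiter("stack_update_complete")
    waiter.wait(StackName=stack_name)

if __name__ == "__main__":
    deploy()
```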
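
For the fourth item, a hedged sketch of integrating between a relational and a NoSQL database: records are read from PostgreSQL and upserted into MongoDB. The connection strings, table, and collection are assumptions, and the snippet presumes psycopg2 and pymongo are installed.

```python
# Copy customer rows from PostgreSQL into a MongoDB collection.
# Upserting by _id keeps the job safe to re-run.
import psycopg2
from pymongo import MongoClient

pg = psycopg2.connect("dbname=appdb user=etl host=localhost")
mongo = MongoClient("mongodb://localhost:27017")

with pg, pg.cursor() as cur:
    cur.execute("SELECT id, name, email FROM customers")
    docs = [{"_id": r[0], "name": r[1], "email": r[2]} for r in cur.fetchall()]

coll = mongo.appdb.customers
for doc in docs:
    coll.replace_one({"_id": doc["_id"]}, doc, upsert=True)
```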

