Position Description:
Conducts data analysis and integration across multiple source repositories for a wide variety of structured and unstructured data, leveraging tools such as PostgreSQL, Oracle, and Hadoop to work in relational databases and big data platforms. Works with ETL/ELT data-movement technologies to develop capabilities, products, and services that improve investment outcomes and enrich the customer experience. Builds processes supporting data transformation, data structures, metadata, dependency management, and workload management using programming languages such as Java or Python. Conceptualizes, builds, tests, and deploys innovative Artificial Intelligence (AI) models and capabilities in B2B sales and service for a large, fast-growing business unit. Ensures QA readiness of software deliverables.
Primary Responsibilities:
Develops complex or multiple software applications and conducts studies of alternatives to translate divisional initiatives into business solutions.
Analyzes and recommends changes in project development policies, procedures, standards, and strategies to development experts and management.
Participates in architecture design teams.
Defines and implements application-level architecture.
Develops applications, components, and subsystems to support division-wide complex projects.
Recommends development testing tools and methodologies.
Reviews and validates test plans.
Develops comprehensive documentation for multiple applications or subsystems.
Establishes full project life cycle plans for complex projects across multiple platforms.
Advises on risk assessment and risk management strategies for projects.
Plans and coordinates project schedules and assignments for multiple projects.
Provides technology solutions to daily issues and estimates technical evaluation requirements for technology initiatives.
Advises senior management on technical strategy.
Mentors junior team members.
Performs independent and complex technical and functional analysis for multiple projects in support of divisional initiatives.
Confers with systems analysts and other software engineers/developers to design systems and to obtain information on project limitations and capabilities, performance requirements and interfaces.
Develops and oversees software system tests and validation procedures, programs, and documentation.
Education and Experience:
Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Information Management, Business Administration, or a closely related field and six (6) years of experience as a Director, Digital Assets Engineer (or closely related occupation) developing data warehouse and big data solutions, using Amazon Web Services (AWS) in a financial services environment.
Or, alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Information Management, Business Administration, or a closely related field and four (4) years of experience as a Director, Digital Assets Engineer (or closely related occupation) developing data warehouse and big data solutions, using Amazon Web Services (AWS) in a financial services environment.
Skills and Knowledge:
Candidate must also possess:
Demonstrated Expertise ("DE") building Cloud data solutions: sourcing and analyzing large data sets from heterogeneous sources (Oracle, PostgreSQL, files, Netezza, Kafka, and Hive) in XML, JSON, and text formats, using AWS services, Snowflake, and relational databases (Oracle and PostgreSQL) as the data store for structured and semi-structured data within the financial services industry.
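As a purely illustrative sketch of the heterogeneous-source work described above (not part of the posting itself), records arriving in JSON, XML, and delimited-text formats can be normalized into one structure before loading; the field names (trade_id, amount) and sample values are invented for the example.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def normalize(raw: str, fmt: str) -> dict:
    """Parse a raw record in 'json', 'xml', or 'text' (pipe-delimited) form
    into a flat dict. Field names here are hypothetical examples."""
    if fmt == "json":
        data = json.loads(raw)
        return {"trade_id": data["trade_id"], "amount": float(data["amount"])}
    if fmt == "xml":
        root = ET.fromstring(raw)
        return {"trade_id": root.findtext("trade_id"),
                "amount": float(root.findtext("amount"))}
    if fmt == "text":
        row = next(csv.reader(io.StringIO(raw), delimiter="|"))
        return {"trade_id": row[0], "amount": float(row[1])}
    raise ValueError(f"unknown format: {fmt}")

# One record from each hypothetical source, reduced to the same shape.
records = [
    normalize('{"trade_id": "T1", "amount": "100.50"}', "json"),
    normalize("<rec><trade_id>T2</trade_id><amount>75</amount></rec>", "xml"),
    normalize("T3|42.25", "text"),
]
```

In a real pipeline the normalized rows would land in a staging table in Snowflake, Oracle, or PostgreSQL rather than a Python list.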
DE designing, developing, and automating data processing using Hadoop Big Data technologies (Flume, Sqoop, Hive, Java MapReduce, and the Spark framework) in on-premises and AWS environments.
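The MapReduce-style processing named above can be sketched, for illustration only, as a pure-Python map/shuffle/reduce over in-memory data, standing in for a Java MapReduce or Spark job; the input lines and keys are invented.

```python
from collections import defaultdict
from itertools import chain

def map_phase(line: str):
    # Mapper: emit (key, 1) pairs, one per token.
    for word in line.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle: group values by key; Reduce: sum each group.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

lines = ["AAPL buy", "MSFT sell", "aapl buy"]
counts = reduce_phase(chain.from_iterable(map_phase(l) for l in lines))
```

The same mapper/reducer contract scales out on a cluster; the framework (Hadoop or Spark) handles partitioning and the shuffle that this sketch performs in memory.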
DE provisioning AWS Cloud infrastructure components (EC2, EMR, Lambda, RDS, and Elasticsearch); facilitating and deploying data pipelines; performing source code management and version control for Continuous Integration/Continuous Delivery (CI/CD) to process financial industry data, using AWS CloudFormation templates, Docker, and DevOps tools (Maven, Git, Stash, Jenkins, and Jira); writing complex SQL scripts for loading enterprise warehouses for analytics data preparation, using SQL Workbench, SQL Developer, DbVisualizer, Toad, and DBeaver; and creating model-ready data and reports, and analyzing data to influence product strategy, using SQL, PL/SQL, and Python.
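The warehouse-loading SQL described above can be sketched against an in-memory SQLite database standing in for Oracle, PostgreSQL, or Snowflake; the table and column names are hypothetical, chosen only to show the staging-to-analytics pattern.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: raw rows as landed from upstream feeds.
cur.execute("CREATE TABLE stg_trades (account TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO stg_trades VALUES (?, ?)",
    [("A1", 100.0), ("A1", 50.0), ("A2", 25.0)],
)

# Analytics table: one aggregated row per account.
cur.execute("CREATE TABLE fact_account_totals (account TEXT, total REAL)")
cur.execute(
    """
    INSERT INTO fact_account_totals (account, total)
    SELECT account, SUM(amount)
    FROM stg_trades
    GROUP BY account
    """
)
totals = dict(cur.execute("SELECT account, total FROM fact_account_totals"))
```

In production the INSERT...SELECT step would typically be scheduled and parameterized, with the staging load handled by the pipeline tooling named above.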
DE designing and developing complex financial data replication processes using a parallel microservice framework in Python, Oracle PL/SQL, and ETL/ELT, with data integration tools (Informatica or Ab Initio), UNIX scripting, and Autosys; streaming real-time data pipelines using Apache Airflow, Kubernetes, Docker, Kafka Streams, AWS services (S3 files, S3 events, Lambda, and SQS), Snowflake services (Snowpipe, Streams, and Tasks), Oracle GoldenGate for Big Data, and Python; indexing large volumes of unstructured documents to enable search, using Apache Solr, Elasticsearch, or Lucidworks; and building real-time visualizations of financial and market data using Logstash, Filebeat, and Kibana.
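The streaming pattern listed above (event, queue, transform, sink) can be sketched with the standard-library queue module standing in for Kafka or SQS; the message shape and sink are invented for illustration.

```python
import json
import queue

events = queue.Queue()   # stands in for a Kafka topic or SQS queue
sink = []                # stands in for a Snowflake table or S3 bucket

# Producer: an upstream source (e.g. an S3 event notification) enqueues messages.
for i in range(3):
    events.put(json.dumps({"file": f"prices-{i}.json", "rows": 10 * (i + 1)}))

# Consumer: drain the queue, transform each message, and write to the sink.
while not events.empty():
    msg = json.loads(events.get())
    sink.append({"file": msg["file"], "rows": msg["rows"], "status": "loaded"})
```

In a real deployment the consumer would be a Lambda function or an Airflow-triggered task, and delivery into Snowflake would go through Snowpipe rather than an in-memory list.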
Salary: $189,000.00 – $199,000.00/year.
#PE1M2
Certifications: