Job Description:
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day.
One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being.
Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization.
Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!
Responsibilities
Develop and deliver complex data solutions to accomplish technology and business goals.
Perform code design and delivery tasks associated with the integration, cleaning, transformation, and control of data in operational and analytics data systems.
Implement multiple complex data solutions and share knowledge of the interactions across multiple data flows and systems.
Ensure a consistent design and engineering approach for complex data solutions that span multiple data flows and systems.
Conduct research, design prototypes, and carry out other exploration activities, including evaluating new toolsets.
Design, develop, and maintain software solutions in Big Data Platforms.
Build end-to-end (E2E) data engineering solutions using Big Data technologies.
Understand and utilize the Hadoop ecosystem, including Cloudera.
Utilize HDFS, MapReduce, Hive, Impala, Linux, and Unix technologies to monitor Hadoop cluster performance, debug issues causing performance degradation, and write queries to extract data for leadership analysis and the creation of Tableau dashboards.
Utilize Unix shell scripting and programming in Python, Scala, and Java.
Analyze existing shell scripts and Python code to debug issues.
Remote work may be permitted within a commutable distance from the worksite.
Required Skills & Experience
Master's degree or equivalent in Applied Computer Science, Computer Information Systems, Management Information Systems, Engineering (any), or a related field; and
3 years of experience in the job offered or a related IT occupation.
Must include 3 years of experience in each of the following:
Designing, developing, and maintaining software solutions in Big Data Platforms;
Building end-to-end (E2E) data engineering solutions using Big Data technologies;
Understanding and utilizing the Hadoop ecosystem, including Cloudera;
Utilizing HDFS, MapReduce, Hive, Impala, Linux, and Unix technologies to monitor Hadoop cluster performance, debug issues causing performance degradation, and write queries to extract data for leadership analysis and the creation of Tableau dashboards;
Utilizing Unix shell scripting and programming in Python, Scala, and Java; and
Analyzing existing shell scripts and Python code to debug issues.
If interested, apply online at www.bankofamerica.com/careers or email your resume to bofajobs@bofa.com, referencing the job title and requisition number.
Salary: $160,000 - $177,900 per year.
EMPLOYER: Bank of America N.A.
Shift: 1st shift (United States of America)
Hours Per Week: 40