Hadoop Administrator with Cloudera
Kforce
Kforce has a client that is seeking a Hadoop Administrator with Cloudera in Birmingham, AL.
Summary:
The Big Data Design Engineer is responsible for architecture design and implementation of the Big Data platform, Extract/Transform/Load (ETL), and analytic applications. The team is currently migrating to Snowflake. This person will maintain the existing data lake during the migration.
Primary Responsibilities:
* Oversees implementation and ongoing administration of Hadoop infrastructure and systems
* Manages Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, Hadoop Distributed File System (HDFS), Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.
* Analyzes latest Big Data analytic technologies and innovative applications in both business intelligence analysis and new offerings
* Aligns with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
* Handles cluster maintenance and creation/removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise
* Handles performance tuning of Hadoop clusters and Hadoop MapReduce routines
* Screens Hadoop cluster job performance and handles capacity planning
* Monitors Hadoop cluster connectivity and security
* Manages and reviews Hadoop log files
* Handles HDFS and file system management, maintenance, and monitoring
* Partners with infrastructure, network, database, application, and business intelligence teams to ensure high data quality and availability
* Collaborates with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
* Acts as the point of contact for vendor escalations