Hyderabad, Telangana
Senior Software Engineer - Kafka, GCP

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together. 

Primary Responsibilities:

Manage the hardware and software for Kafka and its ecosystem components
Work with application teams to gather future requirements, plan the growth of the infrastructure, and expand as needed
Implement Disaster Recovery (DR) for Kafka
Build automation capabilities using tools like Terraform, Ansible, Git runner, Jenkins, etc.
Implement Kafka security (Kerberos, ACLs, SSL, SASL, SCRAM, etc.); a minimal client-side security configuration sketch follows this list
Research and implement new capabilities for Enterprise Messaging Services requirements
Understand implications of data upstream and downstream
Design, build, assemble, and configure application or technical architecture components using business requirements
Establish best-practice standards for configuring source and sink connectors
Demonstrate a product mindset with an ability to set forward-thinking direction
Synthesize large amounts of complex data into meaningful conclusions and present recommendations
Maintain a positive attitude while working with high demands and short deadlines that may lead to working after hours
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
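As a hedged illustration of the client-side security configuration mentioned in the list above, the sketch below shows a Python producer connecting over SASL_SSL with SCRAM authentication using the confluent-kafka library. The broker address, credentials, certificate path, and topic name are placeholders, not details from this posting.

  # Minimal sketch, assuming the confluent-kafka Python client and a SCRAM-enabled cluster.
  # Broker, credentials, and topic are hypothetical placeholders.
  from confluent_kafka import Producer

  producer = Producer({
      "bootstrap.servers": "broker-1.example.com:9093",  # placeholder broker
      "security.protocol": "SASL_SSL",                   # encrypt and authenticate the connection
      "sasl.mechanism": "SCRAM-SHA-512",                 # one of the SASL mechanisms named above
      "sasl.username": "app-user",                       # placeholder credentials
      "sasl.password": "app-secret",
      "ssl.ca.location": "/etc/kafka/ca.pem",            # CA used to verify the broker certificate
  })

  producer.produce("example-topic", key="k", value="hello")  # placeholder topic
  producer.flush()  # block until the message is delivered or fails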

Required Qualifications:

Undergraduate degree or equivalent experience
Hands-on experience with Kafka clusters hosted in GCP and on an on-prem Kubernetes (K8s) platform
Hands-on experience standing up and administering a Kafka platform from scratch, including backup and mirroring of Kafka cluster brokers, broker sizing, topic sizing, hardware sizing, performance monitoring, broker security, topic security, and consumer/producer access management (ACLs); see the provisioning sketch after this list
Experience with Kafka clusters, Cassandra, Kubernetes, and Terraform or Helm charts
Experience with Linux (RHEL)/Unix
Experience building Kafka pipelines using Terraform, Ansible, CloudFormation templates, shell scripts, etc.
Experience implementing security and authorization (permission-based) on a Kafka cluster
Experience as a system administrator setting up the Kafka platform, including provisioning, access lists, Kerberos, and SSL configurations
Experience setting standards to automate deployments using Kubernetes, Docker, and Jenkins
Experience with open-source and Confluent Kafka, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center
Experience with Kafka MirrorMaker or Confluent Replicator
Experience in high-availability cluster setup, maintenance, and ongoing 24/7 support
Kafka admin experience managing critical 24/7 applications and optimizing Kafka for variable workloads
Knowledge of the Kafka API (development experience is a plus)
Knowledge of best practices related to security, performance, and disaster recovery
Thorough understanding of Kafka producer/consumer/topic technologies and the ability to drive implementation of the technology
Proven ability to concentrate on a wide range of loosely defined, complex situations that require creativity and originality, where guidance and counsel may be unavailable
Proven excellent communication and interpersonal skills
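The short sketch below, assuming the confluent-kafka Python AdminClient, illustrates the kind of topic provisioning (partition and replication sizing) referenced above. The cluster address, topic name, and sizing values are illustrative only; real values would come from capacity planning with application teams.

  # Minimal provisioning sketch, assuming the confluent-kafka Python AdminClient.
  # Cluster address, topic name, and sizing values are hypothetical.
  from confluent_kafka.admin import AdminClient, NewTopic

  admin = AdminClient({"bootstrap.servers": "broker-1.example.com:9092"})  # placeholder cluster

  topic = NewTopic(
      "orders",                              # placeholder topic name
      num_partitions=12,                     # example partition count from capacity planning
      replication_factor=3,                  # example replication factor
      config={"retention.ms": "604800000"},  # example retention: 7 days
  )

  futures = admin.create_topics([topic])
  for name, future in futures.items():
      future.result()  # raises if creation failed
      print(f"created topic {name}")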


Preferred Qualifications:

Experience setting up Prometheus, Grafana, or ELK monitoring tools; a minimal metrics-export sketch follows this list
Experience as a Linux (RHEL)/Unix administrator
Experience with PostgreSQL, SQL Server, NoSQL (HBase), Oracle, and GCP/Azure Cloud
Experience with any RDBMS, NoSQL technologies, and GCP/Azure Cloud
Understanding of or experience with programming languages such as Python
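As a hedged sketch of the Prometheus/Grafana-style monitoring mentioned above, the example below exposes a consumer-lag gauge over HTTP using the prometheus_client library. The metric name, port, and the read_consumer_lag() helper are hypothetical stand-ins for whatever the real pipeline would measure.

  # Minimal monitoring sketch, assuming the prometheus_client library.
  # The metric name, port, and read_consumer_lag() helper are hypothetical.
  import time
  from prometheus_client import Gauge, start_http_server

  lag_gauge = Gauge("kafka_consumer_group_lag", "Consumer group lag in messages", ["group", "topic"])

  def read_consumer_lag():
      # Placeholder: a real exporter would compare committed offsets with end offsets
      # (e.g. via the Kafka admin/consumer APIs) for each group and topic.
      return {("payments-app", "orders"): 42}

  if __name__ == "__main__":
      start_http_server(8000)  # scrape endpoint for Prometheus, example port
      while True:
          for (group, topic), lag in read_consumer_lag().items():
              lag_gauge.labels(group=group, topic=topic).set(lag)
          time.sleep(30)  # example refresh interval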


At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
