East Lansing, MI, US
Sr. Data Pipeline Engineer
Vertafore is looking for talented people to join our team in Michigan. Our dynamic environment provides professional development, fast upward mobility, and exposure to the latest and greatest in technology. Vertafore is a leading technology company whose innovative software solutions are advancing the insurance industry. Our suite of products provides solutions that help our customers better manage their business, boost their productivity and efficiency, and lower costs while strengthening relationships.

Our mission is to move InsurTech forward by putting people at the heart of the industry. We are leading the way with product innovation, technology partnerships, and a focus on customer success. Our fast-paced and collaborative environment inspires us to create, think, and challenge each other in ways that make our solutions and our teams better.

We are headquartered in Denver, Colorado, with offices across the U.S., including East Lansing, Michigan, minutes from Michigan State University, Lansing Community College, and Cooley Law School! Vertafore is a Flexible First working environment, which allows team members to work from home as often as they'd like while using our offices as a place for collaboration, community, and team building. You may at times be asked to come into an office and/or travel for specific meetings with a specific business purpose; this varies by job responsibilities.

JOB DESCRIPTION

Operate, manage, and maintain data pipelines that draw on multiple data sources and reporting tools such as Oracle, PostgreSQL, and Pentaho. Implement and maintain reporting and pipeline infrastructure in AWS using automation tools such as Terraform, Ansible, and Chef. Perform operational monitoring and maintenance of that infrastructure and its software platforms. Serve as the liaison and collaborator with development and data operations teams. Write automated tests to continuously validate data pipeline functionality.
Core Responsibilities:
Essential job functions include but are not limited to the following:
· Provisions and manages data pipeline infrastructure using technologies such as Terraform and Chef
· Monitors and troubleshoots issues within the data pipeline infrastructure using tools like Dynatrace and Kibana
· Drives and manages design conversations for features, considering business needs
· Develops new features and supports/bug-fixes existing features
· Ensures the quality of code written or reviewed within the team
· Follows industry trends and the open-source community
· Interacts with customers to gather insights and translate technical concepts

Knowledge, Skills, and Abilities:
· Advanced communication and interpersonal skills; able to defuse tension and drive productive conversations, internally and externally
· Strategic problem solver with excellent analytical and critical thinking skills
· An innate curiosity about how things work; proactively acquires new skills and learns new tools and technologies to troubleshoot complex issues
· Comfortable and effective at making decisions across the product that help long-term maintainability, reuse, security, and performance
· Applies knowledge gained and is effective at knowledge-sharing and assisting other team members
· Able to work with global, distributed teams
· In tune with high-performance and high-availability design/programming
· Highly proficient with multiple database technologies
· Strong in both design and implementation at an enterprise level
· Communicates clearly to explain and defend design decisions
· Desire to collaborate with the business, designing pragmatic solutions

Qualifications:
· Degree in computer science, engineering, or a related field required
· Minimum 3 years of professional experience with DevOps and data pipeline technologies
· Experience working in an Agile environment required
· 3+ years of experience in security best practices for software and data platforms required
· 3+ years of experience with design, architecture, and implementation at an enterprise level required
· 3+ years of experience with database technologies (e.g., Oracle, SQL Server, PostgreSQL, Cassandra, etc.)
· Experience with the following technologies: Ansible, Chef, Git, Java frameworks, Pentaho, Terraform, xDB

Additional Requirements and Details:
· Travel required up to 5% of the time
· WFH flexible
· Occasional lifting and/or moving up to 10 pounds
· Frequent repetitive hand and arm movements required to operate a computer
· Specific vision abilities required by this job include close vision (working on a computer, etc.)
· Frequent sitting and/or standing
· #LI-Hybrid
· $90,000 - $130,000 / year