Senior Data Engineer

The Basics

Tanium is looking for a dynamic Senior Data Engineer to join our Enterprise Data team! This person will design and optimize scalable data pipelines, transforming raw data into insights that drive strategic decisions. They will drive innovation by automating workflows, boosting data quality, and reimagining processes to maximize efficiency. They will partner with cross-functional teams to build intuitive data models that empower teams with self-service analytics. Above all, this person will shape the future of Tanium's data ecosystem, tackling high-impact problems in a collaborative, fast-paced environment. Join us to turn data into a competitive advantage and leave your mark on our growing organization!

This is a hybrid position, which will require in-person attendance several days each week in one of the following locations: Durham, NC or Emeryville, CA.

What you'll do

- Design and maintain processes and databases for the Snowflake Data Warehouse.
- Design, maintain, configure, scale, and optimize data pipelines and related infrastructure, ensuring reliable, efficient extraction, transformation, and loading (ETL) of data from various sources (a minimal orchestration sketch in Python follows this list).
- Identify, design, and implement internal process improvements, including optimizing data delivery, automating manual processes, and improving governance and data quality.
- Implement and enhance the collection and calculation of key business performance metrics, and address data-related technical issues.
- Re-design and re-engineer existing data pipelines built on Airflow, SQS, and S3.
- Templatize repetitive and manual tasks using Terraform or GitHub Actions to streamline processes.
- Implement and optimize data transformation workflows using DBT (Data Build Tool).
- Collaborate with cross-functional teams to design semantic models and frameworks that enhance data accessibility and usability.
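As a rough illustration of the pipelines described above, the following is a minimal sketch of an Airflow DAG that stages a source extract in S3 and copies it into Snowflake. It is a sketch under assumptions rather than Tanium's actual implementation: the connection IDs, bucket, stage, and table names are hypothetical, and the extract step is stubbed so the example stays self-contained.

    # Minimal sketch only: connection IDs, bucket, stage, and table names are
    # hypothetical placeholders, not Tanium's actual configuration.
    import json
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator


    def extract_to_s3(ds, **_):
        """Extract a raw payload from a source system and stage it in S3.

        A real task would call the source API (e.g. Salesforce); here the
        payload is stubbed so the sketch stays self-contained.
        """
        records = [{"id": 1, "extracted_on": ds}]  # placeholder for a real extract
        key = f"raw/salesforce/accounts/{ds}.json"
        S3Hook(aws_conn_id="aws_default").load_string(
            string_data=json.dumps(records),
            key=key,
            bucket_name="example-data-lake",  # hypothetical bucket
            replace=True,
        )
        return key


    with DAG(
        dag_id="salesforce_accounts_to_snowflake",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)

        load = SnowflakeOperator(
            task_id="copy_into_snowflake",
            snowflake_conn_id="snowflake_default",
            # Assumes an external stage pointing at the same bucket already exists.
            sql="COPY INTO raw.salesforce_accounts FROM @raw_stage/salesforce/accounts/",
        )

        extract >> load

In practice, the extract step would call the source system's API and the COPY statement would reference an external stage provisioned through Terraform, but the overall shape (extract to S3, then load into Snowflake on a schedule) would be similar.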

Key responsibilities

Data Ingestion & Orchestration (40%)
- Extract, transform, and load data from various sources, including Salesforce, NetSuite ERP, SuccessFactors, Coupa, and other corporate applications.
- Build autonomous pipelines in Python and orchestrate them via Airflow to deliver up-to-date data to the business and key stakeholders.

Process Ownership (30%)
- Take ownership of all data flowing into the centralized enterprise data warehouse.
- Set up regular maintenance checks and error notifications, and proactively resolve data-related issues (a brief monitoring sketch follows this list).
- Ensure timely resolution of all data-related bugs and technical issues.

Process Improvement (20%)
- Modernize the solution footprint by migrating the legacy tech stack and bringing it up to date.
- Conduct research and make recommendations on products and services for rationalization and improvement.
- Work with business stakeholders and power users to understand their BI and analytics needs, define and implement the required data models, and facilitate the creation and maintenance of business-centric dashboards and performance metrics.

Mentorship and Camaraderie (10%)
- Mentor junior engineers and analysts, focusing on knowledge sharing and growth.
- Maintain a culture of transparency, accountability, and efficiency within the Business Systems organization and with all stakeholders.
- Share architecture decisions, formally and informally, with the team and stakeholders.
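The maintenance checks and error notifications mentioned above could take many forms. One simple approach, sketched below under assumptions, is a data-freshness check against Snowflake that raises an error when a table falls behind an assumed SLA, so an orchestrator callback (an Airflow on_failure_callback, for example) can turn the failure into an alert. The account, table, column, and threshold values are hypothetical.

    # Minimal sketch only: the account, user, warehouse, database, table, and SLA
    # below are hypothetical placeholders, not Tanium's actual configuration.
    from datetime import timedelta

    import snowflake.connector

    FRESHNESS_SLA = timedelta(hours=6)  # assumed SLA, for illustration only


    def check_table_freshness(conn, table, timestamp_column):
        """Raise if the newest row in `table` is older than the assumed SLA.

        An orchestrator wrapper (e.g. an Airflow task with an on_failure_callback)
        could turn the raised error into a Slack or PagerDuty notification.
        """
        cur = conn.cursor()
        try:
            cur.execute(
                f"SELECT DATEDIFF('minute', MAX({timestamp_column}), "
                f"CURRENT_TIMESTAMP()) FROM {table}"
            )
            (lag_minutes,) = cur.fetchone()
        finally:
            cur.close()

        if lag_minutes is None or lag_minutes > FRESHNESS_SLA.total_seconds() / 60:
            raise RuntimeError(
                f"{table} looks stale: last load was {lag_minutes} minutes ago"
            )


    if __name__ == "__main__":
        # Credentials would normally come from a secrets manager, not literals.
        connection = snowflake.connector.connect(
            account="example_account",   # hypothetical account identifier
            user="svc_data_quality",     # hypothetical service user
            password="********",
            warehouse="REPORTING_WH",    # hypothetical warehouse
            database="ENTERPRISE_DW",    # hypothetical database
        )
        try:
            check_table_freshness(connection, "raw.salesforce_accounts", "loaded_at")
        finally:
            connection.close()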

We’re looking for someone with

Education
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Experience
- 10+ years of experience in Data Engineering, BI Engineering, or related roles, specializing in data architecture, pipeline design, and data integration.
- Expertise in Python for building and maintaining data pipelines.
- Extensive experience with Snowflake for data warehousing, including performance tuning and database development.
- Strong AWS infrastructure experience, including tools such as S3, RDS, MWAA, and SQS, and the ability to configure and build AWS environments.
- Terraform expertise for infrastructure as code (IaC) and automation.
- Extensive experience with GitHub and GitHub Actions for CI/CD processes and task automation.
- Proficiency with DBT (Data Build Tool) for building and deploying data transformation workflows (a brief orchestration sketch follows this section).
- Solid experience with relational and NoSQL databases, with strong SQL development skills.
- Experience templatizing and automating repetitive tasks into reusable workflows using Terraform or GitHub Actions.

Bonus points for
- Experience designing and implementing semantic models.
- Experience implementing decentralized approaches such as Data Mesh.
- Experience with Power BI or other business intelligence tools.
- Familiarity with Kanban and Agile methodologies.

Why join us?
- Work on cutting-edge data technologies and exciting projects that make a real impact.
- Collaborate with a team of talented engineers and analysts in a supportive and innovative environment.
- Opportunities for professional growth and skill development in the latest data engineering tools and practices.
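To illustrate how the DBT transformation workflows and Airflow orchestration named in the qualifications above might fit together, here is a minimal, assumption-laden sketch that wraps a dbt build in an Airflow task via the BashOperator. The project directory, target, and model selector are hypothetical placeholders, not Tanium's actual setup.

    # Minimal sketch only: the project directory, target, and model selector are
    # hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dbt_transformations",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # `dbt build` runs models, tests, snapshots, and seeds in dependency
        # order, so transformations and data-quality checks ship together.
        dbt_build = BashOperator(
            task_id="dbt_build_finance_marts",
            bash_command=(
                "dbt build --project-dir /opt/dbt/enterprise_dw "
                "--target prod --select marts.finance"
            ),
        )

Using `dbt build` rather than `dbt run` also executes the project's tests, which pairs transformation with the data-quality emphasis described elsewhere in this posting.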

About Tanium 

Tanium delivers the industry's only true real-time cloud-based endpoint management and security offering. Its platform is real-time, seamless, and autonomous, allowing security-conscious organizations to break down silos between IT and Security operations, resulting in reduced complexity, cost, and risk. Securing more than 32M endpoints around the world, Tanium's customers include Fortune 100 organizations, top US retailers, top US commercial banks, and branches of the U.S. Military. It also partners with the world's biggest technology companies, system integrators, and managed service providers to help customers realize the full potential of their IT investments. Tanium has been named to the Forbes Cloud 100 list for nine consecutive years and ranks among Fortune's 100 Best Companies to Work For. For more information on The Power of Certainty™, visit www.tanium.com and follow us on LinkedIn and X.

On a mission. Together. 

At Tanium, we are stewards of a culture that emphasizes the importance of collaboration, respect, and diversity. In our pursuit of revolutionizing the way some of the largest enterprises and governments in the world solve their most difficult IT challenges, we are strengthened by our unique perspectives and by our collective actions.   

We are an organization with stakeholders around the world, and it's imperative that the diversity of our customers and communities is reflected internally in our team members. We strive to create a diverse and inclusive environment where everyone feels they have opportunities to succeed and grow, because we know that only together can we do great things.

Each of our team members has 5 days set aside as volunteer time off (VTO) to contribute to the communities they live in and give back to the causes they care about most.   

What you’ll get

The annual base salary range for this full-time position is $85,000 to $260,000. This range is an estimate of what Tanium will pay a new hire. The actual annual base salary offered may be adjusted based on a variety of factors, including but not limited to, location, education, skills, training, and experience.

In addition to an annual base salary, team members will receive equity awards and a generous benefits package consisting of medical, dental, and vision plans, family planning benefits, health savings account, flexible spending account, transportation savings account, 401(k) retirement savings plan with company match, life, accident, and disability coverage, business travel accident insurance, employee assistance programs, disability insurance, and other well-being benefits.

 

For more information on how Tanium processes your personal data, please see our Privacy Policy.

