Responsibilities
Data Pipeline Design and Implementation:
Architect, develop, and maintain real-time and batch data pipelines using technologies such as Apache Kafka, Snowflake, and Oracle Data Streams.
Implement Delta Sharing to facilitate secure and scalable data sharing across systems and organizations.
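For a concrete sense of this work, the consumer side of a Delta Sharing integration can be exercised with the open-source delta-sharing Python client. The minimal sketch below assumes a provider-issued profile file; the profile path and the share, schema, and table names are hypothetical placeholders.

    import delta_sharing

    # The data provider issues the profile file with endpoint and credentials.
    profile = "oracle_analytics.share"
    client = delta_sharing.SharingClient(profile)
    print(client.list_all_tables())  # discover what the provider has shared

    # Load one shared table into pandas for inspection or downstream use.
    table_url = f"{profile}#sales_share.curated.daily_orders"
    df = delta_sharing.load_as_pandas(table_url)
    print(df.head())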
Data Integration and Streaming:
Design robust integration solutions for ingesting and processing high-throughput data streams using Apache Kafka and similar technologies.
Build pipelines that integrate seamlessly with Snowflake, ensuring optimal performance for analytics and reporting workloads.
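In practice this often takes the shape of a micro-batching loop. The sketch below consumes JSON events with confluent-kafka and loads them through the Snowflake Python connector; the broker address, topic, table name, and credentials are all placeholders.

    import json

    import snowflake.connector
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "snowflake-loader",
        "auto.offset.reset": "earliest",
        "enable.auto.commit": False,   # commit only after a successful load
    })
    consumer.subscribe(["events"])

    conn = snowflake.connector.connect(
        account="my_account", user="loader", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )

    batch = []
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            batch.append(json.loads(msg.value()))
            if len(batch) >= 500:  # micro-batch to amortize insert overhead
                cur = conn.cursor()
                cur.executemany(
                    "INSERT INTO events_raw (id, payload) VALUES (%(id)s, %(payload)s)",
                    [{"id": r["id"], "payload": json.dumps(r)} for r in batch],
                )
                cur.close()
                consumer.commit(asynchronous=False)  # offsets advance only after the load
                batch.clear()
    finally:
        consumer.close()
        conn.close()

Committing offsets only after the Snowflake insert succeeds gives at-least-once delivery, which is the usual default for this kind of loader.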
Performance Optimization:
Monitor and optimize the performance of data streams and pipelines.
Implement strategies to reduce latency and improve throughput for real-time data processing.
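One common lever here is Kafka producer batching. The illustrative configuration below trades a few milliseconds of latency for substantially higher throughput; the values are starting points to tune against real traffic, not recommendations.

    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "localhost:9092",  # placeholder broker
        "linger.ms": 20,             # wait up to 20 ms to fill a batch
        "batch.size": 262144,        # accumulate larger 256 KiB batches
        "compression.type": "lz4",   # cheap CPU for a large wire-size reduction
        "acks": "1",                 # leader-only acks; use "all" when durability wins
    })

    def on_delivery(err, msg):
        if err is not None:
            print(f"delivery failed: {err}")

    for i in range(10_000):
        producer.produce("events", key=str(i), value=f'{{"n": {i}}}',
                         on_delivery=on_delivery)
        producer.poll(0)  # serve delivery callbacks without blocking

    producer.flush()  # drain outstanding messages before exit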
Troubleshooting and Support:
Diagnose and resolve complex data pipeline and integration issues, ensuring high availability and reliability of data flows.
Collaborate with DevOps and platform teams to maintain stable environments for data processing workloads.
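A frequent first diagnostic when a pipeline falls behind is a consumer-lag probe. The sketch below compares a group's committed offsets against broker high watermarks; the group id, topic, and partition count are assumed for illustration.

    from confluent_kafka import Consumer, TopicPartition

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "snowflake-loader",
        "enable.auto.commit": False,
    })

    partitions = [TopicPartition("events", p) for p in range(3)]
    for tp in consumer.committed(partitions, timeout=10):
        low, high = consumer.get_watermark_offsets(tp, timeout=10)
        committed = tp.offset if tp.offset >= 0 else low  # no commit yet: start of log
        print(f"partition {tp.partition}: lag={high - committed}")
    consumer.close()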
Collaboration and Innovation:
Work closely with data architects, engineers, and analysts to align on business requirements and deliver scalable solutions.
Research, evaluate, and adopt emerging tools and technologies to enhance the data plane infrastructure.
Data Governance and Security:
Ensure data security and compliance with industry standards while implementing data sharing solutions using Delta Sharing and Snowflake.
Maintain robust logging, monitoring, and alerting systems for all data pipelines.
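As a sketch of the monitoring side, the snippet below pairs structured logging with a simple latency-threshold check for one pipeline stage; the SLO value and the alert hook are illustrative placeholders.

    import logging
    import time

    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(name)s %(message)s")
    log = logging.getLogger("pipeline.events")

    LATENCY_SLO_SECONDS = 5.0  # hypothetical per-batch latency objective

    def process_batch(batch):
        start = time.monotonic()
        # ... transform and load the batch here ...
        elapsed = time.monotonic() - start
        log.info("batch processed size=%d elapsed=%.3fs", len(batch), elapsed)
        if elapsed > LATENCY_SLO_SECONDS:
            # In production this would page on-call or post to an incident channel.
            log.error("latency SLO breach: %.3fs > %.1fs",
                      elapsed, LATENCY_SLO_SECONDS)

    process_batch([{"id": 1}, {"id": 2}])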
Career Level - IC4