BENGALURU, KARNATAKA, India
2 days ago
Senior Data Engineer - Python, Databricks

Databricks and Python Engineer

About Oracle FSGIU - Finergy:

The Finergy division within Oracle FSGIU focuses exclusively on the Banking, Financial Services, and Insurance (BFSI) sector, offering deep domain knowledge to address complex financial needs. Its strengths include:

- Industry Expertise: Deep domain knowledge across the BFSI sector.
- Accelerated Implementation: Proven methodologies that fast-track the deployment of multi-channel delivery platforms, minimizing IT intervention and reducing time to market.
- Personalization: Tools that tailor customer experiences, which have earned Finergy loyal customers for over a decade.
- End-to-End Banking Solutions: A single platform for a wide range of banking services (trade, treasury, cash management), enhancing operational efficiency with integrated dashboards and analytics.
- Expert Consulting Services: Comprehensive consulting support, from strategy development to solution implementation, ensuring the alignment of technology with business goals.


Job Responsibilities

1. Software Development:
   - Design, develop, test, and deploy high-performance, scalable data solutions using Python, PySpark, and SQL.
   - Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
   - Implement efficient and maintainable code using best practices and coding standards.

2. Databricks Platform:
   - Work with the Databricks platform for big data processing and analytics.
   - Develop and maintain ETL processes using Databricks notebooks.
   - Implement and optimize data pipelines for data transformation and integration.

3. Continuous Learning:
   - Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
   - Share knowledge with the team and contribute to a culture of continuous improvement.

4. SQL Database Management:
   - Use SQL expertise to design, optimize, and maintain relational databases.
   - Write complex SQL queries for data retrieval, manipulation, and analysis.
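As a rough illustration of the kind of ETL work described above, the sketch below builds a small extract-transform-load step. The table names, columns, and aggregation are hypothetical, and Python's built-in sqlite3 stands in for the Databricks SQL layer; a production pipeline on Databricks would typically use PySpark DataFrames and notebooks instead.

```python
import sqlite3

# Hypothetical ETL sketch: extract raw transaction rows, transform them
# (aggregate per account), and load the result into a summary table.
# sqlite3 is used here only as a lightweight, self-contained stand-in.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw transactions (account_id, amount) -- sample data
cur.execute("CREATE TABLE transactions (account_id TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("A1", 100.0), ("A1", -25.0), ("A2", 300.0)],
)

# Transform + Load: aggregate balances and counts into a summary table
cur.execute(
    """
    CREATE TABLE account_summary AS
    SELECT account_id, SUM(amount) AS balance, COUNT(*) AS txn_count
    FROM transactions
    GROUP BY account_id
    """
)

summary = cur.execute(
    "SELECT * FROM account_summary ORDER BY account_id"
).fetchall()
print(summary)  # [('A1', 75.0, 2), ('A2', 300.0, 1)]
conn.close()
```

The same group-and-aggregate shape maps directly onto a PySpark `groupBy(...).agg(...)` call when the pipeline runs on Databricks.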

Mandatory Skills:

- 4 to 8 years of experience with Databricks and big data frameworks
- Advanced proficiency in AWS, including EC2, S3, and container orchestration (Docker, Kubernetes)
- Proficiency in AWS services and data migration
- Experience with Unity Catalog
- Familiarity with batch and real-time processing
- Strong data engineering skills in Python, PySpark, and SQL

Career Level - IC2
