WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services, and human resources, leveraging collaborative models tailored to the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.
Job Description
Software/Data Engineer with experience working in Databricks.

Core Responsibilities
- Build, enhance, and maintain data pipelines in Databricks notebooks using Python and SQL.
- Work with Delta Lake for structured and semi-structured data.
- Develop and automate Databricks Jobs/Workflows for scheduled processing.
- Transform and clean data across bronze/silver/gold layers, using dbt (data build tool) to model, test, and document datasets.
- Write modular, reusable SQL using Jinja templating in dbt.
- Use GitHub Copilot in VS Code to scaffold code, write boilerplate, and speed up routine tasks.
- Monitor, troubleshoot, and improve job reliability, performance, and cost efficiency.
- Collaborate with analysts and data scientists to deliver clean, accessible data.
- Follow version-control best practices using Git and Databricks Repos.

Required Skills
- 2–4 years in data engineering or analytics engineering roles.
- Strong Python (pandas, PySpark basics) and SQL.
- Hands-on experience with Databricks: notebooks, clusters, jobs, and Delta tables.
- Experience with dbt for transformations, including writing Jinja-based SQL macros.
- Comfortable working in VS Code (or a similar IDE) for development and version control.
- Experience using GitHub Copilot to support coding productivity, especially for boilerplate or repetitive tasks.
- Understanding of data lakehouse architecture (bronze/silver/gold layers).
- Familiarity with orchestration tools such as Databricks Workflows, ADF, or Airflow.
- Experience in a cloud environment (Azure, AWS, or GCP).
- Exposure to Unity Catalog or Delta Live Tables is a plus.
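For candidates unfamiliar with the bronze/silver/gold (medallion) layering mentioned above, the idea can be sketched in plain Python. This is a minimal, illustrative sketch only: in practice these steps would run as Databricks or dbt models over Delta tables, and the record shapes and field names (order_id, region, amount) are assumptions, not part of this role's actual schema.

```python
# Minimal sketch of bronze -> silver -> gold layering in plain Python.
# Bronze = raw ingested records; silver = cleaned and deduplicated;
# gold = aggregated, reporting-ready output.
from collections import defaultdict

# Bronze: raw ingested records, possibly duplicated or malformed.
bronze = [
    {"order_id": "1", "region": "EU ", "amount": "10.50"},
    {"order_id": "1", "region": "EU ", "amount": "10.50"},  # duplicate
    {"order_id": "2", "region": "US", "amount": "7.25"},
    {"order_id": "3", "region": "US", "amount": "bad"},     # malformed
]

def to_silver(records):
    """Clean, type-cast, and deduplicate bronze records."""
    seen, silver = set(), []
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # drop malformed rows
        if r["order_id"] in seen:
            continue  # drop duplicates
        seen.add(r["order_id"])
        silver.append(
            {"order_id": r["order_id"], "region": r["region"].strip(), "amount": amount}
        )
    return silver

def to_gold(records):
    """Aggregate silver records into a per-region revenue summary."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += r["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EU': 10.5, 'US': 7.25}
```

The same cleaning and aggregation logic would typically be expressed as dbt SQL models or PySpark transformations in a real Databricks pipeline; the layering concept is what carries over.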
Qualifications
Graduate or above