Hyderabad
Lead II - Data Science

Role Proficiency:

Independently provides expertise on data analysis techniques using software tools, streamlines business processes, and manages the team.

Outcomes:

• Managing and designing the reporting environment, including data sources, security, and metadata.
• Providing technical expertise on data storage structures, data mining, and data cleansing.
• Supporting the data warehouse in identifying and revising reporting requirements.
• Supporting initiatives for data integrity and normalization.
• Assessing, testing, and implementing new or upgraded software and assisting with strategic decisions on new systems.
• Synthesizing both quantitative and qualitative data into insights.
• Generating reports from single or multiple systems.
• Troubleshooting the reporting database environment and reports.
• Understanding business requirements and translating them into executable steps for team members.
• Identifying and recommending new ways to streamline business processes.
• Illustrating data graphically and translating complex findings into written text.
• Locating results to help clients make better decisions; getting feedback from clients and offering to build solutions based on that feedback.
• Reviewing the team's deliverables before sending final reports to stakeholders.
• Supporting cross-functional teams with data reports and insights on data.
• Training end users on new reports and dashboards.
• Setting FAST goals and providing feedback on the FAST goals of reportees.

Measures of Outcomes:

• Quality: number of review comments on code written.
• Accountability for data consistency and data quality.
• Number of medium to large custom application data models designed and implemented.
• Illustrating data graphically and translating complex findings into written text.
• Number of results located to help clients make informed decisions.
• Attention to detail and level of accuracy.
• Number of business processes changed due to vital analysis.
• Number of Business Intelligence dashboards developed.
• Number of productivity standards defined for the project.
• Managing team members and reviewing the tasks submitted by team members.
• Number of mandatory trainings completed.

Outputs Expected:

Determine Specific Data Needs:

Work with departmental managers to outline the specific data needs for each business method analysis project


Management and Strategy:

Oversees the activities of analyst personnel and ensures the efficient execution of their duties.


Critical business insights:

Mines the business’s database in search of critical business insights and communicates findings to the relevant departments.


Code:

Creates efficient and reusable SQL code for the improvement, manipulation, and analysis of data, and follows coding best practices.
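
As a purely illustrative sketch (in Python with SQLite; the orders table, its columns, and the helper name are hypothetical and not part of this posting), "efficient and reusable SQL code" generally means a parameterized query kept in one place rather than ad hoc string concatenation:

    import sqlite3

    # Reusable, parameterized query: named parameters avoid string concatenation
    # and make the statement safe to reuse and easy to test.
    MONTHLY_REVENUE_SQL = """
        SELECT strftime('%Y-%m', order_date) AS month,
               SUM(amount)                   AS revenue
        FROM   orders
        WHERE  order_date >= :start_date
        GROUP  BY month
        ORDER  BY month;
    """

    def monthly_revenue(conn: sqlite3.Connection, start_date: str) -> list[tuple]:
        """Return (month, revenue) rows from the hypothetical orders table."""
        return conn.execute(MONTHLY_REVENUE_SQL, {"start_date": start_date}).fetchall()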


Create/Validate Data Models:

Builds statistical models; diagnoses, validates, and improves the performance of these models over time.
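
As a hedged illustration of the diagnose-and-validate loop (scikit-learn on synthetic data; the metric and settings are assumptions, not requirements of the role), cross-validation is a common first check of a model's generalization, and re-running it on fresh data over time is one simple way to watch for degradation:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for project data.
    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    model = LogisticRegression(max_iter=1000)

    # 5-fold cross-validated ROC AUC as a basic diagnostic of model performance.
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"mean ROC AUC: {scores.mean():.3f} (+/- {scores.std():.3f})")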


Predictive analytics:

Seeks to determine likely outcomes by detecting tendencies in descriptive and diagnostic analysis


Prescriptive analytics:

Attempts to identify what business action to take


Code Versioning:

Organize and manage the changes and revisions to code. Use a version control tool such as Git, Bitbucket, etc.


Create Reports:

Create reports depicting the trends and behaviours from the analysed data
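
One minimal sketch of such a report, assuming a hypothetical orders.csv file with order_date and amount columns (pandas and matplotlib; none of the names come from the posting):

    import pandas as pd
    import matplotlib.pyplot as plt

    # Load the analysed data and aggregate it to a monthly trend.
    df = pd.read_csv("orders.csv", parse_dates=["order_date"])
    monthly = df.groupby(df["order_date"].dt.to_period("M"))["amount"].sum()

    # Plot the trend and save it for inclusion in the written report.
    ax = monthly.plot(kind="line", title="Monthly revenue trend")
    ax.set_xlabel("Month")
    ax.set_ylabel("Revenue")
    plt.tight_layout()
    plt.savefig("monthly_revenue_trend.png")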


Document:

Create documentation for own work as well as perform peer review of documentation of others' work


Manage knowledge:

Consume and contribute to project-related documents, SharePoint, libraries, and client universities.


Status Reporting:

Report the status of assigned tasks. Comply with project-related reporting standards/processes.

Skill Examples:

• Analytical Skills: Ability to work with large amounts of data: facts, figures, and number crunching.
• Communication Skills: Communicate effectively with a diverse population at various organizational levels, with the right level of detail.
• Critical Thinking: Data analysts must look at the numbers, trends, and data and come to new conclusions based on the findings.
• Presentation Skills: Reports and oral presentations to clients; strong meeting facilitation skills as well as presentation skills.
• Attention to Detail: Being vigilant in the analysis to come to correct conclusions.
• Mathematical Skills: To estimate numerical data.
• Work in a team environment; proactively ask for and offer help.

Knowledge Examples:


• Database languages such as SQL.
• Programming languages such as R or Python.
• Analytical tools and languages such as SAS and Mahout.
• Proficiency in MATLAB.
• Data visualization software such as Tableau, Qlik, or Power BI.
• Proficiency in mathematics and calculations.
• Spreadsheet tools such as Microsoft Excel or Google Sheets.
• DBMS.
• Operating systems and software platforms.
• Knowledge of the customer domain, as well as the sub-domain where the problem is solved.

Additional Comments:

UST is looking for a Lead II - Data Analysis with the below requirements:

• Bachelor's degree (BA/BS) in a related field such as information systems, mathematics, or computer science.
• Typically has 6 years of relevant work experience; consideration given to an equivalent combination of education and experience.
• Excellent written and verbal communication skills. Strong organizational and analytical skills.
• Expertise in Data Extraction, Transformation, Loading, Data Analysis, Data Profiling, and SQL Tuning.
• Expertise in Relational and Dimensional Databases in engines like SQL Server, Postgres, Oracle…
• Strong experience in designing and developing enterprise-scale data warehouse systems using Snowflake.
• Strong expertise in designing and developing reusable and scalable data products with data quality scores and integrity checks.
• Strong expertise in developing end-to-end complex data workflows using data ingestion tools such as SnapLogic, ADF, Matillion, etc.
• Experience with cloud platforms (AWS/Azure cloud technologies), Agile methodologies, and DevOps is a big plus.
• Experience in architecting cloud-native solutions across multiple B2B and B2B2C data domains.
• Experience in the architecture of modern APIs for the secure sharing of data across internal application components as well as external technology partners.
• Experience with data orchestration tools like Apache Airflow, Chronos with a Mesos cluster, etc.
• Expertise in designing and developing data transformation models in dbt.
• Comparing and analyzing provided statistical information to identify patterns, relationships, and problems, and using this information to design conceptual and logical data models and flowcharts to present to management.
• Experience with developing CI/CD pipelines in Jenkins or Azure DevOps.
• Knowledge of Python for data manipulation and automation.
• Knowledge of data governance frameworks and best practices.
• Knowledge of integrating with source code versioning tools like GitHub.

Responsibilities:

• Plan, analyze, develop, maintain, and enhance client systems, and support systems of moderate to high complexity.
• Participate in the design, specification, implementation, and maintenance of systems.
• Design, code, test, and document software programs of moderate complexity as per the requirement specifications.
• Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools.
• Participate in design reviews and technical briefings for specific applications.
• Integrate data from various sources, ensuring consistency, accuracy, and reliability.
• Develop and manage ETL/ELT processes to support data warehousing and analytics.
• Assist in the preparation of requirement specifications; analyze the data; design and develop data-driven applications, including documenting and revising user procedures and/or manuals.
• Resolve medium- to high-complexity software development issues that may arise in a production environment.
• Utilize Python for data manipulation, automation, and integration tasks.
• Assemble large, complex data sets that meet functional and non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL Server, PostgreSQL, SSIS, T-SQL, and PL/SQL.
• Work with stakeholders, including the Product, Data, Design, Frontend, and Backend teams, to assist with data-related technical issues and support their data infrastructure needs.
• Write complex SQL, T-SQL, and PL/SQL queries, stored procedures, functions, and cursors in SQL Server and PostgreSQL. Peer review other team members' code.
• Analyze long-running queries, functions, and procedures, and design and develop a performance optimization strategy.
• Create and manage SSIS packages and/or Informatica to perform day-to-day ETL activities. Use a variety of strategies for complex data transformations with an ETL tool.
• Perform DBA activities such as maintaining system health and performance tuning, managing database access, deployments to higher environments, and on-call support; shell scripting and Python scripting are a plus.
• Participate in employing Continuous Delivery and Continuous Deployment (CI/CD) tools for optimal productivity.
• Collaborate with scrum team members during the daily standup and actively engage in sprint refinement, planning, review, and retrospective.
• Analyze, review, and alter programs to increase operating efficiency or adapt to new requirements.
• Write documentation describing program development, logic, coding, and corrections.
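
To make the data manipulation and ETL responsibilities above concrete, here is a minimal extract-transform-load sketch in Python/pandas; the database file, table names, and columns are hypothetical and are not taken from this posting:

    import sqlite3

    import pandas as pd

    with sqlite3.connect("warehouse.db") as conn:
        # Extract: pull raw records with a plain SQL query.
        raw = pd.read_sql("SELECT order_id, order_date, amount FROM raw_orders", conn)

        # Transform: enforce types, drop duplicates, and fill simple quality gaps.
        raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
        clean = (
            raw.dropna(subset=["order_date"])
               .drop_duplicates(subset=["order_id"])
               .assign(amount=lambda d: d["amount"].fillna(0))
        )

        # Load: write the curated table back for downstream reporting.
        clean.to_sql("clean_orders", conn, if_exists="replace", index=False)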
