Senior Technical Specialist
Datavail
Job Title: Senior Technical Specialist
Education: B.E/B.Tech/MCA/B.Sc
Experience: 10+ Years
Location: Bangalore/Hyderabad/Mumbai
Key Skills: SQL, PL/SQL, T-SQL, AWS DMS, AWS Glue, AWS DynamoDB, Snowflake, DBT, Fivetran
Required Skills:
- 10+ years of proven experience working with various ETL / data warehouse tools.
- Extensive back-end development experience, ideally for customer-facing products.
- Serves as a subject matter expert for AWS, Snowflake, and Fivetran.
- Proficient in end-to-end system profiling, performance tuning, and optimization.
- Strong database knowledge: Oracle PL/SQL and query tuning; good knowledge of UNIX scripting.
- Strong knowledge of SCD1 and SCD2 logic implementation using ETL tools, SQL (T-SQL, PL/SQL), and Spark SQL.
- Design and build BI delivery components such as data marts and data warehouses.
- Design and develop solutions that meet customer functional and non-functional requirements.
- Define conceptual high-level and low-level data models to address customer analytical needs.
- Skilled in writing well-structured, efficient Python code; adept at designing, building, maintaining, testing, and deploying software.
- Expert in Python, with knowledge of at least one Python web framework such as Django or Flask.
- Knowledge of pg8000, boto3, scikit-learn, matplotlib, and seaborn is an added advantage.
- Good experience using pandas DataFrames, Spark DataFrames, NumPy, and Matplotlib.
- Proficient understanding of code versioning tools such as Git, Mercurial, or SVN.
- At least 3 full years of recent Snowflake development experience.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and DBT.
- Able to administer and monitor the Snowflake computing platform.
- Hands-on experience loading data into and managing cloud databases.
- Evaluate Snowflake design considerations for any change in the application.
- Build the logical and physical data models for Snowflake as changes require.
- Define roles and privileges required to access different database objects.
- Define virtual warehouse sizing in Snowflake for different types of workloads.
- Strong experience designing and implementing ETL pipelines using AWS Glue and DMS to read and write data across DynamoDB, PostgreSQL, and Snowflake.
- Good knowledge of AWS services such as S3 and IAM.
- Experienced in cloud environment architectures using technologies such as AWS Lambda, Glue, S3, DMS, DynamoDB, PostgreSQL, Snowflake, DBT, Python, and Fivetran.
- Hands-on experience with Fivetran data pipelines and relational databases, including Oracle, PostgreSQL, SQL Server, or MySQL.
- Good to have: AWS Certified Solutions Architect – Professional or SnowPro Advanced certification.