Must Have Skills (Top 3 technical skills only)*:
Spark, AWS, Python

Nice to have skills (Top 2 only):
Spark, AWS

Detailed Job Description:
Data Engineer with expertise in Python and Big Data technologies such as Spark, Hive, and Presto.
- Experience with AWS services: S3, EC2, EMR, Lambda Functions, and Step Functions
- Experience with both SQL and NoSQL databases
- Proficient in writing Spark jobs in Python and Scala (an illustrative sketch follows below)
- Developing Hive UDFs and Hive jobs
- Proven hands-on Software Development experience
- Experience with test-driven development
- Exposure to CI/CD processes using Maven and Jenkins; familiarity with Git
- Exposure to Scrum Agile framework
- Preferred exper
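The sketch below is a minimal, illustrative example of the kind of PySpark job this role involves: reading data from S3, aggregating it, and writing results back. The bucket paths, column names, and job name are hypothetical placeholders, not part of this requisition.

```python
# Illustrative only: a minimal PySpark batch job. All paths, columns,
# and names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def main():
    spark = (
        SparkSession.builder
        .appName("daily-event-aggregation")  # hypothetical job name
        .getOrCreate()
    )

    # Read raw event data from S3 (hypothetical bucket/prefix).
    events = spark.read.parquet("s3://example-bucket/raw/events/")

    # Aggregate event counts per user per day.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("user_id", "event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Write results back to S3, partitioned by date (hypothetical output path).
    (
        daily_counts.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-bucket/curated/daily_event_counts/")
    )

    spark.stop()


if __name__ == "__main__":
    main()
```

On AWS, a job like this would typically be submitted to an EMR cluster with spark-submit or orchestrated via Step Functions, consistent with the services listed above.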
Minimum years of experience*: 5+
Certifications Needed: Yes
Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
1. Exposure to Scrum Agile framework
2. Flexibility and ability to work in an onshore/offshore model involving multiple agile teams
3. Proven hands-on Software Development experience

Interview Process (Is face to face required?): Yes