Pune
Product Solution Architect - AWS & Big Data (Python/Java/Scala)

We are looking for an experienced and motivated Product Solution Architect with deep expertise in designing and implementing cutting-edge big-data solutions on the Amazon Web Services (AWS) platform. As a key member of our dynamic team, you will play an integral role in driving the success of our data-driven initiatives, utilizing AWS services to build scalable, reliable, and high-performance big-data solutions.

Key Responsibilities:

Solution Design: Collaborate with cross-functional teams, including product managers, engineers, and data scientists, to understand business requirements and translate them into innovative big-data solutions on AWS.
Architecture Development: Design and implement robust, cost-effective, well-architected AWS-based big-data solutions that adhere to industry best practices, leveraging AWS services such as Amazon S3, EMR, Glue, Redshift, and DynamoDB.
Technology Evaluation: Stay current with the latest AWS services and big-data technologies; evaluate and recommend tools and technologies to enhance data processing, storage, and analytics capabilities.
Implementation & Deployment: Lead the technical implementation of big-data solutions on AWS, owning the full development lifecycle from prototyping to production deployment.
Performance Optimization: Identify and resolve performance bottlenecks to ensure big-data solutions are optimized for scalability, reliability, and cost-efficiency.
Security & Compliance: Ensure solutions comply with stringent security and compliance standards, safeguarding sensitive data and adhering to industry regulations.
Collaboration & Communication: Engage with stakeholders to understand their needs, provide technical guidance, and communicate complex concepts clearly to technical and non-technical audiences.
Troubleshooting & Support: Assist in troubleshooting technical issues related to big-data solutions, providing timely support to ensure system uptime and reliability.
AWS Ecosystem Expertise: Demonstrate strong familiarity with the AWS ecosystem, including services such as SageMaker, Glue, Athena, and Lambda, as well as SageMaker training and processing jobs, to streamline data workflows.

Technical Expertise:

Hands-on experience with tools and technologies such as Glue, the Glue Data Catalog, crawlers, Lambda, Airflow, IAM, S3, Athena, Redshift, Python, PySpark, SQL, and DynamoDB.
Strong understanding of data transformation techniques, version control (Git/Bitbucket), and data processing pipelines.
Provide technical leadership to the data engineering team, mentor junior engineers, and foster a collaborative, innovative environment.
Ensure high-quality data management across the entire data pipeline, including data quality, consistency, and accuracy.
Implement and enforce data security measures and compliance standards.
Maintain comprehensive documentation for data engineering processes, data models, and system architecture.

Basic Qualifications:

Experience: Proven experience as a Solution Architect or in a similar role, with a focus on big-data solutions on AWS.
AWS Expertise: Deep knowledge of AWS services relevant to big data, including but not limited to Amazon S3, EMR, Glue, SageMaker, Redshift, Athena, and DynamoDB.
Technical Skills: Proficiency in data modeling, data warehousing, and data integration concepts, along with hands-on experience with Python, Java, or Scala for big-data processing and analytics.
Big-Data Frameworks: Familiarity with big-data processing frameworks such as Apache Spark, Hadoop, and Apache Flink.
Security & Compliance: Solid understanding of data security, encryption, and best practices for compliance on AWS.
Problem Solving: Strong problem-solving skills with the ability to tackle complex technical challenges.
Communication Skills: Excellent communication and interpersonal skills, with the ability to work effectively with both technical and non-technical stakeholders.