Markham, ON, CA
5 days ago
Data Tester
Position Description:

This role is hybrid and requires you to be in our Markham office 2-3 times per week, subject to change at any time.

Your future duties and responsibilities:

• Design, build and operationalize large scale enterprise data solutions in Hadoop, Postgres, and Snowflake.
• Demonstrate a strong understanding of AWS cloud services, especially in the data engineering and analytics space.
• Analyze, re-architect and re-platform on-premises big data platforms.
• Parse unstructured and semi-structured data (e.g., JSON, XML) using Informatica Data Processor.
• Analyze Informatica PowerCenter jobs and redesign and redevelop them in BDM.
• Craft and develop solution designs for data acquisition/ingestion of multifaceted data sets (internal and external), data integrations, and data warehouses/marts.
• Collaborate with business partners, product owners, functional specialists, business analysts, IT architects, and developers to create solution designs that adhere to architecture standards.
• Ensure that solutions adhere to enterprise data governance and design standards.
• Act as a point of contact for delivery teams to resolve architectural, technical, and solution-related challenges efficiently.
• Design and develop ETL pipelines to ingest data into Hadoop from various sources (files, mainframe, relational databases, NoSQL, etc.) using Informatica BDM.
• Work with Hadoop administrators and Postgres DBAs to partition Hive tables, refresh metadata, and perform other activities that enhance the performance of data loading and extraction.
• Tune the performance of ETL mappings and queries.
• Advocate the importance of data catalogs, data governance, and data quality practices.
• Apply outstanding problem-solving skills.
• Work in an Agile delivery framework to evolve data models and solution designs to deliver value incrementally.
• You are a self-starter with experience working in a fast-paced agile development environment.
• Mentor and coach junior team members, leading by example.
• Remain outcome-focused, using strong decision-making and critical-thinking skills to challenge the status quo, improve delivery pace and performance, and strive for efficiency.

Required qualifications to be successful in this role:

• University degree in Computer Engineering or Computer Science.
• 3+ years of experience crafting solutions for data lakes, data integrations, data warehouses/marts.
• 3+ years of experience working on Hadoop Platform, writing hive or impala queries.
• 3+ years of experience working on relational databases (Oracle, Teradata, PostgreSQL etc.) and writing SQL queries.
• Solid grasp of, and experience with, data technologies and tools (Hadoop, PostgreSQL, Informatica, etc.).
• Experience with the various execution modes in BDM, such as Spark, Hive, and Native.
• Deep knowledge of performance tuning for ETL jobs, Hadoop jobs, and SQL queries, including partitioning, indexing, and other techniques.
• Experience in writing Shell scripts.
• Experience developing Spark jobs (Python or Scala) is an asset.
• Outstanding knowledge of, and experience in, ETL with the Informatica product suite.
• Knowledge of and experience in cloud data lake design, preferably with AWS technologies such as S3, EMR, Redshift, Snowflake, and data catalogs.
• Experience implementing Data Governance principles and efficiencies.
• Understanding of reporting/analytics tools (QlikSense, SAP Business Objects, SAS, Dataiku, etc.).
• Familiarity with Agile software development.
• Excellent verbal and written communication skills.
• Insurance knowledge is an asset, as is the ability to understand the complex business processes driving technical systems.

Disclaimer: Use of the term ‘engineering’ in this job posting refers to the technical sense related to Information Technology (IT) and does not imply that the individual practices engineering or possesses the requisite license as prescribed by the applicable provincial or territorial engineering regulator. We are seeking individuals with expertise in IT engineering-related functions, but licensure from an engineering regulator is not a prerequisite for this position. Engineering is a regulated profession in Canada which is restricted in terms of use of titles and designation.

Skills: Data Warehousing, Hadoop, Hive, Informatica, Oracle, PostgreSQL, SQL, Teradata, Insurance

What you can expect from us:

Together, as owners, let’s turn meaningful insights into action.

Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because…

You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction.

Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.

You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

At CGI, we recognize the richness that diversity brings. We strive to create a work culture where all belong and collaborate with clients in building more inclusive communities. As an equal-opportunity employer, we want to empower all our members to succeed and grow. If you require an accommodation at any point during the recruitment process, please let us know. We will be happy to assist.

Come join our team—one of the largest IT and business consulting services firms in the world.
