Work at Home, El Salvador
Data Engineer

Job Title:

Data Engineer

Job Description

Job Description Summary
We are seeking a skilled and detail-oriented Data Engineer to join our team. The Data Engineer will be responsible for designing, building, and maintaining efficient data pipelines and architectures that enable advanced data analytics and business intelligence. The ideal candidate will have hands-on experience with data integration, data warehousing, and cloud-based platforms, as well as a passion for optimizing data systems to drive data-driven decision-making.

Key Responsibilities:

Data Pipeline Development:
Design, build, and manage scalable, reliable, and efficient ETL/ELT data pipelines to ingest, process, and store large datasets from various data sources.

Data Integration:
Integrate data from different sources such as APIs, relational databases, NoSQL databases, flat files, and streaming data into centralized data lakes or data warehouses.

Database Management & Optimization:
Implement and maintain data storage solutions such as relational databases (e.g., PostgreSQL, MySQL), NoSQL databases (e.g., MongoDB, Cassandra), and cloud-based data warehouses (e.g., Amazon Redshift, Google BigQuery).

Data Quality & Validation:
Ensure data quality and integrity through the design and implementation of data validation, cleansing, and enrichment processes. Identify and address data discrepancies and inconsistencies.

Collaboration with Teams:
Collaborate with data scientists, analysts, and software engineers to understand data requirements, and deliver data solutions that meet analytical and operational needs.

Performance Tuning:
Optimize data pipelines and data storage systems for maximum efficiency, scalability, and performance. Proactively monitor system performance and troubleshoot issues.

Data Governance & Security:
Work closely with the data governance and security teams to ensure that data solutions comply with organizational standards, privacy laws (e.g., GDPR, CCPA), and security policies.

Cloud Data Engineering:
Design and manage data workflows in cloud environments such as AWS, Azure, or Google Cloud, utilizing cloud-native services for data processing and storage.

Automation & Scheduling:
Automate data workflows and implement scheduling tools (e.g., Airflow, Cron) to ensure timely data delivery for reports, dashboards, and analytics.

Documentation:
Create and maintain technical documentation for data pipelines, data models, and data management processes to ensure knowledge sharing and reproducibility.

Requirements:

Education:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Technology, or a related field.

Experience:

Minimum of 3-5 years of experience in data engineering, ETL development, or database management.

Experience with cloud platforms and cloud-native data processing tools.

Skills & Expertise:

Proficiency with SQL for querying and manipulating data.

Experience with data integration tools (e.g., Apache Airflow, Talend, Informatica) and ETL/ELT processes.

Strong programming skills in Python, Java, or Scala for data processing.

Knowledge of big data technologies (e.g., Hadoop, Spark, Kafka).

Experience with data warehousing solutions (e.g., Snowflake, Amazon Redshift).

Location:

CRI Work-at-Home

Language Requirements:

Time Type:

Full time

If you are a California resident, by submitting your information, you acknowledge that you have read and have access to the Job Applicant Privacy Notice for California Residents.
