Job Description:
We are looking for a skilled Data Engineer to design, build, and maintain scalable data infrastructure. You will work closely with data scientists, analysts, and software engineers to ensure reliable data pipelines and clean, accessible datasets. Your work will directly support data-driven decision-making across the organization.
Key Responsibilities:
Design, develop, and manage ETL/ELT pipelines to ingest, transform, and load data from various sources
Develop and maintain data warehouses and data lakes
Ensure data quality, governance, and consistency across platforms
Collaborate with stakeholders to gather data requirements and deliver business-ready datasets
Optimize data processing performance and storage costs
Implement data security and compliance best practices (e.g., GDPR, HIPAA)
Support real-time data streaming and batch processing pipelines
Automate workflows and monitoring for data processes
Work with tools such as Apache Airflow, dbt, Spark, and Kafka
Qualifications:
Required:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
2–5+ years of experience in data engineering or related roles
Proficiency in SQL and Python
Experience with data warehouses (e.g., Snowflake, Redshift, BigQuery)
Familiarity with cloud platforms (AWS, Azure, or GCP)
Strong knowledge of ETL frameworks and data modeling (dimensional, star schema, etc.)
- Openings: 1
- Job type: Full-time
- Wage: $50,000.00–$60,000.00 per year
- Experience: 1 year to less than 2 years
- Education: Bachelor's degree