As a Data Engineer, you'll build and maintain data pipelines, ensure data quality, and collaborate with teams to optimize data processes for analytics and ML applications.
Role: Data Engineer
Location: Egypt, Uzbekistan, and Pakistan (Remote)
Work Week: Sunday – Thursday
Work Timings: 9:00 AM – 6:00 PM (Saudi Arabian Time Zone)
Overview:
We’re seeking a Data Engineer to design, build, and maintain the data infrastructure that underpins our analytics, ML models, and decision-making processes. You’ll be responsible for building scalable data pipelines, integrating diverse data sources, and ensuring data quality, reliability, and accessibility across the organization. Working closely with data scientists, analysts, and product teams, you’ll enable data-driven insights while optimizing for performance and scalability. This is a great opportunity to have a direct impact on how data is leveraged across a fast-growing company.
Role & Responsibilities:
- Data Pipeline Development & Optimization:
  - Design, build, and maintain scalable and reliable data pipelines to support analytics, ML models, and business reporting.
  - Collaborate with data scientists and analysts to ensure data is available, clean, and optimized for downstream use.
  - Implement data quality checks, monitoring, and validation processes.
- Data Architecture & Integration:
  - Work with cross-functional teams to design efficient ETL/ELT workflows using modern data tools.
  - Integrate data from multiple sources (databases, APIs, third-party tools) into centralized storage solutions (data lakes/warehouses).
  - Support cloud-based infrastructure for data storage and retrieval.
- Performance & Scalability:
  - Monitor, troubleshoot, and optimize existing data pipelines to handle large-scale, real-time data flows.
  - Implement best practices for query optimization and cost-efficient data storage.
  - Ensure data is available and accessible for business-critical operations.
- Collaboration & Documentation:
  - Partner with product, engineering, and business stakeholders to understand data requirements.
  - Document data workflows, schemas, and best practices.
  - Support a culture of data reliability, governance, and security.
Requirements:
- Proficiency in Python and SQL for data engineering tasks.
- Strong understanding of ETL/ELT processes, data warehousing, and data modeling.
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) and data storage solutions (BigQuery, Redshift, Snowflake, etc.).
- Familiarity with data orchestration tools (Airflow, Airbyte) is a must.
- Experience with containerization & deployment tools (Docker, Kubernetes) is a plus.
- Knowledge of data governance, security, and best practices for handling sensitive data.
- Familiarity with Git and GitHub.
- Experience with Dataform is a must.
- Strong skills in eliciting requirements from cross-functional stakeholders and translating them into actionable data engineering tasks.
Experience:
- 2+ years in data engineering, building and maintaining data pipelines.
- 2+ years in SQL and Python development for production environments.
- Experience working in fast-growing startup environments is a plus.
- Exposure to real-time data processing frameworks (Kafka, Spark, Flink) is a plus.