Job Title: Data / Platform Engineer (AWS & Data Pipelines)
Type: Contract, per-project
Location: Remote, within LATAM (ET ±1 hour overlap)
Availability: Full-time contractor (40 hours per week)
We are looking for a highly motivated Data / Platform Engineer to join our team and help design, build and operate scalable data pipelines and cloud-based solutions.
In this role, you will work closely with engineering and product teams to build both streaming and batch data pipelines, contribute to system design, and help drive automation and monitoring across our data platform.
Key Responsibilities
Design, build and operate scalable streaming and batch data pipelines, with a strong focus on monitoring, troubleshooting and continuous improvement of existing pipelines.
Work with AWS services, including Redshift, EMR and ECS, to support data processing and analytics workloads.
Develop and maintain data workflows using Python and SQL.
Orchestrate and monitor pipelines using Apache Airflow.
Build and deploy containerized applications using Docker and Kubernetes.
Break down high-level system designs into well-defined, deliverable tasks with realistic estimates.
Collaborate with cross-functional teams in a fast-paced and distributed environment across the US and Europe.
Drive automation, observability and monitoring to improve reliability, performance and operational efficiency.
Support knowledge transfer and ownership handover as part of the planned transition to the consuming team.
Required Qualifications
Strong professional experience with Python and SQL.
Hands-on experience with AWS, specifically Redshift, EMR and ECS. AWS experience is mandatory (other cloud providers are not considered equivalent for this role).
Proven experience building and operating both streaming and batch data pipelines.
Professional experience with Apache Airflow, Docker and Kubernetes.
Ability to translate high-level system designs into actionable technical tasks and realistic estimates.
Comfortable working in dynamic and fast-paced environments and in distributed teams.
Strong interest in automation and monitoring.
Strong hands-on experience with Apache Spark.
Senior-level profile with strong autonomy, communication skills and ability to work effectively in distributed teams.
Proven ability to transfer knowledge and support ownership handovers.
Fluent or professional working proficiency in English (both written and spoken).
Nice to Have
Previous experience in the telecom industry.
Experience with machine learning systems and/or event-driven architectures.
Experience with Apache Iceberg.
(*) SOUTHWORKS only hires individuals from countries that are not blocked or sanctioned by the United States, including those identified by the United States Office of Foreign Assets Control (OFAC).