Agile Dream Team

Data Engineer

Posted 17 Days Ago
Remote
Hiring Remotely in USA
Mid level

At Agile Dream Team, we harness the power of data to drive intelligent business decisions. We are looking for a highly skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure for AI-driven applications.

Learn more about us at: www.agiledreamteam.com


Role Overview

As a Data Engineer, you will be responsible for building and optimizing data pipelines, ensuring data quality, and enabling real-time and batch processing for analytics and AI models. You will work closely with Data Scientists, AI Engineers, and Software Developers to develop scalable data solutions that support business intelligence and machine learning applications.
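
For illustration, the kind of batch pipeline work described above might look like the minimal PySpark sketch below; the bucket paths, column names, and validation rule are hypothetical assumptions rather than anything specified in this posting.

    # Minimal sketch of a daily batch ETL job in PySpark (illustrative only).
    # The S3 paths, column names, and validation rule are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()

    # Extract: read raw JSON events landed by an upstream ingestion process.
    raw = spark.read.json("s3://example-bucket/raw/orders/ds=2024-01-01/")

    # Transform: normalize types, drop malformed rows, aggregate per customer.
    clean = (
        raw.withColumn("order_ts", F.to_timestamp("order_ts"))
           .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
    )
    daily = (
        clean.groupBy("customer_id")
             .agg(F.count("order_id").alias("orders"),
                  F.sum("amount").alias("revenue"))
             .withColumn("ds", F.lit("2024-01-01"))
    )

    # Data-quality gate: fail the run rather than load an empty partition.
    if daily.count() == 0:
        raise ValueError("No valid orders after validation; aborting load")

    # Load: write partitioned Parquet for downstream analytics and ML features.
    daily.write.mode("overwrite").partitionBy("ds").parquet(
        "s3://example-bucket/curated/daily_customer_revenue/"
    )

In practice, a job like this would be scheduled and monitored from an orchestrator such as Airflow, one of the tools this role calls out.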


Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
  • Build data architectures that support batch and real-time data processing.
  • Optimize data storage, retrieval, and performance for analytics and AI applications.
  • Work with big data processing frameworks such as Apache Spark, Apache Flink, or Kafka.
  • Implement data governance, security, and compliance standards.
  • Ensure high data quality through data validation, monitoring, and anomaly detection.
  • Automate data ingestion, transformation, and processing for AI/ML workflows.
  • Deploy and manage data solutions in AWS, Azure, or GCP cloud environments.
  • Collaborate with Data Scientists and AI Engineers to enable data-driven AI models.
  • Utilize SQL and NoSQL databases for efficient storage and retrieval of large datasets.

Required Skills & Experience

  • Proficiency in data engineering frameworks: Apache Spark, Hadoop, Airflow, dbt.
  • Strong knowledge of SQL, NoSQL, and data modeling techniques.
  • Experience in ETL/ELT pipeline development using Python, Scala, or Java.
  • Hands-on experience with cloud data platforms (AWS Redshift, BigQuery, Snowflake, Databricks).
  • Expertise in real-time streaming technologies (Apache Kafka, Kinesis, Pulsar).
  • Experience with data warehouse and lakehouse architectures.
  • Strong understanding of data partitioning, indexing, and performance tuning.
  • Familiarity with containerized deployments (Docker, Kubernetes) for data pipelines.
  • Experience in CI/CD automation for data workflows.


Preferred Qualifications

  • Experience with machine learning data pipelines and feature engineering.
  • Hands-on knowledge of data lake technologies (Delta Lake, Iceberg, Hudi).
  • Familiarity with Terraform, Ansible, or other infrastructure-as-code (IaC) tools.
  • Understanding of graph databases (Neo4j, ArangoDB) and time-series databases.
  • Experience in data observability, lineage tracking, and governance.

Why Join Us?

  • Work with cutting-edge data technologies in a forward-thinking AI/ML company.
  • 100% remote role with a flexible schedule.
  • Opportunities for growth and continuous learning in Data Engineering and AI.
  • Engage in high-impact data projects that power real-world AI applications.
  • Competitive salary.

Ready to shape the future of data engineering? Apply now!

Top Skills

Airflow
Apache Kafka
Spark
AWS Redshift
BigQuery
CI/CD
Databricks
dbt
Docker
Hadoop
Java
Kinesis
Kubernetes
NoSQL
Pulsar
Python
Scala
Snowflake
SQL
