
G2i

Data Engineer

Reposted 4 Days Ago
Remote
Hiring Remotely in USA
Mid level
Data Engineer (Python Developer Background Preferred)

Position: Data Engineer (with Python development background)
Location: Remote
Team: Technology / Data Platform

About the Role:
Our client is on the hunt for a Data Engineer who brings strong Python development experience and is eager to build data pipelines from scratch. This role is pivotal in shaping the data infrastructure that powers an AI-driven insurance analytics and risk exchange platform.

Key Responsibilities:

  • Design, develop, and maintain robust, metadata-driven data pipelines (batch and real-time).

  • Ingest, transform, and cleanse varied datasets—from internal sources, partners, and third-party providers.

  • Develop and support cloud-based infrastructure, APIs, and data services that drive Accelerant’s SaaS platform and analytics.

  • Ensure data quality through effective monitoring, metrics, validation, and DataOps practices.

  • Collaborate closely with product managers, data scientists, analysts, and engineers to understand data needs and to deliver scalable, performant solutions.

  • Continuously refine pipelines for better performance, scalability, and reliability.
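
Note: to make the metadata-driven pipeline bullet above concrete, here is a purely illustrative, hypothetical Python sketch of one batch step, in which a small metadata dict drives column renaming and a required-fields cleanse over an inline CSV sample. The dataset name, fields, and sample feed are invented for the example and are not part of the role description.

# Hypothetical sketch only: dataset name, fields, and the inline CSV
# feed below are invented for illustration; a real pipeline would read
# this metadata from a catalog and write results to a warehouse.
import csv
import io

# Metadata describing one dataset: source column renames and the
# fields that must be non-empty after cleansing.
DATASET_METADATA = {
    "name": "policies",
    "rename": {"Policy ID": "policy_id", "Premium ($)": "premium_usd"},
    "required_fields": ["policy_id", "premium_usd"],
}

# Inline sample standing in for a partner CSV feed.
RAW_FEED = """Policy ID,Premium ($)
P-1001,1250.00
P-1002,
"""


def run_batch_step(metadata: dict, raw_csv: str) -> list[dict]:
    """Ingest, rename columns per metadata, and drop incomplete rows."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleansed = []
    for row in reader:
        renamed = {metadata["rename"].get(k, k): v for k, v in row.items()}
        if all(renamed.get(field) for field in metadata["required_fields"]):
            cleansed.append(renamed)
    return cleansed


if __name__ == "__main__":
    rows = run_batch_step(DATASET_METADATA, RAW_FEED)
    # Prints: policies: kept 1 of 2 rows (the row missing a premium is dropped)
    print(f"{DATASET_METADATA['name']}: kept {len(rows)} of 2 rows")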

Qualifications:

Required:

  • Proven experience as a Python Developer (3–5 years or more), with a clear progression into data engineering.

  • Hands-on experience building data pipelines from scratch using Python.

  • Strong SQL skills and familiarity with modern data stack tools.

  • Experience with cloud platforms (preferably AWS), consistent with Accelerant’s tech environment.

  • Solid understanding of DataOps methodologies and tools.
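
Note: as a hedged illustration of the SQL and DataOps items above, the hypothetical snippet below computes a row count and a null rate with the standard-library sqlite3 module and fails if the null rate crosses a threshold. The claims table, column, and threshold are made up; in practice the same query pattern would run against the warehouse (e.g. Snowflake) and feed monitoring rather than an assert.

# Hypothetical data-quality check; table, column, and threshold are
# invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE claims (claim_id TEXT, loss_amount REAL);
    INSERT INTO claims VALUES ('C-1', 5000.0), ('C-2', NULL), ('C-3', 750.0);
    """
)

# One query returns both the row count and the null rate for the column
# under test; a DataOps job would record these as metrics over time.
total, null_rate = conn.execute(
    """
    SELECT COUNT(*),
           AVG(CASE WHEN loss_amount IS NULL THEN 1.0 ELSE 0.0 END)
    FROM claims
    """
).fetchone()

MAX_NULL_RATE = 0.50  # illustrative threshold
print(f"rows={total}, null_rate={null_rate:.2f}")  # rows=3, null_rate=0.33
assert null_rate <= MAX_NULL_RATE, "loss_amount null rate above threshold"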

Nice to Have:

  • Familiarity with Snowflake, dbt, or similar modern data warehousing and transformation tools.

  • Experience managing structured and unstructured datasets.

  • Exposure to AI/ML workflows or infrastructure.

  • Background in real-time data ingestion or API-based data services.

Why You’ll Love Working Here:

  • Join a remote-first, high-trust team where engineers, data scientists, and seasoned insurance experts collaborate closely and ship features rapidly.

  • Be part of a mission-driven platform that bridges technological innovation with insurance industry transformation.

  • Gain autonomy, equity, flexible schedules, and the chance to impact small and midsize businesses through smarter insurance mechanisms.

Ready to Apply?
Please send your resume and a brief cover letter highlighting your Python background and pipeline-building experience.

Top Skills

AWS
dbt
Python
Snowflake
SQL


