
SMASH

Staff Data Engineer P-133

Reposted 6 Days Ago
Remote
Hiring Remotely in United States
Senior level

Who we are at SMASH
We believe in long-lasting relationships with our talent. We invest time in getting to know them and understanding what they seek as their next professional step.

We aim to find the perfect match. As agents, we pair our talent with our US clients not only by technical skill but also by cultural fit. Our core competency is finding the right talent fast.

This position is remote within the United States. You must have U.S. citizenship or a valid U.S. work permit to apply for this role.

Role summary
You will design and deliver scalable, GCP-native data solutions that power machine learning and analytics initiatives. This role focuses on building high-quality, domain-driven data products and decentralized data infrastructure that enable rapid iteration, measurable outcomes, and long-term value creation.

Responsibilities

  • Design and implement a scalable, GCP-native data strategy aligned with machine learning and analytics initiatives.

  • Build, operate, and evolve reusable data products that deliver compounding business value.

  • Architect and govern squad-owned data storage strategies using BigQuery, AlloyDB, ODS, and transactional systems.

  • Develop high-performance data transformations and analytical workflows using Python and SQL.

  • Lead ingestion and streaming strategies using Pub/Sub, Datastream (CDC), and Cloud Dataflow (Apache Beam).

  • Orchestrate data workflows using Cloud Composer (Airflow) and manage transformations with Dataform.

  • Modernize legacy data assets and decouple procedural logic from operational databases into analytical platforms.

  • Apply Dataplex capabilities to enforce data governance, quality, lineage, and discoverability.

  • Collaborate closely with engineering, product, and data science teams in an iterative, squad-based environment.

  • Drive technical decision-making, resolve ambiguity, and influence data architecture direction.

  • Ensure data solutions are secure, scalable, observable, and aligned with best practices.
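To make the ingestion and streaming responsibilities above concrete, here is a minimal, stdlib-only Python sketch of the merge step a CDC pipeline (e.g. Datastream into BigQuery) typically performs: collapsing a stream of change events to the latest state per primary key. The event shape and field names (`id`, `updated_at`, `op`) are illustrative assumptions, not taken from the posting.

```python
from operator import itemgetter

def latest_cdc_state(events):
    """Collapse a stream of CDC events to the latest row per primary key.

    Each event is a dict with 'id', 'updated_at', and 'op' ('UPSERT' or
    'DELETE'). This mirrors, in miniature, the dedup/merge step a
    Datastream -> BigQuery pipeline performs before serving analytics.
    """
    latest = {}
    # Process events in timestamp order so later changes win.
    for event in sorted(events, key=itemgetter("updated_at")):
        if event["op"] == "DELETE":
            latest.pop(event["id"], None)  # tombstone removes the row
        else:
            latest[event["id"]] = event
    return latest

events = [
    {"id": 1, "updated_at": 1, "op": "UPSERT", "name": "a"},
    {"id": 1, "updated_at": 3, "op": "UPSERT", "name": "b"},
    {"id": 2, "updated_at": 2, "op": "UPSERT", "name": "c"},
    {"id": 2, "updated_at": 4, "op": "DELETE"},
]
state = latest_cdc_state(events)
# id 1 resolves to its newest version; id 2 ends deleted
```

In production this logic would live in a BigQuery `MERGE` or a Dataflow (Apache Beam) stage rather than in-process Python; the sketch only illustrates the semantics.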

Requirements – Must-haves

  • 8+ years of professional experience in data engineering or a related discipline.

  • Expert-level proficiency in Python and SQL for scalable data transformation and analysis.

  • Deep expertise with Google Cloud Platform data services, especially BigQuery.

  • Hands-on experience with AlloyDB (PostgreSQL) and Cloud SQL (PostgreSQL).

  • Strong understanding of domain-driven data design and data product thinking.

  • Proven experience architecting ingestion pipelines using Pub/Sub and Datastream (CDC).

  • Expertise with Dataform, Cloud Composer (Airflow), and Cloud Dataflow (Apache Beam).

  • Experience modernizing legacy data systems and optimizing complex SQL/procedural logic.

  • Ability to work independently and lead initiatives with minimal guidance.

  • Strong critical thinking, problem-solving, and decision-making skills.
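The "modernizing legacy data systems and optimizing complex SQL/procedural logic" requirement often amounts to replacing row-at-a-time procedural logic (cursor loops in stored procedures) with set-based equivalents. A toy Python sketch of that refactor, with invented table and column names for illustration:

```python
from collections import defaultdict

customers = [{"id": 1, "region": "east"}, {"id": 2, "region": "west"}]
orders = [
    {"customer_id": 1, "amount": 10},
    {"customer_id": 2, "amount": 5},
    {"customer_id": 1, "amount": 7},
]

def revenue_by_region_procedural(orders, customers):
    """Row-by-row style: one linear lookup per order, mimicking a
    cursor loop in a legacy stored procedure (O(n*m))."""
    totals = {}
    for order in orders:
        region = next(c["region"] for c in customers
                      if c["id"] == order["customer_id"])
        totals[region] = totals.get(region, 0) + order["amount"]
    return totals

def revenue_by_region_setbased(orders, customers):
    """Set-based style: build the join key once, then aggregate —
    the shape a JOIN ... GROUP BY takes when the logic moves into
    an analytical engine like BigQuery (O(n+m))."""
    region_of = {c["id"]: c["region"] for c in customers}
    totals = defaultdict(int)
    for order in orders:
        totals[region_of[order["customer_id"]]] += order["amount"]
    return dict(totals)
```

Both functions produce identical results on the sample data; the point of the refactor is that the second shape translates directly into declarative SQL and scales with data volume.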

Nice-to-haves (optional)

  • Experience applying Dataplex for data governance and quality management.

  • Exposure to proprietary SQL dialects (T-SQL, PL/pgSQL).

  • Experience supporting machine learning or advanced analytics workloads.

  • Background working in decentralized, squad-based or product-oriented data teams.

  • Experience influencing technical direction across multiple teams or domains.

Top Skills

AlloyDB
Apache Beam
BigQuery
Cloud Composer
Cloud Dataflow
Cloud SQL
Dataform
Dataplex
Datastream
Google Cloud Platform
Pub/Sub
Python
SQL


