Alpaca

Senior Analytics Engineer

Posted 17 Days Ago
Remote
21 Locations
Mid level
As a Senior Analytics Engineer, you'll lead the transformation layer of Alpaca's data platform, integrating diverse data sources and ensuring high standards in data modeling and accessibility for various stakeholders.

Who We Are:

Alpaca is a California-headquartered (US) brokerage infrastructure technology company and self-clearing broker-dealer, delivering execution and custody solutions for stocks, ETFs, options, cryptocurrencies, and more. The company has raised over $170 million in funding. Through our subsidiaries, Alpaca is a licensed financial services company in multiple countries, and we serve hundreds of financial institutions globally, such as broker-dealers, investment advisors, hedge funds, and crypto exchanges.

Alpaca’s globally distributed team members bring diverse experience as engineers, traders, and brokerage professionals to achieve our Mission of opening financial services to everyone on the planet. We are also deeply committed to open-source contributions and to fostering a vibrant community, and we will continue to enhance and improve our award-winning, developer-friendly API and the infrastructure behind it.

Our Team Members:

We’re a team of 150+ globally distributed members who love working from our favorite places worldwide. Our team spans the USA, Canada, Japan, Hungary, Nigeria, Brazil, the United Kingdom, and more!

We’re looking for candidates eager to join Alpaca’s growing organization who are excited about our Mission of “Open financial services to everyone on the planet” and who share our Values of “Stay Curious,” “Have Empathy,” and “Be Accountable.”


Your Role:

We are looking for a Senior Analytics Engineer to lead the vision for the transformation layer of our data platform. This platform integrates data from transactional databases backing core Alpaca applications, API logs, CRMs, payment systems, and marketing platforms. We process hundreds of millions of events daily, a number that is rapidly growing as we onboard new customers.

We prioritize open-source solutions and use Google Cloud Platform (GCP) as the foundation of our data infrastructure. Our transformation layer, powered by dbt running through the Trino query engine, builds data models that are delivered to end users via BI tools, reports, and reverse ETL. Our stakeholders range from finance to operations, customer success, and the executive team, necessitating varying data availability (from monthly to near-real-time) and integrity (up to cent-level precision). You will be working alongside our Data Engineers who drive the ingress of the data and the infrastructure of the Lakehouse and related services, and Data Scientists who consume the data models and augment them with new features.
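
For illustration, a minimal dbt model in this layer might look like the sketch below. All names in it (fct_daily_trade_summary, stg_trade_events, and the columns) are hypothetical; it assumes the dbt-trino adapter’s incremental materialization, Trino SQL syntax, and a decimal column for cent-level precision:

    -- models/marts/fct_daily_trade_summary.sql  (hypothetical model)
    {{ config(
        materialized='incremental',
        unique_key=['trade_date', 'symbol']
    ) }}

    select
        cast(event_timestamp as date)         as trade_date,
        symbol,
        count(*)                              as trade_count,
        -- decimal preserves cent-level precision; a double would not
        sum(cast(notional as decimal(38, 2))) as notional_usd
    from {{ ref('stg_trade_events') }}
    {% if is_incremental() %}
      -- on incremental runs, reprocess only a trailing window of events
      where event_timestamp >= current_date - interval '3' day
    {% endif %}
    group by 1, 2

An incremental materialization like this keeps near-real-time models cheap to refresh, while a full refresh remains available for backfills.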

Our team is 100% distributed and remote.

Responsibilities:

  • Develop scalable patterns in the transformation layer to support consistent BI tool integration across business verticals.
  • Ensure data discoverability and maintain high standards of change management, including model testing and data monitoring (see the test sketch after this list), as Alpaca’s products evolve.
  • Seamlessly integrate the Lakehouse with BI tooling to create repeatable ways of surfacing metrics to end users.
  • Set a high standard for development practices, ensuring quality in new data models and their orchestration.
  • Collaborate closely with finance, operations, customer success, and marketing teams to meet data modeling needs.
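
On the change-management point above, model testing in dbt can be as lightweight as a singular test: a SQL file under tests/ that passes only when the query returns zero rows. The sketch below is hypothetical (fct_positions and stg_ledger_entries are invented model names) but shows the cent-level reconciliation idea:

    -- tests/assert_positions_reconcile_to_cents.sql
    -- Singular dbt test: passes only if this query returns zero rows.
    select
        p.account_id,
        p.position_value_usd,
        l.ledger_value_usd
    from {{ ref('fct_positions') }} p
    join (
        select account_id, sum(amount_usd) as ledger_value_usd
        from {{ ref('stg_ledger_entries') }}
        group by account_id
    ) l on p.account_id = l.account_id
    -- flag any account whose modeled value drifts from the ledger by a cent or more
    where abs(p.position_value_usd - l.ledger_value_usd) >= 0.01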

Must-Haves:

  • 3+ years of experience in data engineering or data analytics, focusing on the transformation component of the ELT process.
  • Experience building scalable transformation layers, preferably through formalized SQL models (e.g., dbt).
  • Experience in Python for transformations beyond SQL.
  • Experience with CI/CD and code version control.
  • Strong hands-on experience with relational databases and open table formats (e.g., Postgres, Iceberg), including query optimization (see the query sketch after this list).
  • Ability to adapt quickly in a fast-paced environment and tailor solutions to evolving business needs.
  • Experience with ETL technologies for ingestion (e.g., Airbyte) and orchestration (e.g., Airflow).
  • Experience working in a cloud environment (AWS or GCP).
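
On the query-optimization point above, much of the day-to-day work is about partition pruning rather than exotic tuning. The sketch below assumes a hypothetical Iceberg table (lakehouse.trading.trade_events) partitioned by day on event_timestamp and queried through Trino:

    -- Filtering on the partition column lets Trino prune whole day-partitions
    -- instead of scanning the table's full history.
    select symbol, count(*) as fills
    from lakehouse.trading.trade_events
    where event_timestamp >= current_date - interval '7' day
    group by symbol;

    -- EXPLAIN shows whether the predicate was pushed down to the table scan.
    explain
    select count(*)
    from lakehouse.trading.trade_events
    where event_timestamp >= current_date - interval '7' day;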

How We Take Care of You:

  • Competitive Salary & Stock Options
  • Benefits: Health benefits start on day 1. In the US, this includes Medical, Dental, and Vision. In Canada, this includes supplemental health care. In Japan, you are offered local benefits. Internationally, this includes a stipend to offset medical costs.
  • New Hire Home-Office Setup: One-time USD $500
  • Monthly Stipend: USD $150 per month via a Brex Card
  • Work with awesome, hard-working people, super smart and cool clients, and innovative partners from around the world

Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.


Top Skills

Airbyte
Airflow
CI/CD
dbt
Google Cloud Platform
Postgres
Python
Trino
