
Timescale

Data Engineer

Posted 8 Days Ago
In-Office or Remote
3 Locations
Mid level
As a Data Engineer, you will design, build, and maintain data infrastructure, optimize database performance, and ensure data accessibility for analytics and business insights.

At TigerData, formerly Timescale, we empower developers and businesses with the fastest PostgreSQL platform designed for transactional, analytical, and agentic workloads. Trusted globally by thousands of organizations, TigerData accelerates real-time insights, drives intelligent applications, and powers critical infrastructure at scale. As a globally distributed, remote-first team committed to direct communication, accountability, and collaborative excellence, we're shaping the future of data infrastructure, built for speed, flexibility, and simplicity.

TigerData is looking for a skilled and innovative Data Engineer with expertise in building scalable data infrastructure and a passion for enabling data-driven decision making across the organization. You will play a crucial role in designing, building, and maintaining the data systems that power our analytics, product insights, and business intelligence initiatives for Timescale, our open-source database for real-time analytics and time series at scale.

Data Engineers at Timescale are essential for ensuring our teams have reliable, accurate, and accessible data to make informed decisions. You'll design and implement robust ETL/ELT processes, manage data infrastructure, optimize database performance, and collaborate closely with Product, Finance, Engineering, and Leadership teams to enable self-service analytics and data democratization.

You'll succeed at Timescale if you are systematic, detail-oriented, performance-focused, a collaborative problem-solver, excited by technical challenges and scale, and passionate about building reliable data infrastructure that empowers teams to extract insights from complex datasets.

Timescale is a remote company with team members around the world, and English language fluency is a requirement. The preferred candidate for this role will be based in the United States or Europe.

Responsibilities:

  • Design, build, and maintain scalable data pipelines and ETL/ELT processes to ingest, transform, and deliver data from various sources including application databases, event streams, and third-party APIs.

  • Architect and optimize data warehouse solutions, ensuring efficient storage, retrieval, and processing of large-scale time-series and analytical datasets.

  • Implement and maintain data quality frameworks, monitoring systems, and alerting mechanisms to ensure data accuracy, completeness, and reliability across all data systems.

  • Collaborate with Product Managers, Marketing, Finance, and Sales to understand data requirements and build infrastructure that enables self-service analytics and advanced data exploration.

  • Optimize database performance, including query optimization, indexing strategies, and capacity planning for both operational and analytical workloads.

  • Build and maintain data infrastructure using cloud platforms (AWS, GCP, Azure) and modern data stack tools, ensuring scalability, security, and cost-effectiveness.

  • Develop and maintain data documentation, schemas, and governance processes to ensure data discoverability and proper usage across teams.

  • Work closely with Engineering teams to implement event tracking, logging, and instrumentation that captures meaningful product and user behavior data.

  • Support real-time data processing requirements and streaming analytics use cases, leveraging Timescale's time-series capabilities.

  • Champion data engineering best practices, including version control, testing, monitoring, and CI/CD for data pipelines.

Requirements:

  • 4+ years of proven experience as a Data Engineer, Analytics Engineer, or similar role, with significant experience building and maintaining production data pipelines.

  • Expert proficiency in SQL for complex data transformations, performance optimization, and working with large datasets. Strong PostgreSQL experience is highly preferred.

  • Proficiency in Python or another programming language for data pipeline development, automation, and scripting.

  • Experience with modern data stack tools such as dbt, Airflow, Dagster, or similar orchestration and transformation frameworks.

  • Strong experience with cloud data platforms (AWS Redshift/RDS, Google BigQuery/Cloud SQL, Azure Synapse, or Snowflake) and their associated data services.

  • Understanding of data modeling concepts, dimensional modeling, and database design principles for both OLTP and OLAP systems.

  • Experience with data visualization and BI tools (Metabase, Tableau, Looker) and building data marts for analytical consumption.

  • Strong understanding of data governance, security, and privacy principles, including experience with data lineage and cataloging tools.

  • Excellent problem-solving skills with ability to troubleshoot complex data issues, optimize performance bottlenecks, and scale systems efficiently.

  • Experience working in agile, cross-functional teams with strong communication skills for collaborating with both technical and non-technical stakeholders.

  • Understanding of software engineering best practices including version control, testing, code reviews, and CI/CD pipelines.

  • Experience with time-series databases, developer tooling, or data infrastructure products is a significant advantage.

  • Bachelor's degree in Computer Science, Engineering, Mathematics, or related technical field, or equivalent practical experience.

Our Commitment:
  • We respond to every applicant.

  • We review applications fairly and objectively, and shortlist based on relevant skills and experience.

  • We ensure clear and timely communication throughout your candidate journey.

  • We maintain a rigorous interview process with a high bar, designed to give you the opportunity to meet various team members you'll collaborate with across our organization.

About TigerData 🐯

TigerData, formerly Timescale, sets the standard as the fastest PostgreSQL platform for modern workloads. Trusted by more than 2,000 customers across 25+ countries and powering over 3 million active databases, we enable developers and organizations to build real-time, intelligent applications at scale. Backed by $180 million from top-tier investors, TigerData is building the new standard for data infrastructure, built on PostgreSQL, designed for the future.

👉 Want to get a feel for how we work and what we value? Check out our blog post: What It Takes to Thrive at TigerData

We embrace diversity, curiosity, and collaboration. Whether debating the perfect chicken nugget crunch 🍗, sharing workout routines 💪, or discussing your favorite plants 🌱 and pets 🐾, you'll find your community here.

Our Tech Stack:

We don't require previous experience with our tech stack, but enthusiasm for learning is key. Our technologies include PostgreSQL, Tiger Cloud, AWS, Go, Docker, Kubernetes, Python, and innovative features like Hypertables, Hypercore, vector search, and real-time analytics.

Learn more at www.tigerdata.com or follow us on Twitter @TigerDatabase

What We Offer:

(Please note that benefits may vary by country.)

  • Flexible PTO and comprehensive family leave

  • Fridays off in August 😎

  • Fully remote opportunities globally

  • Stock options for long-term growth

  • Monthly WiFi stipend

  • Professional development and educational resources 📚

  • Premium insurance options for you and your family (US-based employees)

Ready to join the future of PostgreSQL? We can’t wait to meet you. 🚀🐯

Top Skills

Airflow
AWS
Azure
dbt
Docker
GCP
Kubernetes
Postgres
Python
SQL


