
Oscilar

Sr. Data Engineer

Reposted 3 Days Ago
Remote
2 Locations
Senior level

Shape the future of trust in the age of AI
At Oscilar, we're building the most advanced AI Risk Decisioning™ Platform. Banks, fintechs, and digitally native organizations rely on us to manage their fraud, credit, and compliance risk with the power of AI. If you're passionate about solving complex problems and making the internet safer for everyone, this is your place.

Why join us:
  • Mission-driven teams: Work alongside industry veterans from Meta, Uber, Citi, and Confluent, all united by a shared goal to make the digital world safer.

  • Ownership and impact: We believe in extreme ownership. You'll be empowered to take responsibility, move fast, and make decisions that drive our mission forward.

  • Innovate at the cutting edge: Your work will shape how modern finance detects fraud and manages risk.

Job Description

As a Senior Data Engineer at Oscilar, you will be responsible for designing, building, and maintaining the data infrastructure that powers our AI-driven decisioning and risk management platform. You will collaborate closely with cross-functional teams, ensuring the delivery of highly reliable, low-latency, and scalable data pipelines and storage solutions that support real-time analytics and mission-critical ML/AI models.

Responsibilities
  • Architect and implement scalable ETL and data pipelines spanning ClickHouse, Postgres, Athena, and diverse cloud-native sources to support real-time risk management and advanced analytics for AI-driven decisioning.

  • Design, develop, and optimize distributed data storage solutions to ensure both high performance (low latency, high throughput) and reliability at scale—serving mission-critical models for fraud detection and compliance.

  • Drive schema evolution, data modeling, and advanced optimizations for analytical and operational databases, including sharding, partitioning, and pipeline orchestration (batch, streaming, CDC frameworks).

  • Own the end-to-end data flow: integrate multiple internal and external data sources, enforce data validation and lineage, automate and monitor workflow reliability (CI/CD for data, anomaly detection, etc.).

  • Collaborate cross-functionally with engineers, product managers, and data scientists to deliver secure, scalable solutions that enable fast experimentation and robust operationalization of new ML/AI models.

  • Champion radical ownership—identify opportunities, propose improvements, and implement innovative technical and process solutions within a fast-moving, remote-first culture.

  • Mentor and upskill team members, cultivate a learning environment, and contribute to a collaborative, mission-oriented culture.
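To give a flavor of the validation and quarantine work described in the responsibilities above, here is a minimal, hypothetical sketch in Python. The record schema (`event_id`, `account_id`, `amount`, `ts`) and the helper names are illustrative assumptions, not Oscilar's actual implementation:

```python
# Hypothetical record schema for illustration only; a real pipeline would
# load its schema and rules from configuration or a schema registry.
REQUIRED_FIELDS = {"event_id", "account_id", "amount", "ts"}


def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors for one pipeline record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    amount = record.get("amount")
    if amount is not None and (not isinstance(amount, (int, float)) or amount < 0):
        errors.append("amount must be a non-negative number")
    return errors


def partition_batch(batch: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split a batch into loadable records and quarantined (record, errors) pairs."""
    good, quarantined = [], []
    for rec in batch:
        errs = validate_record(rec)
        if errs:
            quarantined.append((rec, errs))
        else:
            good.append(rec)
    return good, quarantined
```

In a production setting this kind of gate would typically run before loading into an analytical store, with quarantined records routed to a dead-letter table and surfaced through monitoring.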

Qualifications
  • 5+ years in data engineering (or equivalent), including architecting and operating production ETL/ELT pipelines for real-time, high-volume analytic platforms.

  • Deep proficiency with ClickHouse, Postgres, Athena, and distributed data systems (Kafka, cloud-native stores); proven experience with both batch and streaming pipeline design.

  • Advanced programming in Python and SQL, with bonus points for Java; expertise in workflow orchestration (Airflow, Step Functions), CI/CD, and automated testing for data.

  • Experience in high-scale, low-latency environments; understanding of security, privacy, and compliance requirements for financial-grade platforms.

  • Strong communication, business alignment, and documentation abilities—capable of translating complex tech into actionable value for customers and stakeholders.

  • Alignment with Oscilar’s values: customer obsession, radical ownership, bold vision, efficient growth, and unified teamwork with a culture of trust and excellence.

Nice-to-have
  • Experience integrating Kafka with analytics solutions like ClickHouse.

  • Knowledge of event-driven architecture and streaming patterns like CQRS and event sourcing.

  • Hands-on experience with monitoring tools (e.g., Prometheus, Grafana, Kafka Manager).

  • Experience automating infrastructure with tools like Terraform or CloudFormation.

  • Proficiency with Postgres, Redis, ClickHouse, and DynamoDB. Experience with data modeling, query optimization, and high-transaction databases.

  • Familiarity with encryption, role-based access control, and secure API development.
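Event sourcing, mentioned in the nice-to-haves above, can be illustrated with a small pure-Python sketch. The account-balance aggregate here is a hypothetical example, not anything specific to Oscilar's platform:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Event:
    """An immutable fact appended to the event log."""
    kind: str    # e.g. "deposit" or "withdrawal"
    amount: int


def apply(balance: int, event: Event) -> int:
    """Apply one event to the current state."""
    if event.kind == "deposit":
        return balance + event.amount
    if event.kind == "withdrawal":
        return balance - event.amount
    raise ValueError(f"unknown event kind: {event.kind}")


def replay(events: list[Event], initial: int = 0) -> int:
    """Rebuild current state by replaying the event log from scratch."""
    state = initial
    for ev in events:
        state = apply(state, ev)
    return state
```

The core idea is that the append-only log of events is the source of truth; any read model (the CQRS "query" side) can be rebuilt by replaying it.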

Benefits
  • Compensation: Competitive salary and equity packages, including a 401(k)

  • Flexibility: Remote-first culture — work from anywhere

  • Health: 100% employer-covered comprehensive health, dental, and vision insurance with a top-tier plan for you and your dependents (US)

  • Balance: Unlimited PTO policy

  • Technical: AI-first company; both co-founders are engineers at heart, and over 50% of the company is in engineering and product

  • Culture: Family-friendly environment; regular team events and offsites

  • Development: Unparalleled learning and professional development opportunities

  • Gear: Home office setup assistance

  • Impact: Making the internet safer by protecting online transactions

Top Skills

Airflow
Athena
ClickHouse
CloudFormation
DynamoDB
Grafana
Java
Kafka
Postgres
Prometheus
Python
Redis
SQL
Terraform


