
Tiger Analytics

Principal Data Engineer (Azure)

In-Office or Remote
Hiring Remotely in Toronto, ON
Senior level
As a Principal Data Engineer, you will design scalable data ingestion pipelines, implement data lakes, and collaborate with various teams to deliver analytical solutions using Azure and big data technologies.
Description

Tiger Analytics is a global AI and analytics consulting firm. With data and technology at the core of our solutions, we solve problems that ultimately impact the lives of millions globally. Our culture is modeled around expertise and respect, with a team-first mindset. We are headquartered in Silicon Valley, with delivery centers across the globe, offices in multiple cities across India, the US, the UK, Canada, and Singapore, and a substantial remote global workforce.

We’re Great Place to Work-Certified™. Working at Tiger Analytics, you’ll be at the heart of an AI revolution. You’ll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.

Requirements

Curious about the role? Here is what your typical day would look like.

As a Principal Data Engineer (Azure), you will need hands-on experience with Azure as the cloud platform and with Databricks, along with some exposure to data modelling. You will build and learn about a variety of analytics solutions and platforms (data lakes, modern data platforms, data fabric solutions, etc.) using different open-source, big data, and cloud technologies on Microsoft Azure.

● Design and build scalable, metadata-driven data ingestion pipelines for batch and streaming datasets (see the illustrative sketch after this list)
● Conceptualize and execute high-performance data processing for structured and unstructured data, including data harmonization
● Schedule, orchestrate, and validate pipelines
● Design exception handling and log monitoring for debugging
● Ideate with your peers to make decisions on the tech stack and tooling
● Interact and collaborate with multiple teams (Consulting, Data Science, and App Dev) and various stakeholders to meet deadlines and bring analytical solutions to life
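
Purely as illustration of what a metadata-driven batch ingestion step could look like in PySpark on Databricks (this sketch is not part of the posting): each dataset is described by a config record rather than hard-coded logic, so new feeds are onboarded by adding an entry, not code. The storage account, container names, and config entries below are hypothetical, and the sketch assumes a runtime where Delta Lake is available.

```python
# Hypothetical sketch of metadata-driven batch ingestion on Databricks/PySpark.
# Paths, container names, and the config list are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# "Metadata-driven": every feed is described by configuration, not bespoke code.
ingestion_config = [
    {"name": "orders", "format": "csv",
     "source": "abfss://raw@examplelake.dfs.core.windows.net/orders/",
     "target": "abfss://bronze@examplelake.dfs.core.windows.net/orders/"},
    {"name": "customers", "format": "json",
     "source": "abfss://raw@examplelake.dfs.core.windows.net/customers/",
     "target": "abfss://bronze@examplelake.dfs.core.windows.net/customers/"},
]

for cfg in ingestion_config:
    df = (
        spark.read.format(cfg["format"])
        .option("header", "true")   # relevant for CSV; ignored by other readers
        .load(cfg["source"])
    )
    (
        df.write.format("delta")    # land every feed as a Delta Lake table
        .mode("append")
        .save(cfg["target"])
    )
```

In practice the config would usually live in a control table or files rather than inline, but the loop-over-metadata pattern is the point of the sketch.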

What do we expect?

● Experience in implementing data lakes with technologies like Azure Data Factory (ADF), PySpark, Databricks, ADLS, and Azure SQL Database
● A comprehensive foundation and working knowledge of Azure Synapse Analytics, Event Hubs and Stream Analytics, Cosmos DB, and Purview
● A passion for writing high-quality code that is modular, scalable, and free of bugs, with strong debugging skills in SQL, Python, or Scala/Java
● Enthusiasm for collaborating with various stakeholders across the organization and taking complete ownership of deliverables
● Experience using big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4j, and Elasticsearch (a minimal streaming sketch follows this list)
● A solid understanding of different file formats like Delta Lake, Avro, Parquet, JSON, and CSV
● Good knowledge of designing and building REST APIs, along with hands-on experience on data lake or lakehouse projects
● Experience supporting BI and Data Science teams in consuming data in a secure and governed manner
● Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE) are a valuable addition
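
As a hedged illustration of the streaming side of such pipelines (again, not taken from the posting): a Spark Structured Streaming job that reads a Kafka topic, for example Azure Event Hubs via its Kafka-compatible endpoint, and lands the records as a Delta table. The broker address, topic name, and paths are placeholders.

```python
# Hypothetical sketch: stream a Kafka topic into a Delta table with
# Spark Structured Streaming. Broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

# Read the stream; Azure Event Hubs can be consumed this way through its
# Kafka-compatible endpoint.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "events")
    .load()
    .select(
        col("key").cast("string"),
        col("value").cast("string"),
        col("timestamp"),
    )
)

# Write as an append-only Delta table; the checkpoint location makes the
# pipeline restartable without losing or reprocessing records.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation",
            "abfss://bronze@examplelake.dfs.core.windows.net/_checkpoints/events/")
    .outputMode("append")
    .start("abfss://bronze@examplelake.dfs.core.windows.net/events/")
)
```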

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Job Requirements

  • Mandatory: Azure Data Factory (ADF), PySpark, Databricks, ADLS, Azure SQL Database
  • Optional: Azure Synapse Analytics, Event Hubs and Stream Analytics, Cosmos DB, and Purview
  • Strong programming, unit testing, and debugging skills in SQL, Python, or Scala/Java (a minimal testing sketch follows this list)
  • Some experience using big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4j, and Elasticsearch
  • Good understanding of different file formats like Delta Lake, Avro, Parquet, JSON, and CSV
  • Experience working on Agile projects and following DevOps processes with technologies like Git, Jenkins, and Azure DevOps
  • Good to have:
      • Experience working on data lake and lakehouse projects
      • Experience building REST services and implementing service-oriented architectures
      • Experience supporting BI and Data Science teams in consuming data in a secure and governed manner
      • Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE)
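
To illustrate what unit testing PySpark code can look like in practice, here is a minimal, hypothetical example using pytest and a local Spark session. The deduplicate_latest helper and its column names are invented for this sketch and are not part of the role description.

```python
# Hypothetical unit test for a small PySpark transformation using pytest.
import pytest
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window


def deduplicate_latest(df, key_col, ts_col):
    """Keep only the most recent row per key (illustrative helper)."""
    w = Window.partitionBy(key_col).orderBy(F.col(ts_col).desc())
    return (
        df.withColumn("_rn", F.row_number().over(w))
        .filter(F.col("_rn") == 1)
        .drop("_rn")
    )


@pytest.fixture(scope="module")
def spark():
    # A local Spark session is sufficient for small unit tests.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_deduplicate_latest_keeps_newest_row(spark):
    df = spark.createDataFrame(
        [("a", 1, "old"), ("a", 2, "new"), ("b", 1, "only")],
        ["id", "ts", "payload"],
    )
    result = deduplicate_latest(df, "id", "ts").collect()
    assert {r["id"]: r["payload"] for r in result} == {"a": "new", "b": "only"}
```
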
Benefits

This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.

Top Skills

ADLS
Airflow
Avro
Azure Data Factory
Azure DevOps
Azure SQL Database
Azure Synapse Analytics
Cosmos DB
CSV
Databricks
Delta Lake
Elasticsearch
Event Hubs
Git
Hadoop
Hive
Jenkins
JSON
Kafka
Neo4j
NiFi
Parquet
Purview
PySpark
Spark
Stream Analytics
