Senior Real-time Data Engineer
Accolite
Hyderabad, Telangana, India
3-5 years
Not Disclosed
Full time
05 May 2026
Top Skills:
Apache, API, Architecture, Business Logic, Complex Sale, CRM, Dashboard, Data Capture, Data Governance, Data System, GraphQL, Indexing, JS, Kafka, Kubernetes, Node.js, Pipeline, PostgreSQL, Python, Query Optimization, REST, SaaS, Sale, SQL, Terraform, User Interface

About The Role

We are looking for a Senior Real-time Data Engineer to lead the architecture and development of our customer-facing analytics engine. Our Sales Engagement Platform processes millions of events daily—from email interactions to live call metadata. Your mission is to transform this firehose of raw data into high-performance, actionable insights for our users.

You will bridge the gap between backend data systems and the user interface, building a robust Semantic Layer that ensures our customers see the same "Single Source of Truth" across every dashboard, report, and API.

Key Responsibilities

  • Architecture & Scaling: Design and maintain a low-latency analytics stack capable of handling high-concurrency queries from thousands of simultaneous SaaS users.
  • Real-time Ingestion: Build and optimize ingestion pipelines that move data from transactional databases (PostgreSQL) and event streams (Kafka/Kinesis) into Apache Pinot.
  • Semantic Layer Development: Use Cube (Cube.js) to model complex sales metrics (e.g., "Sequence Conversion Rate," "Attributed Revenue," "Meeting Booked Rate"), ensuring consistency across the application.
  • Performance Engineering: Optimize Apache Pinot tables, indexing strategies, and Cube pre-aggregations to ensure dashboard widgets load in under 300ms.
  • API Strategy: Expose data models via REST/GraphQL APIs, collaborating closely with Frontend Engineers to build world-class data visualizations.
  • Data Governance: Implement multi-tenant security logic within the semantic layer to ensure strict data isolation between different customer accounts.
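To make the multi-tenant isolation responsibility concrete, here is a minimal sketch of how tenant scoping is typically enforced in a Cube (Cube.js) configuration: a `queryRewrite` hook injects a mandatory account filter derived from the security context. The dimension name `Engagements.accountId` and the context shape are illustrative assumptions, not our actual schema.

```javascript
// Hypothetical sketch of tenant isolation in a Cube (Cube.js) config.
// queryRewrite runs on every incoming query and injects a mandatory
// account filter, so one tenant can never read another tenant's rows.
// "Engagements.accountId" is an assumed dimension name for illustration.
function queryRewrite(query, { securityContext }) {
  if (!securityContext || securityContext.accountId == null) {
    // Fail closed: no tenant identity means no data.
    throw new Error('Missing account in security context: rejecting query');
  }
  query.filters = query.filters || [];
  query.filters.push({
    member: 'Engagements.accountId',
    operator: 'equals',
    values: [String(securityContext.accountId)],
  });
  return query;
}

// Example: a dashboard query arriving on behalf of tenant 42
const rewritten = queryRewrite(
  { measures: ['Engagements.count'], filters: [] },
  { securityContext: { accountId: 42 } }
);
console.log(rewritten.filters);
```

Because the filter is appended server-side after authentication, no client-supplied query can opt out of tenant scoping.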

Technical Requirements

  • OLAP Expertise: 3+ years of experience with Apache Pinot (or similar technologies like ClickHouse/StarRocks) in a production environment.
  • Semantic Modeling: Deep experience with Cube (Cube.js), including advanced features like pre-aggregations, security contexts, and multi-tenant configurations.
  • Data Store Mastery: Expert-level knowledge of PostgreSQL, specifically in the context of analytical query optimization and Change Data Capture (CDC).
  • Streaming & Ingestion: Hands-on experience with real-time data movement (Debezium, Kafka, or Flink).
  • Software Craftsmanship: Proficiency in Node.js or Python, with a focus on building scalable backend services.
  • Language: Mastery of complex SQL and the ability to translate business logic into code-based data schemas.
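As a flavor of the CDC work above, here is a small sketch of flattening a Debezium change event (as emitted from PostgreSQL to Kafka) into the flat row shape a real-time Apache Pinot table consumes. The `payload`/`op`/`after`/`ts_ms` keys follow Debezium's standard envelope; the `email_events` column names are illustrative assumptions.

```javascript
// Hypothetical sketch: flatten a Debezium CDC envelope (PostgreSQL -> Kafka)
// into a flat record for real-time ingestion into Apache Pinot.
// "payload", "op", "after", and "ts_ms" are Debezium's standard envelope
// fields; the email_events column names are assumptions for illustration.
function flattenCdcEvent(raw) {
  const envelope = JSON.parse(raw);
  // Some connector configs unwrap the schema, leaving the payload at top level.
  const payload = envelope.payload || envelope;
  if (payload.op === 'd' || payload.after == null) {
    // Deletes/tombstones are handled by a separate upsert/compaction path.
    return null;
  }
  return { ...payload.after, ingestedAtMs: payload.ts_ms };
}

// Example: an UPDATE on email_events arriving on the Kafka topic
const row = flattenCdcEvent(JSON.stringify({
  payload: {
    op: 'u',
    ts_ms: 1714900000000,
    after: { eventId: 7, accountId: 42, type: 'email_open' },
  },
}));
console.log(row);
```

Keeping the transform this thin (flatten, stamp ingestion time, drop deletes) is what lets the pipeline stay low-latency end to end.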

Bonus Points If You Have

  • Experience building analytics specifically for CRM or Sales Tech ecosystems.
  • Contributions to open-source projects (specifically in the Pinot or Cube communities).
  • Experience with Infrastructure as Code (Terraform, Kubernetes) for managing data clusters.