Web App Developer
Cushman & Wakefield
Posted on Naukri
Mumbai
3-6 years
Not Disclosed
Full time
04 May 2026
Top Skills:
Python, TypeScript, JS, React, Redux, FastAPI, Flask, SQL, Spark, Databricks, Azure, Azure DevOps, Docker, Git, GitHub, CI/CD, DevOps, REST APIs, API Design, OpenAPI, Authentication, Access Control, Dependency Injection, State Management, Composition, UI, Front End, Back End, Web Development, AI, AI Models, LLM, Data Governance, Data Quality, Data Lineage, Master Data Management, Enterprise Data Management, Architecture, Technical Documentation, Dashboards, Cytoscape, Cloud, SDK, Pipelines, Tooling, Greenfield, Resource Management, Finance, Real Estate, Logistics




Job Title

Web App Developer

Job Description Summary

Our Enterprise Data team is building a Databricks-native data platform spanning master data management, AI-powered data quality and lineage, and a suite of internal tools used by data stewards, analytics engineers, and business stakeholders across multiple global regions. We are hiring a mid-level Web Application Developer to design, build, and operate the front-end and API layer of these tools with Databricks Apps as the primary deployment target and Unity Catalog as the data authority.


You will work closely with data engineers, platform architects, and business stakeholders to turn complex data workflows (stewardship queues, lineage dashboards, quality monitoring) into polished, production-grade web applications that feel enterprise-ready without sacrificing developer velocity.

Job Description

Key Responsibilities

Application Development

Design and build React + FastAPI web applications deployed as Databricks Apps, authenticated via the Databricks SDK

Develop data stewardship and workflow UIs (review queues, confidence visualisations, approval/rejection flows)

Build interactive dashboards surfacing data lineage, quality scores, and pipeline health metrics from Unity Catalog

Implement REST APIs in FastAPI, integrating with Databricks SQL Connector, Delta tables, and AI Model Serving endpoints
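The approval/rejection flows behind these stewardship UIs reduce to a small state machine. A minimal stdlib-only sketch (all names hypothetical, not the team's actual model):

```python
from dataclasses import dataclass
from enum import Enum


class ReviewStatus(Enum):
    PENDING = "pending"
    APPROVED = "approved"
    REJECTED = "rejected"


# Transitions a steward may trigger from each state;
# rejected items can be re-queued for another pass.
ALLOWED = {
    ReviewStatus.PENDING: {ReviewStatus.APPROVED, ReviewStatus.REJECTED},
    ReviewStatus.APPROVED: set(),
    ReviewStatus.REJECTED: {ReviewStatus.PENDING},
}


@dataclass
class ReviewItem:
    record_id: str
    confidence: float  # model confidence score surfaced in the UI
    status: ReviewStatus = ReviewStatus.PENDING

    def transition(self, target: ReviewStatus) -> None:
        if target not in ALLOWED[self.status]:
            raise ValueError(f"cannot move {self.status.value} -> {target.value}")
        self.status = target


item = ReviewItem(record_id="cust-001", confidence=0.72)
item.transition(ReviewStatus.APPROVED)
```

Encoding the allowed transitions in one table keeps the API layer and the UI in agreement about which buttons to enable for a given record.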

Platform Integration

Integrate front-end applications with Unity Catalog through the Databricks SDK and REST APIs (table discovery, access control, lineage graphs)

Consume AI-generated outputs (narrative summaries, quality assessments) from model serving endpoints and surface them in the UI

Author and maintain Databricks Asset Bundles (DABs) configurations for CI/CD deployment across DEV, INT, UAT, and PROD environments

Collaborate with platform engineers to ensure apps are stateless, containerised, and scalable within the Databricks workspace security perimeter
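A Databricks Asset Bundle targets block of the kind described above might look roughly like this (bundle name and workspace hosts are placeholders, not the team's real values):

```yaml
# databricks.yml - sketch only
bundle:
  name: stewardship-web-app

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://adb-dev.azuredatabricks.net
  int:
    workspace:
      host: https://adb-int.azuredatabricks.net
  uat:
    workspace:
      host: https://adb-uat.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-prod.azuredatabricks.net
```

One bundle definition with per-environment targets is what lets the same CI/CD pipeline promote an app through DEV, INT, UAT, and PROD.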

Quality Collaboration

Write unit and integration tests; maintain >80% coverage on critical paths

Participate in design reviews, PR reviews, and architecture discussions with globally distributed team members

Produce clear technical documentation (API contracts, component libraries, deployment runbooks) suitable for handover to platform operations

Contribute to and uphold front-end coding standards, TypeScript typing discipline, and accessibility guidelines
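Unit tests of the kind counted toward the >80% coverage target can be plain `unittest` cases over small pure functions; `quality_band` below is a hypothetical helper, not a real project function:

```python
import unittest


def quality_band(score: float) -> str:
    """Map a 0-1 data-quality score to the band shown in a dashboard UI."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be between 0 and 1")
    if score >= 0.9:
        return "healthy"
    if score >= 0.6:
        return "warning"
    return "failing"


class QualityBandTest(unittest.TestCase):
    def test_band_boundaries(self):
        # Exercise each boundary explicitly - boundary bugs are the common case.
        self.assertEqual(quality_band(0.9), "healthy")
        self.assertEqual(quality_band(0.6), "warning")
        self.assertEqual(quality_band(0.59), "failing")

    def test_rejects_out_of_range(self):
        with self.assertRaises(ValueError):
            quality_band(1.5)
```

Run with `python -m unittest`; pairing every branch of the function with an assertion is what keeps critical-path coverage honest.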

Required Skills & Experience

Core Web Development

3-6 years of professional experience building web applications in a TypeScript / React environment

Proficiency with React hooks, component composition, state management (Redux Toolkit or Zustand), and modern build tooling (Vite / ESBuild)

Python back-end experience with FastAPI or Flask (REST API design, async handlers, dependency injection)

Solid understanding of RESTful API design and OpenAPI / Swagger documentation
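The dependency-injection pattern mentioned above can be illustrated in a few lines of stdlib Python; this is a conceptual sketch of the idea, not FastAPI's actual `Depends` implementation:

```python
import inspect
from typing import Any, Callable


class Depends:
    """Marker declaring that a parameter should be supplied by a provider."""

    def __init__(self, provider: Callable[[], Any]):
        self.provider = provider


def resolve(handler: Callable[..., Any], **overrides: Any) -> Any:
    """Call handler, filling any Depends-defaulted parameters from providers."""
    kwargs = dict(overrides)
    for name, param in inspect.signature(handler).parameters.items():
        if name not in kwargs and isinstance(param.default, Depends):
            kwargs[name] = param.default.provider()
    return handler(**kwargs)


def get_warehouse() -> str:
    return "sql-warehouse-small"  # placeholder for a real connection factory


def list_tables(catalog: str, warehouse: str = Depends(get_warehouse)) -> str:
    # A handler only declares *what* it needs; resolve() decides *how* to get it.
    return f"SELECT * FROM {catalog} via {warehouse}"


result = resolve(list_tables, catalog="main")
```

Keeping resource acquisition out of the handler is what makes endpoints easy to unit-test: a test simply passes its own `warehouse=` override.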

Databricks Data Platform

Hands-on experience building or operating Databricks Apps (SDK authentication, workspace resource management, app lifecycle)

Working knowledge of Unity Catalog schemas, tables, volumes, grants, and lineage APIs

Familiarity with Delta Lake table formats, Spark SQL, and the DBSQL Connector for Python

Exposure to Databricks Asset Bundles (DABs) for CI/CD and multi-environment deployment patterns
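Lineage APIs conceptually return edges between tables; rendering "everything upstream of this table" in a dashboard is then a graph traversal. A stdlib sketch over hypothetical table names:

```python
from collections import deque

# Hypothetical edges: each table mapped to the tables it reads from -
# the rough shape of data a lineage API returns.
UPSTREAM = {
    "gold.customer_360": ["silver.customers", "silver.orders"],
    "silver.customers": ["bronze.crm_raw"],
    "silver.orders": ["bronze.pos_raw"],
}


def all_upstream(table: str) -> set[str]:
    """Breadth-first walk collecting every transitive upstream dependency."""
    seen: set[str] = set()
    queue = deque(UPSTREAM.get(table, []))
    while queue:
        current = queue.popleft()
        if current in seen:
            continue
        seen.add(current)
        queue.extend(UPSTREAM.get(current, []))
    return seen
```

The `seen` set guards against cycles, which real lineage graphs can contain once views and temp tables enter the picture.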

Enterprise Tooling Practices

Experience working within enterprise DevOps pipelines (Azure DevOps or GitHub Actions preferred)

Comfortable with containerisation (Docker) and cloud-hosted deployment (Azure preferred)

Proficiency with Git, pull-request workflows, and semantic versioning

Ability to read and write basic SQL; comfort querying Databricks SQL warehouses from application code
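"Querying a SQL warehouse from application code" mostly means disciplined parameterized queries. The sketch below uses stdlib `sqlite3` as a stand-in for the Databricks SQL Connector (table and column names are invented), since the binding pattern is the same:

```python
import sqlite3

# sqlite3 stands in for the Databricks SQL Connector; the
# parameterized-query discipline carries over unchanged.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quality_checks (table_name TEXT, score REAL)")
conn.executemany(
    "INSERT INTO quality_checks VALUES (?, ?)",
    [("silver.customers", 0.93), ("silver.orders", 0.55)],
)

# Always bind values as parameters - never interpolate them into the SQL string.
threshold = 0.6
rows = conn.execute(
    "SELECT table_name, score FROM quality_checks WHERE score < ? ORDER BY score",
    (threshold,),
).fetchall()
```

Parameter binding is both the injection-safety story and what lets the warehouse cache query plans across requests.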

Desirable Skills

Experience integrating with LLM / Generative AI APIs (Anthropic Claude, OpenAI, or cloud-hosted models via Databricks Model Serving)

Familiarity with data quality tooling (Great Expectations, dbt tests, or custom DQ frameworks on Delta)

Experience building MDM, data stewardship, or data governance UIs

Azure cloud certifications (AZ-900 / AZ-204) or Databricks Associate / Professional credentials

Exposure to graph visualisation libraries (D3.js, Cytoscape.js) for lineage and dependency mapping

Familiarity with geospatial libraries or mapping toolkits (Deck.gl, Mapbox, H3)

Domain knowledge in any data-intensive industry (finance, real estate, logistics, retail, or similar)

What We Offer

Meaningful ownership of a greenfield platform used by data professionals across a large global enterprise

Visibility to senior data and technology leadership; your work informs strategic decisions

Opportunity to build at the intersection of enterprise data management and applied AI / agentic workflows

Competitive compensation benchmarked to the India technology talent market; performance-linked variable pay

Flexible remote-first working within India, with a collaborative team spanning multiple global regions

Learning and certification budget for Databricks, Azure, and related platforms

INCO: Cushman & Wakefield