Gcp Data Engineer
Tata Consultancy Services
Posted on Naukri
Chennai
6-11 years
Not Disclosed
Full time
03 May 2026
Top Skills:
Pub/Sub, BigQuery, Dataproc, Dataflow, GCP Cloud, Access Control, Agile, Airflow, Apache, CI/CD, CI/CD Pipeline, Cloud, Cloud Storage, Cost Optimization, Data Lake, Data Modeling, Data Pipeline, Data Processing, Data Quality, Data Security, Data Warehouse, Data Warehousing, Docker, ETL, GCP, Git, Google Cloud Platform, Governance, Kubernetes, NoSQL Database, Pipeline, PySpark, Python, Scala, Spark, SQL, Stakeholder Management

Job Description

TCS Scheduled Face-to-Face Drive | Chennai


GCP Data Engineer

Company: Tata Consultancy Services (TCS)
Experience: 6 - 12 Years
Interview Mode: In-Person (Face-to-Face)
Interview Date: Saturday, 9th May
Location:
Taj Wellington Mews, Chennai
TRIL Infopark Limited, Rajiv Gandhi IT Expy,
Near Ramanujan Intellion Park, Tharamani,
Chennai 600113


Job Summary

We are seeking a skilled GCP Data Engineer to design, build, and maintain scalable data pipelines and analytics solutions on Google Cloud Platform. The ideal candidate will have strong experience working with large datasets, cloud-native data services, and modern data engineering practices.

Key Responsibilities

  • Design, develop, and maintain data pipelines on GCP
  • Build and optimize ETL/ELT workflows using Cloud Dataflow, Dataproc, or Cloud Composer
  • Develop solutions using BigQuery for data warehousing and analytics
  • Ingest data from multiple sources using Pub/Sub, Cloud Storage, and APIs
  • Ensure data quality, reliability, and performance
  • Implement data security, governance, and access controls
  • Collaborate with data scientists, analysts, and product teams
  • Monitor and troubleshoot pipeline failures and performance issues
  • Follow Agile development and CI/CD best practices
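For candidates preparing for the drive, the pipeline and data-quality responsibilities above can be sketched in miniature. This is plain Python with no GCP dependencies; the field names (`user_id`, `amount`) and the dead-letter split are illustrative assumptions, not part of the role description:

```python
from datetime import datetime, timezone

def validate_event(event: dict) -> bool:
    """Basic data-quality gate: required fields present and well-typed."""
    return (
        isinstance(event.get("user_id"), str)
        and event.get("user_id") != ""
        and isinstance(event.get("amount"), (int, float))
        and event["amount"] >= 0
    )

def transform(event: dict) -> dict:
    """Light ELT-style enrichment: stamp each valid row with a load time."""
    return {**event, "loaded_at": datetime.now(timezone.utc).isoformat()}

def run_batch(events: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into clean rows and a dead-letter list for inspection,
    mirroring the validate-then-load pattern common in Dataflow/Beam jobs."""
    good, bad = [], []
    for event in events:
        (good if validate_event(event) else bad).append(event)
    return [transform(event) for event in good], bad
```

In a real Dataflow or Composer job the same validate/transform/dead-letter steps would run inside Beam `ParDo`s or Airflow tasks rather than a plain loop.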

Required Skills & Qualifications

GCP & Data Engineering

  • Strong hands-on experience with Google Cloud Platform
  • Expertise in BigQuery
  • Experience with Cloud Dataflow (Apache Beam) and/or Dataproc (Spark)
  • Knowledge of Pub/Sub, Cloud Storage, and Cloud Composer (Airflow)
  • Strong understanding of data modeling, schema design, and partitioning
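As a toy illustration of the partitioning point above, the sketch below mimics BigQuery's daily date partitioning (partition ids of the form `YYYYMMDD`) in plain Python. The column name `event_ts` is a hypothetical example; this does not call any Google Cloud API:

```python
from collections import defaultdict
from datetime import datetime

def partition_id(ts: str) -> str:
    """Map an ISO-8601 timestamp to a daily partition id (YYYYMMDD),
    the scheme BigQuery uses for date-partitioned tables."""
    return datetime.fromisoformat(ts).strftime("%Y%m%d")

def bucket_rows(rows: list[dict], ts_field: str = "event_ts") -> dict[str, list[dict]]:
    """Group rows by partition so each day's data can be loaded,
    expired, or pruned at query time independently."""
    parts: dict[str, list[dict]] = defaultdict(list)
    for row in rows:
        parts[partition_id(row[ts_field])].append(row)
    return dict(parts)
```

Choosing the partition column well (and clustering within partitions) is what keeps BigQuery scan costs proportional to the date range queried rather than the whole table.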

Programming & Tools

  • Proficiency in Python and/or SQL
  • Experience with Spark, PySpark, or Scala (preferred)
  • Hands-on experience with ETL/ELT tools
  • Experience with Git and CI/CD pipelines

Databases

  • Experience with SQL and NoSQL databases
  • Knowledge of data lakes and data warehouses

Good to Have

  • GCP certifications (e.g., Professional Data Engineer)
  • Experience with Kubernetes and Docker
  • Knowledge of ML pipelines and feature engineering
  • Exposure to real-time/streaming data processing
  • Experience in cost optimization on GCP

Soft Skills

  • Strong analytical and problem-solving skills
  • Good communication and stakeholder management abilities
  • Ability to work independently and in a team environment
  • Proactive mindset and attention to detail