Overall Experience - 5+ years
Location - Chennai
Interview Mode - Walk-in
Interview Date - 9-May-26
Key Responsibilities
- Design, develop, and deploy data pipelines and ETL processes using GCP services such as BigQuery, Dataflow, Dataproc, and Pub/Sub.
- Implement data modeling, data integration, and data warehousing solutions to support analytics and business intelligence initiatives.
- Optimize and monitor data workflows for performance, reliability, and cost-effectiveness.
- Collaborate with cross-functional teams to define data requirements and deliver end-to-end solutions.
- Ensure data security, privacy, and compliance with relevant policies and standards.
- Troubleshoot and resolve issues related to data processing and cloud infrastructure.
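The pipeline work described above typically follows an extract-transform-load shape. Below is a minimal, hypothetical sketch of that shape in plain Python: in a real deployment the extract step would be a Pub/Sub subscription and the load step a BigQuery insert (e.g., via Apache Beam on Dataflow), but plain functions stand in here so the transform logic stays self-contained and testable. All record fields and names are illustrative, not from the posting.

```python
import json

# Hypothetical ETL sketch. In production, extract() would read from a
# Pub/Sub subscription and load() would write to a BigQuery table;
# plain Python stands in for both here.

def extract(raw_messages):
    """Parse raw JSON messages, skipping any that fail to decode."""
    for msg in raw_messages:
        try:
            yield json.loads(msg)
        except json.JSONDecodeError:
            continue  # a real pipeline would route these to a dead-letter sink

def transform(events):
    """Keep completed orders and normalize the amount to integer cents."""
    for event in events:
        if event.get("status") == "completed":
            yield {"order_id": event["order_id"],
                   "amount_cents": round(event["amount"] * 100)}

def load(rows, sink):
    """Append transformed rows to the sink (stand-in for a warehouse insert)."""
    sink.extend(rows)
    return sink

raw = ['{"order_id": 1, "status": "completed", "amount": 19.99}',
       '{"order_id": 2, "status": "pending", "amount": 5.00}',
       'not json']
table = load(transform(extract(raw)), [])
print(table)  # [{'order_id': 1, 'amount_cents': 1999}]
```

Monitoring, cost control, and dead-letter handling (the optimization and troubleshooting bullets) would wrap around these stages in a managed runner such as Dataflow.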
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Proven experience working with GCP data services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, etc.).
- Strong proficiency in SQL, Python, and data processing frameworks such as Apache Beam or Spark.
- Experience with data modeling, ETL development, and data warehousing concepts.
- Familiarity with cloud security best practices and data governance.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- GCP Professional Data Engineer certification is a plus.
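To give a concrete feel for the SQL-plus-Python proficiency the qualifications list, here is a small, runnable stand-in for a typical warehouse aggregation. BigQuery Standard SQL is very similar for this query; sqlite3 is used only to keep the example self-contained, and the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical analytics query: revenue per region, largest first.
# In BigQuery this would run against a warehouse table; an in-memory
# SQLite database stands in here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("south", 100.0), ("south", 50.0), ("north", 75.0)])

rows = conn.execute("""
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('south', 150.0), ('north', 75.0)]
```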