Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Node.js on GCP
- Build and optimize ETL/ELT workflows for batch and streaming data
- Work with BigQuery, Cloud Storage, Pub/Sub, Dataflow, and Cloud Functions
- Develop RESTful and event-driven services in Node.js for data ingestion and processing (see the sketch after this list)
- Ensure data quality, reliability, and performance across pipelines
- Optimize SQL queries and data models for analytics and reporting
- Implement CI/CD pipelines and follow DevOps best practices
- Monitor, troubleshoot, and tune production data systems
- Collaborate with product owners, analysts, and application teams
- Ensure compliance with security, governance, and privacy standards
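Several of these responsibilities center on Pub/Sub-driven ingestion into BigQuery. The Node.js snippet below is a minimal sketch of that pattern, not a prescribed implementation: it consumes messages from a Pub/Sub subscription and streams rows into BigQuery. The subscription, dataset, and table names are hypothetical placeholders.

```js
// A minimal sketch of Pub/Sub-to-BigQuery ingestion in Node.js.
// Subscription, dataset, and table names are hypothetical placeholders.
const { PubSub } = require('@google-cloud/pubsub');
const { BigQuery } = require('@google-cloud/bigquery');

const pubsub = new PubSub();
const bigquery = new BigQuery();

const subscription = pubsub.subscription('events-sub');

subscription.on('message', async (message) => {
  try {
    // Each message is assumed to carry one JSON-encoded event.
    const row = JSON.parse(message.data.toString());
    // Streaming insert into the target table.
    await bigquery.dataset('analytics').table('events').insert([row]);
    message.ack();
  } catch (err) {
    // Nack so Pub/Sub redelivers (or routes to a dead-letter topic).
    console.error('Ingestion failed', err);
    message.nack();
  }
});

subscription.on('error', (err) => console.error('Subscription error', err));
```

At higher volumes, a Dataflow pipeline or batched load jobs may be preferable to per-message streaming inserts for cost and throughput reasons.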
Technical Skills (Must Have)
- Proven experience in data engineering
- Strong proficiency in Node.js (async programming, performance optimization)
- Hands-on experience with Google Cloud Platform, including:
  - BigQuery
  - Cloud Storage
  - Pub/Sub
  - Dataflow
  - Cloud Functions / Cloud Run
- Strong SQL skills and experience with large datasets (see the BigQuery sketch after this list)
- Experience designing data models for analytics and reporting
- Familiarity with REST APIs and event-driven architectures
- Experience with Git, CI/CD, and Agile methodologies
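As an illustration of the SQL and BigQuery skills listed above, the sketch below runs a parameterized query from Node.js. It assumes a date-partitioned events table; the project, dataset, table, and column names are all hypothetical.

```js
// A minimal sketch of a parameterized BigQuery query from Node.js.
// Project, dataset, table, and column names are hypothetical.
const { BigQuery } = require('@google-cloud/bigquery');

const bigquery = new BigQuery();

async function dailyEventCounts(startDate) {
  const query = `
    SELECT event_type, COUNT(*) AS events
    FROM \`my-project.analytics.events\`
    -- Filtering on the partitioning column limits the bytes scanned,
    -- assuming the table is date-partitioned on event_date.
    WHERE event_date >= DATE(@start_date)
    GROUP BY event_type
    ORDER BY events DESC`;
  const [rows] = await bigquery.query({
    query,
    params: { start_date: startDate }, // named parameter, passed as a string
  });
  return rows;
}

dailyEventCounts('2024-01-01')
  .then((rows) => console.table(rows))
  .catch(console.error);
```

Named parameters keep the query reusable across reporting jobs and guard against SQL injection.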