ETL Developer with PySpark
Logic Planet
Posted on foundit
Bengaluru / Bangalore, India
1-5 years
4L-8L
Full time
04 May 2026
Top Skills:
Python, Oracle, PySpark, Data Warehousing, Data Pipelines, SQL Server, PL/SQL, Data Processing, Continuous Improvement, Data Architecture, Data Integration, Data Quality, Data Systems, ETL Processes



Job Description

Key Responsibilities:

  • Design, develop, and implement data integration solutions using PySpark.
  • Collaborate with cross-functional teams to gather and prioritize requirements.
  • Develop and maintain large-scale data pipelines and data architectures.
  • Troubleshoot and resolve data processing and integration issues.
  • Ensure data quality, integrity, and validation through testing procedures.
  • Optimize data workflows for performance and scalability.
  • Support continuous improvement of ETL processes and data systems.
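
The responsibilities above follow a common extract–validate–transform shape. The sketch below illustrates that shape in plain Python with the standard library only; all names (`Order`, `extract`, `validate`, `transform`) and the quality rules are illustrative, not part of the posting. In the role itself the same pattern would typically be expressed with PySpark DataFrame operations (`filter`, `withColumn`, `groupBy`, `write`).

```python
# Illustrative ETL sketch: parse raw records, gate them on data-quality
# rules, then aggregate the rows that pass. Names and rules are
# hypothetical examples, not taken from the job posting.

from dataclasses import dataclass

@dataclass
class Order:
    order_id: int
    amount: float
    region: str

def extract(rows):
    """Parse raw dict records into typed Order rows."""
    return [Order(int(r["order_id"]), float(r["amount"]), r["region"])
            for r in rows]

def validate(orders):
    """Split rows into (good, rejected) using simple quality rules."""
    good, rejected = [], []
    for o in orders:
        if o.amount > 0 and o.region:
            good.append(o)
        else:
            rejected.append(o)
    return good, rejected

def transform(orders):
    """Aggregate amount by region -- the 'load-ready' output."""
    totals = {}
    for o in orders:
        totals[o.region] = totals.get(o.region, 0.0) + o.amount
    return totals

raw = [
    {"order_id": 1, "amount": 120.0, "region": "south"},
    {"order_id": 2, "amount": -5.0, "region": "south"},  # fails quality gate
    {"order_id": 3, "amount": 80.0, "region": "north"},
]
good, rejected = validate(extract(raw))
print(transform(good))  # {'south': 120.0, 'north': 80.0}
print(len(rejected))    # 1
```

Keeping the rejected rows (rather than silently dropping them) mirrors the quarantine pattern used in production pipelines, where bad records are written to a side table for inspection.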