Consultant Data Engineer
Tools & Technology: Snowflake, SnowSQL, AWS, DBT, Snowpark, Airflow, DWH, Unix, SQL, Shell Scripting, PySpark, Git, Visual Studio, ServiceNow
Understand business requirements and design, develop & maintain scalable, automated data pipelines & ETL processes to ensure efficient data processing and storage.
Create a robust, extensible architecture to meet the client/business requirements.
Develop Snowflake objects integrated with AWS services and DBT.
Build different types of data ingestion pipelines as per requirements.
Develop data transformations in DBT (Data Build Tool) as per requirements; work on integrating multiple AWS services with Snowflake.
Integrate structured & semi-structured data sets.
Work on performance tuning and cost optimization.
Implement CDC and SCD Type 2 patterns.
Design and build solutions for near real-time streaming as well as batch processing.
Implement best practices for data management, data quality, and data governance.
Responsible for data collection, data cleaning & pre-processing using Snowflake and DBT.
Investigate production issues and fine-tune data pipelines.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery.
Coordinate with and support software developers, database architects, data analysts, and data scientists on data initiatives.
Orchestrate pipelines using Airflow.
Suggest improvements to processes, products, and services.
Interact with users, management, and technical personnel to clarify business issues, identify problems, and suggest changes/solutions to business and developers.
Create technical documentation on Confluence to aid knowledge sharing.
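The SCD Type 2 work mentioned above can be sketched in plain Python (a minimal illustration of the history-keeping pattern only; the production implementation would use Snowflake MERGE statements or DBT snapshots, and the record layout here is hypothetical):

```python
from datetime import date

def scd2_apply(dim_rows, incoming, load_date):
    """Apply incoming source records to a Type 2 dimension table.

    dim_rows: list of dicts {id, attrs, valid_from, valid_to, is_current}
    incoming: list of dicts {id, attrs}
    A changed record expires the current version and opens a new one;
    a brand-new id is simply inserted as the current version.
    """
    result = list(dim_rows)
    current = {r["id"]: r for r in result if r["is_current"]}
    for rec in incoming:
        old = current.get(rec["id"])
        if old is not None and old["attrs"] == rec["attrs"]:
            continue  # no change: keep the current version open
        if old is not None:
            old["valid_to"] = load_date  # expire the old version
            old["is_current"] = False
        result.append({
            "id": rec["id"], "attrs": rec["attrs"],
            "valid_from": load_date, "valid_to": None, "is_current": True,
        })
    return result

# Day 1: initial load; Day 2: customer 1 changes city
dim = scd2_apply([], [{"id": 1, "attrs": {"city": "Pune"}}], date(2023, 1, 1))
dim = scd2_apply(dim, [{"id": 1, "attrs": {"city": "Mumbai"}}], date(2023, 2, 1))
history = [r for r in dim if r["id"] == 1]
```

After the second load, the dimension holds two versions of the record: the expired Pune row and the current Mumbai row.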
Associate Data Engineer
Tools & Technology: Snowflake, DBT, AWS, Airflow, ETL, Data Warehouse, Shell Scripting, SQL, Git, Confluence, Python
• Duties and Responsibilities
Act as offshore data engineer handling enhancements & testing.
Design and build solutions for near real-time stream processing as well as batch processing.
Develop Snowflake objects, implementing their unique features.
Implement data integration and transformation workflows using DBT.
Integrate AWS services with Snowflake.
Participate in implementation planning and respond to production issues.
Responsible for data collection, data cleaning & pre-processing.
Develop UDFs, Snowflake procedures, Streams, and Tasks.
Troubleshoot customer data issues: manually load any missed data, check for and handle data duplication, and perform RCA.
Investigate production job failures through to root-cause analysis (RCA).
Develop ETL processes and data integration solutions.
Understand the client's business needs and provide technical solutions.
Monitor the overall functioning of processes, identify improvement areas, and implement fixes with the help of scripting.
Handle major outages effectively, with clear communication to business, users & development partners.
Define and create runbook entries and knowledge articles based on incidents experienced in production.
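The duplication checks described above can be illustrated with a small Python sketch (a simplified stand-in for the actual checks, which would run as SQL against Snowflake; the field names are hypothetical):

```python
from collections import Counter

def find_duplicates(rows, key_fields):
    """Return business keys that appear more than once, with counts —
    the kind of check run after a manual reload to catch double loads."""
    keys = [tuple(row[f] for f in key_fields) for row in rows]
    return {k: c for k, c in Counter(keys).items() if c > 1}

batch = [
    {"order_id": 101, "line": 1},
    {"order_id": 101, "line": 1},  # accidental double load
    {"order_id": 102, "line": 1},
]
dupes = find_duplicates(batch, ["order_id", "line"])
```

A non-empty result flags keys needing de-duplication and feeds the RCA on how the duplicate rows were loaded.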
Associate Engineer
• Tools and Technology: UNIX, Oracle, Shell Scripting, ETL, Hadoop, Spark, Sqoop, Hive, Control-M, Tectia, SQL, Jira, HDFS, Snowflake, DBT, AWS
• Duties and Responsibilities
Worked as a Senior Production/Application Support Engineer.
Work as a production support member for loading, processing, and reporting of files, and for generating reports.
Monitor multiple batches, jobs, and processes; analyze job failures; and handle FTP and connectivity issues behind batch/job failures.
Perform data analysis on files, generate output files, and send them to the destination server depending on the job's functionality.
Create shell scripts to automate daily tasks or tasks requested by the service owner.
Tune jobs to improve performance and perform daily checks.
Coordinate with Middleware, DWH, CRM, and other teams in case of any issue or CRQ.
Monitor the overall functioning of processes, identify improvement areas, and implement fixes with the help of scripting.
Tune jobs to improve performance and raise PBIs after approval from the service owner.
Carry out performance-improvement and automation activities to decrease manual workload.
Ingest data from RDBMS systems to HDFS/Hive using Sqoop.
Understand customer problems and provide appropriate technical solutions.
Handle major outages effectively, with clear communication to business, users & development partners.
Coordinate with the client and on-site team, joining bridge calls for any issues.
Handle daily issues related to application and job performance.
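The RDBMS-to-HDFS ingestion mentioned above can be sketched as the assembly of a Sqoop import command (the connection URL, table, and user are placeholders, not the actual job's values):

```python
def sqoop_import_cmd(jdbc_url, table, target_dir, username, num_mappers=4):
    """Assemble a Sqoop 1 import command (RDBMS -> HDFS) as an argv list,
    ready to hand to a scheduler or subprocess call."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", username,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
    ]

cmd = sqoop_import_cmd(
    "jdbc:oracle:thin:@//db-host:1521/ORCL",  # hypothetical host/SID
    "ORDERS", "/data/raw/orders", "etl_user",
)
```

Building the command as a list (rather than one shell string) keeps arguments safely separated when the job is launched from a wrapper script.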
