
Job Description

Seeking a motivated Data Ingestion and Integration Engineer to join our team as an individual contributor. In this role, the engineer will primarily be responsible for supporting, designing, building, and optimizing automated data pipelines to meet data technology integration and ingestion needs.

Qualifications:

Position Objective:
Informatica platform administration, development & production support. Review & analyze business data requirements, working with IT and business stakeholders on data warehousing, data integration, and reporting projects. Prepare technical ETL design documents; develop & support ETL processes using Informatica PowerCenter, Informatica Cloud, and Syncsort DMX-h (Precisely); and provide 24/7 production support with on-call coverage.
 
Competencies/Skills: Individual Contributor Competencies
 
Proficient in the following:
Knowledge of any ETL tool (e.g., Precisely/Informatica)
Data lake/Databricks concepts
PySpark
SQL/Unix (basics)
Strong understanding of Lakehouse concepts (e.g., Delta format, workflow creation), as illustrated in the sketch after this list
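
As a rough illustration of the Lakehouse/PySpark skills listed above, the sketch below appends one raw batch file to a Delta table. It is a minimal, hedged example: the Spark session configuration, file paths, and table location are assumptions for illustration, not details taken from this posting.

```python
# Minimal sketch, assuming the delta-spark package is available; paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("lakehouse-ingest-sketch")
    # Delta Lake extensions; preconfigured on Databricks, explicit elsewhere.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read one raw batch file, stamp it with an ingestion time, and append it
# to a bronze-layer Delta table.
raw = spark.read.option("header", "true").csv("/landing/orders/2024-06-01.csv")
(
    raw.withColumn("ingested_at", F.current_timestamp())
       .write.format("delta")
       .mode("append")
       .save("/lakehouse/bronze/orders")   # hypothetical table location
)
```

On Databricks the Delta extensions are preconfigured, so the two .config(...) lines can usually be dropped.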
 
Knowledge:
Strong knowledge of ETL concepts, especially as applied to Informatica and Informatica Cloud.
Strong knowledge of data analysis, conceptual data modeling, data transformation, & metadata management.
Strong knowledge of data design for various applications.
Strong knowledge of SQL.
Good knowledge of various RDBMS and Mainframe platforms.
Good knowledge of project life cycle methodologies.
Strong verbal & written communication skills.
Strong knowledge of Unix & Unix shell scripting.
Experience with Enterprise Scheduling tools such as Control-M is a plus.
 
Job responsibilities:
Provide Production Support, on call support for all ETL platforms.
Review & understand business data related ETL requirements for database, application and reporting projects.
Develop & make enhancements to new & existing data systems.
Participate in the development plans & resource estimation for task planning.
Assist with the development of logical data models.
Monitor production jobs & resolve any issues in a timely manner.
Partner with other IT areas in resolving issues & improving processes.
Develop Informatica or ETL Workflows.
Develop ETL code both in Informatica & at the shell script level.
Design, build, and maintain automated data pipelines that ensure smooth data ingestion and integration from various sources.
Develop scalable and efficient solutions for data ingestion, using best practices in ETL/ELT processes.
Implement data validation and quality checks to ensure data integrity throughout the pipeline (see the sketch after this list).
Optimize data pipelines to handle large volumes of data in real-time and batch processing environments.
Collaborate with data engineers, analysts, and other stakeholders to understand data requirements and deliver seamless solutions.
Monitor pipeline performance, troubleshoot issues, and continuously improve pipeline efficiency.
Document pipeline processes, configuration, and maintenance routines to facilitate knowledge sharing and onboarding.
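
A minimal sketch of the kind of data validation and quality checks listed above, written in PySpark. The column names, the 1% null-rate threshold, and the choice to fail the run by raising an exception are illustrative assumptions rather than requirements from this posting.

```python
# Minimal sketch of batch-level quality checks; column names and thresholds are assumptions.
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def check_ingest_quality(df: DataFrame, key_col: str = "order_id") -> None:
    """Fail the pipeline run if basic integrity checks do not pass."""
    total = df.count()
    if total == 0:
        raise ValueError("Quality check failed: empty batch")

    # The key column should be non-null and unique.
    nulls = df.filter(F.col(key_col).isNull()).count()
    dupes = total - df.select(key_col).distinct().count()
    if nulls > 0 or dupes > 0:
        raise ValueError(
            f"Quality check failed: {nulls} null and {dupes} duplicate {key_col} values"
        )

    # Reject the batch when more than 1% of amounts are missing (hypothetical rule).
    missing_amounts = df.filter(F.col("amount").isNull()).count()
    if missing_amounts / total > 0.01:
        raise ValueError("Quality check failed: amount null-rate above 1%")
```

In practice a check like this would run as a pipeline stage between ingestion and publication, so bad batches never reach downstream tables.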
 


Location:

This position can be based in any of the following locations:


Chennai, Gurgaon

Current Guardian Colleagues: Please apply through the internal Jobs Hub in Workday

