
Job Description

Summary role description:


Hiring a Snowflake and ADF Architect for a top-tier global Systems Integration / IT Services major.


Company description:


Our client is a top-tier global Systems Integration, IT Services, Consulting and Digital Solutions company that helps hundreds of customers secure competitive advantage through technology. Their comprehensive Digital Transformation platform drives and accelerates the Mobile, Analytics and AI/ML, IoT / Industry 4.0, Cloud, and Social journeys of their customers.


Role details:


  • Title / Designation: Snowflake and ADF Architect
  • Location: Chennai, Bengaluru, Hyderabad, Coimbatore, Mumbai, Pune, New Delhi, Kolkata, Bhubaneswar
  • Work Mode: Hybrid

Role & responsibilities:


  • Lead the design, development, programming, testing, deployment, and maintenance of data quality procedures within Snowflake and ADF environments.
  • Collaborate with stakeholders to ensure the architecture addresses data management and quality standards in line with client requirements.
  • Build and implement Data Quality (DQ) rules, standardization, and validation processes to enhance data accuracy, such as phone and customer data validation.
  • Assess current installations, domain configurations, and provide recommendations for best practices.
  • Profile source data, define or validate metadata, and cleanse and standardize data sources.
  • Serve as the single point of contact for clients throughout engagements, from pre-sales to post-delivery, ensuring smooth communication and project alignment.
  • Supervise and mentor junior to mid-level team members, providing guidance and training as needed.
  • Create and present regular reports to communicate project status to both internal and external stakeholders.
  • Drive end-to-end DQ assessment, identifying key challenges, gaps, and issues in the data quality ecosystem.
  • Define the data profiling process, review profiling reports, propose DQ rules, and finalize remediation recommendations after stakeholder consultations.
  • Own the Data Quality improvement and monitoring strategy, ensuring its implementation and proper documentation.
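To illustrate the kind of Data Quality rule described above (the phone-number pattern and function names here are hypothetical; actual rules would be defined against client requirements), a minimal validation and profiling sketch in Python might look like:

```python
import re

# Hypothetical DQ rule: accept 10-14 digit numbers, optional leading "+".
PHONE_PATTERN = re.compile(r"^\+?\d{10,14}$")

def validate_phone(raw: str) -> bool:
    """Return True if the phone value passes the DQ rule."""
    # Standardize first: strip spaces, hyphens, and parentheses.
    digits = re.sub(r"[\s\-()]", "", raw or "")
    return bool(PHONE_PATTERN.match(digits))

def profile_phones(records) -> dict:
    """Profile a batch of records: count valid vs. invalid phone values."""
    valid = sum(1 for r in records if validate_phone(r.get("phone", "")))
    return {"valid": valid, "invalid": len(records) - valid}
```

In practice such rules would typically run at scale as PySpark UDFs or Snowflake SQL checks rather than in plain Python, with profiling results feeding the remediation recommendations mentioned above.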

Candidate requirements:


  • 12+ years of experience in data quality and data integration solutions, with at least 1 year as an architect.
  • Proven experience in leading Data Quality and Data Integration projects, specifically within Snowflake and Azure Data Factory (ADF).
  • Experience working with Informatica, Talend, and bespoke data quality solutions.
  • Strong knowledge of Python and PySpark, with hands-on expertise in building custom DQ solutions.
  • Experience with SQL, Azure, and data profiling tools.
  • Familiarity with root cause analysis for data quality issues and the automation of data integration and profiling processes.

Selection process:


  • 2 Technical Discussions
  • HR Discussion


Job Details

Job Location
India
Company Industry
Other Business Support Services
Company Type
Recruitment Agency
Employment Type
Unspecified
Monthly Salary Range
Unspecified
Number of Vacancies
Unspecified
