
Job Description

Overview

The primary focus of this role is to lead data architecture activities for critical projects. The role is responsible for architecting, designing, and implementing Advanced Analytics capabilities, and for platform oversight within Azure Data Lake, Databricks, and related ETL technologies, satisfying project requirements while adhering to enterprise architecture standards. The role translates business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses, and implements cutting-edge solutions by building semantic and virtual data layers that provide faster, federated query execution.

Responsibilities

ACCOUNTABILITIES
•Lead data architecture for critical data and analytics projects
•Drive and deliver data architecture and solution architecture deliverables such as conceptual, logical, and physical architecture
•Partner with Enterprise Architecture (Data & Analytics) and ensure the usage of standard patterns
•Partner with project leads, IT leads, security, and Enterprise Architecture team members in architecting end-to-end solutions
•Gain architecture alignment and sign-off, and guide the project team during implementation

Qualifications

MANDATORY TECHNICAL SKILLS
•10+ years of experience in Teradata, the Hadoop ecosystem (e.g., Hive, Spark, Kafka, HBase), and Azure cloud technologies
•3 to 5 years of hands-on experience architecting, designing, and implementing data ingestion pipelines for batch, real-time, and streaming workloads on the Azure cloud platform at scale
•1 to 3 years of experience with ETL tools such as Informatica or ADF, especially for large volumes of data
•1 to 3 years of hands-on experience with Databricks
•3 to 5 years of working experience with Azure cloud technologies such as Spark, IoT, Synapse, Cosmos DB, Log Analytics, ADF, ADLS, and Blob Storage
•1 to 2 years of experience with distributed query engines such as Presto or similar tools
•1 to 2 years of experience with data virtualization tools such as Denodo or similar tools
•1 to 3 years of experience evaluating emerging technologies
•1 to 3 years of experience with Python/PySpark/Scala to build data processing applications
•Experience extracting, querying, and joining large data sets at scale

MANDATORY NON-TECHNICAL SKILLS
•Highly analytical, motivated, decisive thought leader with solid critical thinking and the ability to quickly connect technical and business "dots"
•Strong communication and organizational skills; able to deal with ambiguity while juggling multiple priorities and projects at the same time

DIFFERENTIATING COMPETENCIES
•Experience in data wrangling, advanced analytic modelling, and AI/ML capabilities is preferred
•Finance functional domain expertise

Job Details

Job Location: India
Company Industry: Other Business Support Services
Company Type: Unspecified
Employment Type: Unspecified
Monthly Salary Range: Unspecified
Number of Vacancies: Unspecified
