
Job Description

Introduction
In this role, you’ll work in one of our IBM Consulting Client Innovation Centres (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

At IBM, work is more than a job – it’s a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you’ve never thought possible. Are you ready to lead in this new era of technology and solve some of the world’s most challenging problems? If so, let’s talk.

Your Role and Responsibilities


  • Create Solution Outline and Macro Design artefacts describing end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles
  • Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation
  • Contribute to the development of reusable components / assets / accelerators to support capability building
  • Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies
  • Participate in customer PoCs to deliver the outcomes
  • Participate in delivery reviews / product reviews and quality assurance, and act as design authority


Required Technical and Professional Expertise


  • Experience in designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems
  • Experience in data engineering and architecting data platforms
  • Experience in architecting and implementing Data Platforms on the Azure Cloud Platform
  • Experience on Azure cloud is mandatory: ADLS Gen1 / Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hubs, Snowflake, Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow
  • Experience with the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) on Cloudera or Hortonworks


Preferred Technical and Professional Expertise


  • Experience in architecting complex data platforms on the Azure Cloud Platform and on-premises
  • Experience and exposure to implementations of Data Fabric and Data Mesh concepts, and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
  • Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.
