
Job Description

Overview

As a member of the data & analytics team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, and you will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of MLOps engineers who build MLOps pipelines to deploy and orchestrate AI/ML models across the ML lifecycle. As a member of the MLOps team, you will lead the development of very large and complex MLOps pipelines in cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around finance. You will work closely with process owners, product owners, and business users in a hybrid environment spanning in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities

- Build and maintain Machine Learning pipelines using Azure DevOps and Azure Data Factory.
- Simplify the Machine Learning project development and deployment cycle.
- Design and implement scalable solutions for data scientists to deploy their ML models and monitor their health on a multi-tenant platform.
- Identify and develop MLOps features: Azure DevOps Pipelines, Azure Data Factory, Synapse.
- Create tools, processes, or workflows that enable our Data Scientists to operate more efficiently.
- Participate in the entire Machine Learning lifecycle: deploying code to various environments, CI/CD integration, performance monitoring, and change management.
- Build pipelines, feedback loops, ML workflows, and APIs.
- Azure certifications/knowledge are preferred.
- Take responsibility for the sustainability of pipelines already live in the production environment.

Qualifications

- Bachelor's degree in Computer Science, MIS, Business Management, or a related field.
- 3-5+ years of experience in Information Technology.
- 3+ years of cloud experience (Azure, AWS, GCP, etc.), preferably Azure.
- Proficiency with Python and PySpark programming in a Databricks environment.
- Comfortable working with complex SQL.
- Experience with ML model serving technologies such as Kubeflow Serving and MLflow.
- Experience with common data science frameworks (e.g., pandas, NumPy, scikit-learn, TensorFlow).
- Experience creating Azure Data Factory and Azure DevOps pipelines.
- Experience supporting QA and production deployments and developing end-to-end CI/CD pipelines.
- Experience with Azure infrastructure and platform services in the context of AI/ML.
- Well versed in automation and deployment using cutting-edge Azure technologies.
- Good written and verbal communication skills, along with collaboration and listening skills.
- Experience related to Big Data is nice to have.

Mandatory "Non-Technical" Skills

- Excellent remote collaboration skills.
- Experience working in a matrix organization with diverse priorities.
- Enthusiasm for learning functional knowledge specific to the finance business.
- Ability to work with virtual teams (remote work locations) within a team of technical resources (employees and contractors) based in multiple global locations.
- Participation in technical discussions, driving clarity on complex issues and requirements to build robust solutions.
- Strong communication skills to meet with delivery and business-facing teams, understand sometimes-ambiguous needs, and translate them into clear, aligned requirements.
