QUALIFICATIONS
- Bachelor's or master's degree in computer science, engineering, or a related field
- 3+ years of professional experience as a Cloud or Data Engineer, including supporting infrastructure platform configuration, resource management, and provisioning for cloud and data services
- Experience in building and optimizing data engineering pipelines, architectures, and data sets
- Hands-on experience in developing and implementing data fixes and hotfixes to address urgent data issues while ensuring compliance with data governance and security policies
- Extensive working experience with backend programming languages such as SQL and Python is a must
- Experience with big data systems such as Spark, PySpark, and Databricks is desired
WHO YOU'LL WORK WITH
Secure Foundation Cloud Services is an entity within McKinsey's global IT organization that provides IT support services to McKinsey firm members and clients. The team is responsible for managing the firm's global IT infrastructure and applications, and provides "Data Operation and Engineering" services for McKinsey's business operations. This group is responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. It's a team of skilled engineers supporting our software developers, data analysts, and data scientists, with initiatives to ensure optimal data delivery architecture and consistency during engagements. Beyond technology skills, our data operations engineers bring skills essential to client service, including problem solving, critical thinking, process orientation, and business savvy. Team members are currently distributed globally across three time zones (India, USA, Costa Rica).
WHAT YOU'LL DO
In your role as an Engineer, you will develop capabilities that serve as a baseline for our clients and improve the developer experience. You will contribute to and master multiple technologies, such as AWS cloud services and data pipelines, with a strong focus on open-source products.
You'll contribute your expertise in data and technical approaches to help structure and solve business problems by coding, testing, and delivering tools and technology assets for high-quality analytics.
You'll design, develop, optimize, and maintain data platforms and ETL pipelines that support data integration and transformation and meet business use cases and objectives.
You will work with stakeholders to assist with data-related technical issues and conduct thorough root cause analysis (RCA) to identify the source of data problems. You'll lead the evaluation, implementation, and deployment of tools and processes to improve the way we work.
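To give a concrete sense of the extract-transform-load (ETL) work described above, here is a minimal, pure-Python sketch of the pattern. The field names, validation rules, and aggregation are illustrative assumptions for this example only, not the firm's actual pipelines, which would typically run on platforms such as Spark or Databricks.

```python
import csv
import io

# Extract: read raw records from a CSV source (illustrative input format).
def extract(raw_csv: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: normalize field names and types, dropping malformed rows --
# the kind of validation step a "data fix" for an urgent issue might add.
def transform(rows: list[dict]) -> list[dict]:
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "client": row["client"].strip().lower(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return cleaned

# Load: here we simply aggregate into an in-memory target; a real
# pipeline would write to a warehouse table instead.
def load(rows: list[dict]) -> dict[str, float]:
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["client"]] = totals.get(row["client"], 0.0) + row["amount"]
    return totals

raw = "client,amount\nAcme ,100\nacme,50\nBeta,not_a_number\nbeta,25\n"
result = load(transform(extract(raw)))
# The malformed "Beta" row is dropped; "Acme " and "acme" are merged.
```

The same three-stage shape scales up directly: in PySpark, `extract` becomes a `spark.read` call, `transform` a chain of DataFrame operations, and `load` a write to a managed table.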