Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities
Assistant Vice President Expectations
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Join us as a skilled and highly motivated Data Engineer to design, build, and maintain scalable and efficient data pipelines while collaborating with cross-functional teams to deliver high-quality, actionable regulatory data solutions as well as insightful business analytics.
You may be assessed on the key critical skills relevant for success in the role, such as strong hands-on experience with a modern data stack: Python, SQL, DBT, AWS S3 storage, and Databricks or Snowflake.
To be successful as a Data Engineer, you should have experience with:
Building scalable and efficient ETL/ELT pipelines to ingest, transform, and load data for regulatory reporting and analytical use cases.
Tuning and optimizing data pipelines, SQL queries, and data models for performance and scalability.
Python proficiency for data manipulation and integration, including experience with Pandas or PySpark.
Strong SQL skills for writing complex queries, and experience with database platforms.
Hands-on experience building and managing DBT models, using DBT Cloud or DBT Core.
Experience building and managing object storage.
Experience with AWS services for data engineering (e.g., S3, Glue, Lambda, Redshift).
Utilizing Starburst, Databricks, or Snowflake for big data processing and advanced analytics.
Partnering with project managers, data analysts, data scientists, and business stakeholders to deliver end-to-end data solutions.
Developing monitoring and validation routines to ensure data accuracy, consistency, and security.
Working closely with data architects and data modelers/analysts to design and implement data models that support business intelligence and analytics.
Desirable skill sets (good to have):
Familiarity with Databricks or Snowflake for big data processing and analytics is a plus.
Proficiency with Git for version control and collaboration.
Familiarity with tools like Airflow, Dagster, or similar workflow automation platforms.
Experience working with RESTful or GraphQL APIs for data integration.
Strong problem-solving and analytical skills with a focus on delivering high-quality data solutions.
Excellent communication and collaboration skills to work with cross-functional teams.
Ability to adapt to a fast-paced and dynamic environment.
Self-motivated, detail-oriented, and passionate about data engineering.
This role will be based out of Nirlon Knowledge Park / Altimus office, Mumbai.