
Job Description

Purpose of the role


To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.


Accountabilities


  • Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
  • Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
  • Development of processing and analysis algorithms fit for the intended data complexity and volumes.
  • Collaboration with data scientists to build and deploy machine learning models (a brief sketch follows this list).
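
By way of illustration only, a minimal Spark MLlib training job of the kind such collaboration might produce is sketched below; the dataset path, feature columns and model path are hypothetical assumptions, not details from this posting.

    # Illustrative sketch only: paths and column names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("example-ml").getOrCreate()

    # Assumed input: a curated training table with feature columns and a label.
    df = spark.read.parquet("/data/curated/training")

    # Assemble raw feature columns into the single vector column MLlib expects.
    assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
    lr = LogisticRegression(labelCol="label", featuresCol="features")

    # Fit the two-stage pipeline and persist the model artefact for deployment.
    model = Pipeline(stages=[assembler, lr]).fit(df)
    model.write().overwrite().save("/models/example-lr")

    spark.stop()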

Analyst Expectations


  • Execute work requirements as identified in processes and procedures, collaborating with and impacting the work of closely related teams.
  • Check the work of colleagues within the team to meet internal and stakeholder requirements.
  • Provide specialist advice and support pertaining to own work area.
  • Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
  • Maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams.
  • Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative / operational expertise.
  • Make judgements based on practice and previous experience.
  • Assess the validity and applicability of previous or similar experiences and evaluate options under circumstances that are not covered by procedures.
  • Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day to day administrative requirements.
  • Build relationships with stakeholders/customers to identify and address their needs.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.


Join us as an ETL Developer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.


To be successful as an ETL Developer, you should have experience with the following:


  • Develop and maintain scalable data pipelines using Python and PySpark (a minimal sketch follows this list).
  • Collaborate with data engineers and data scientists to understand and fulfil data processing needs.
  • Optimize and troubleshoot existing PySpark applications for performance improvements.
  • Write clean, efficient, and well-documented code following best practices.
  • Participate in design and code reviews.
  • Develop and implement ETL processes to extract, transform, and load data.
  • Ensure data integrity and quality throughout the data lifecycle.
  • Stay current with the latest industry trends and technologies in big data and cloud computing.
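
Purely as an illustration of the kind of work described above, a minimal PySpark ETL job is sketched below; the file paths, column names and aggregation are hypothetical assumptions rather than details of the actual role.

    # Illustrative sketch only: all paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read raw CSV records (header row assumed).
    raw = spark.read.option("header", True).csv("/data/raw/transactions.csv")

    # Transform: de-duplicate, cast, and filter to enforce data quality.
    clean = (
        raw.dropDuplicates(["txn_id"])
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())
    )
    daily = clean.groupBy("account_id", F.to_date("txn_ts").alias("txn_date")) \
                 .agg(F.sum("amount").alias("daily_total"))

    # Load: write a partitioned Parquet table into the warehouse/lake zone.
    daily.write.mode("overwrite").partitionBy("txn_date").parquet("/data/curated/daily_totals")

    spark.stop()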

Some other highly valued skills may include:


  • Data Warehousing Concepts
  • Apache Spark
  • Ab Initio

This role is based out of Pune.

