Job Description
Title: Lead Data Engineer
Department: ISS
Location: Gurgaon, India or Dalian, China
Reports To: Head of Data Platform - ISS
Grade: 7
We’re proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our team and feel like you’re part of something bigger.
Department Description
The ISS Data Engineering Chapter is an engineering group comprising three sub-chapters (Data Engineers, Data Platform and Data Visualisation) that supports the ISS Department.
Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our Investment Process. These programmes span across asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research and Trading.
Purpose of your role
This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate.
This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform.
Key Responsibilities
Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics.
Be accountable for technical delivery and take ownership of solutions.
Lead a team of senior and junior developers providing mentorship and guidance.
Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress.
Drive technical innovation within the department to increase code reusability, code quality and developer productivity.
Challenge the status quo by bringing the very latest data engineering practices and techniques.
Essential Skills and Experience
Core Technical Skills
Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
Advanced expertise with the AWS ecosystem and experience using core AWS data services such as Lambda, EMR, MSK, Glue and S3.
Experience designing event-based or streaming data architectures using Kafka.
Advanced expertise in Python and SQL. Java/Scala expertise is welcome, but enterprise experience with Python is required.
Expert in designing, building and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements.
Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings.
Experience implementing change data capture (CDC) ingestion.
Experience using orchestration tools (e.g. Airflow, Control-M).
Bonus Technical Skills:
Strong experience with containerisation and deploying applications to Kubernetes.
Strong experience in API development using Python based frameworks like FastAPI.
Key Soft Skills:
Problem-Solving: Leadership experience in problem-solving and technical decision-making.
Communication: Strong in strategic communication and stakeholder engagement.
Project Management: Experienced in overseeing project lifecycles and working with Project Managers to manage resources.
Feel rewarded
For starters, we’ll offer you a comprehensive benefits package. We’ll value your wellbeing and support your development. And we’ll be as flexible as we can about where and when you work – finding a balance that works for all of us. It’s all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.