Job Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As a Lead Software Engineer at JPMorgan Chase within the Commercial & Investment Bank's Global Bank Technology Team, you'll serve as a seasoned member of an agile team to design and deliver trusted, market-leading data engineering solutions and data products in a secure, stable, and scalable way. You will be responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job Responsibilities:
- Design and implement scalable data architectures using Databricks at enterprise scale.
- Design and implement Databricks integration and interoperability with other cloud providers and platforms such as AWS, Snowflake, Immuta, and OpenAI.
- Collaborate with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions.
- Develop and maintain data architecture standards, including data product interfaces, data contracts, and governance frameworks.
- Implement data governance and security measures to ensure data quality and compliance with industry and regulatory standards.
- Monitor and optimize the performance and scalability of data products and infrastructure.
- Provide training and support to domain teams on data mesh principles and cloud data technologies.
- Stay up-to-date with industry trends and emerging technologies in data mesh and cloud computing.
Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- 12+ years of applied experience in the data engineering space using enterprise tools and home-grown frameworks, including 5+ years specializing in end-to-end Databricks implementations.
- 5+ years of experience in an AWS cloud environment and with Databricks.
- Experience as a Databricks solution architect, tech lead, or similar role in an enterprise environment.
- Hands-on practical experience delivering system design, application development, testing, and operational stability.
- Influencer with a proven record of successfully driving change and transformation across organizational boundaries.
- Ability to present and effectively communicate to Senior Leaders and Executives.
- Experience in Python, Spark, and streaming (Spark Streaming, Kafka, or Kinesis) is a must.
- Deep understanding of Apache Spark, Delta Lake, Delta Live Tables (DLT), and other big data technologies.
Preferred qualifications, capabilities, and skills:
- Databricks and AWS certifications.
- Experience working in development teams using agile techniques, object-oriented development, and scripting languages is preferred.
- Experience with LLMs and AI/ML is preferred.