You Lead the Way. We've Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

How will you make an impact in this role?

As part of Global Servicing Group Technology, we develop data engineering pipelines that deliver GSG business strategic metrics and Credit/Servicing colleague performance and operational metrics. Consuming data from a variety of source systems within Amex, we clean and transform it and apply business logic in our big data ETL/ELT pipelines. The other customer-listening vertical hosted within our group applies machine learning to generate satisfaction scores for customer-CCP interactions and prepares data for VOCM (Voice of Customer surveys). The team is migrating our data engineering from the enterprise cloud data warehouse (Cornerstone) to the Google Cloud-based LUMI platform. The vision is also to enable DaaS (Data-as-a-Service) for all our enterprise data consumers based on their GSG consumer personas of Servicing, Fraud, Credit, Collections, etc.

Roles & Responsibilities:
Be versatile: collaborate with business stakeholders, product teams, and cross-functional technology groups to drive continuous delivery.
Data Process Design and Coding: Design, code, and document data processes for broad-scale data access.
Maintain Data Integrity: Develop validation and monitoring processes for data compilations.
Efficient Data Source Development: Collaborate with teams to design resilient data sources for business needs.
Data Integration into Products: Incorporate data sources into user-facing features.
Cloud Technology Utilization: Leverage cloud platforms for data engineering solutions.
Utilize Big Data Technologies: Employ technologies like PySpark and Spark, Scala, Hive, SQL for data processing.
API Design and Management: Develop and maintain APIs for data access and manipulation.
ETL and Data Pipeline Development: Build and maintain ETL (Extract, Transform, Load) processes and data pipelines using appropriate tools.
CI/CD Implementation: Implement and manage Continuous Integration and Continuous Deployment (CI/CD) processes.
Data Center Process Optimization: Optimize data processes across data centers.
Data Governance Collaboration: Work with governance teams on data definition and mapping issues.
Banking Data Expertise: Utilize knowledge of banking data for relevant data engineering tasks.
Reporting Tools Expertise: Develop reports using reporting tools such as Tableau and Power BI.
Production Support Coordination: Coordinate and support data engineering solutions in a production environment.
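The pipeline and data-integrity responsibilities above can be sketched as a minimal extract-validate-transform flow. This is an illustrative example only: the record shape, field names, and validation rule are hypothetical, and a production pipeline in this role would use PySpark or similar rather than plain Python.

```python
# Minimal ETL sketch: extract raw records, validate them, transform the
# survivors, and return the result plus a rejection count for monitoring.
# All field names and business rules here are hypothetical.

def extract():
    # Stand-in for reading from a source system (e.g. a warehouse table).
    return [
        {"case_id": "C1", "handle_time_sec": "320", "resolved": "Y"},
        {"case_id": "C2", "handle_time_sec": "-5", "resolved": "N"},  # bad record
        {"case_id": "C3", "handle_time_sec": "185", "resolved": "Y"},
    ]

def validate(record):
    # Data-integrity rule: handle time must be a non-negative integer.
    try:
        return int(record["handle_time_sec"]) >= 0
    except (KeyError, ValueError):
        return False

def transform(record):
    # Apply business logic: convert types and derive a reporting metric.
    seconds = int(record["handle_time_sec"])
    return {
        "case_id": record["case_id"],
        "handle_time_min": round(seconds / 60, 2),
        "resolved": record["resolved"] == "Y",
    }

def run_pipeline():
    raw = extract()
    good = [r for r in raw if validate(r)]
    rejected = len(raw) - len(good)   # surfaced to monitoring/alerting
    loaded = [transform(r) for r in good]  # "load" step: write downstream
    return loaded, rejected

if __name__ == "__main__":
    rows, rejected = run_pipeline()
    print(rows, rejected)
```

The rejection count is returned alongside the data so a monitoring process can alert when the share of invalid source records spikes, which is the essence of the "Maintain Data Integrity" responsibility.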
Qualifications & Skills:
Degree in Computer Science with 3+ years of experience.
Big Data Platform Proficiency: Skilled in working with Big Data platforms (Hadoop, Kafka, PySpark, Hive) and Big Data lakes.
Proficient in data analysis and big data tools and systems written in Java (Hadoop, Apache Hive) and Scala (Kafka, Apache Spark).
SQL and Unix Shell Scripting: Skilled in writing SQL queries and Unix shell scripts.
Experience in developing and hosting solutions on Google Cloud Platform (Google Cloud Storage, BigQuery, Vertex AI) and leveraging GCP services for feature implementations.
Cloud Knowledge: Proficient in cloud computing technologies.
Programming Languages: Proficient in languages such as Python, Java, and Scala.
ETL and Data Pipeline Skills: Proficient in ETL (Extract, Transform, Load) processes, tools, and data modelling.
Good knowledge of and hands-on experience with RDBMS and NoSQL databases such as Oracle, MS SQL Server, PostgreSQL, MongoDB, and Cassandra.
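As a small illustration of the SQL and database skills listed above, the snippet below aggregates a hypothetical per-agent satisfaction metric. SQLite is used purely for portability; the table and column names are made up and not part of any Amex system.

```python
import sqlite3

# In-memory SQLite database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE interactions (agent TEXT, csat INTEGER)")
conn.executemany(
    "INSERT INTO interactions VALUES (?, ?)",
    [("alice", 5), ("alice", 3), ("bob", 4)],
)

# Aggregate a per-agent average satisfaction score.
rows = conn.execute(
    "SELECT agent, AVG(csat) AS avg_csat FROM interactions "
    "GROUP BY agent ORDER BY agent"
).fetchall()
print(rows)  # [('alice', 4.0), ('bob', 4.0)]
```

The same GROUP BY aggregation pattern carries over directly to BigQuery or any of the RDBMS engines named above.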