Job Description
About PayU
PayU, a leading payments and fintech company operating in 50+ high-growth markets across Asia, Central and Eastern Europe, Latin America, the Middle East and Africa, and part of the Prosus group, one of the largest technology investors in the world, is redefining the way people buy and sell online for our 300,000+ merchants and millions of consumers. As a leading online payment service provider, we deploy more than 400 payment methods and PCI-certified platforms to process approximately 6 million payments every single day. Thinking of becoming a PayUneer and curious to know more about us? Read more about life at PayU here.
Job responsibilities:
· As a technology leader, you are responsible for hiring, building, growing and nurturing impactful, business-focused data teams
· Drive the technical and strategic vision for the embedded pods and foundational enablers to meet current and future needs for scale and interoperability
· Strive for continuous improvement of the data architecture and development processes
· Pursue quick wins while planning for long-term strategy and engineering excellence. You should be excited about breaking down large systems into easy-to-use data assets and reusable components
· Collaborate cross-functionally with stakeholders, external partners and peer data leaders
· You should be both a planner and an executor, well versed in tools for driving short-term and long-term team and stakeholder success
· Reliability and quality are must-haves for all data and ML applications
Requirements to be successful in this role:
- B.S., M.S., or Ph.D. in Computer Science or equivalent
- 12+ years of experience working in data engineering, data platform or a related domain
- 3+ years of hands-on management experience leading engineers and engineering managers
- Exceptional communication and leadership skills, experience hiring and growing teams, and a proven ability to operate in a fast-moving environment
- Experienced with performance management, coaching, mentoring and growing teams
- Familiarity with Python and SQL
- Prior experience with Spark/Airflow/Trino/Pinot
- Prior experience with the AWS ecosystem and large-scale batch/real-time ETL orchestration using Airflow, Kafka, Spark Streaming, etc.
- Familiarity with data lake table formats and related technologies such as Delta Lake, Apache Hudi, AWS Glue Catalog, S3, etc.
- Experience in building systems directly powering online applications
- Exposure to various databases such as PostgreSQL, MongoDB and Cassandra
- Proficiency in system design, with a strong understanding and application of AI solutions in the data space