Careem is building the Everything App for the greater Middle East, making it easier than ever to move around, order food and groceries, manage payments, and more. Careem is led by a powerful purpose to simplify and improve the lives of people and build an awesome organisation that inspires. Since 2012, Careem has created earnings for over 2.5 million Captains, simplified the lives of over 70 million customers, and built a platform for the region’s best talent to thrive and for entrepreneurs to scale their businesses. Careem operates in over 70 cities across 10 countries, from Morocco to Pakistan.
About the Team
The Careem Data Platform team’s mission is to provide a platform that abstracts big data complexities and enables fast, reliable, and secure access to data. As a member of this team, you will be at the forefront of fulfilling this mission. You will work with the region’s top talent, leveraging modern big data tools and techniques to solve the region’s day-to-day problems on top of our own in-house data platform, serving users in real time.
This role will be part of the Data Processing and Computation Platform team. We are heavily invested in open-source technologies such as Apache Spark, Apache Kafka, and Apache Trino. You will also have the opportunity to contribute to and collaborate with the open-source community.
What you'll do
- Bring an innovative and creative mindset to data engineering challenges, developing a modern data platform with efficient, reusable components
- Design, architect, implement, and test rapid prototypes that demonstrate the value of data, and present them to diverse audiences
- Make code more efficient to run, optimize resource usage across the cluster, and speed up the compute workloads we face
- Continuously improve our engineering processes, tests, and systems that allow us to scale the code base and productivity of the team
- Collaborate with teams globally and operate in a fast-paced environment
- Create reusable and scalable data pipelines
What you’ll need
- 4+ years of hands-on experience in software development.
- Bachelor's degree in Computer Science or a related technical field.
- Strong expertise in Scala, Java, or similar programming languages.
- Proven track record of building distributed systems or working on comparable large-scale projects.
- Deep understanding of cloud-native Big Data technologies.
- Solid foundation in software engineering principles and design best practices.
- Passion for developing high-quality, maintainable, and performant software within a collaborative, high-performing global team.
- Experience with cloud control planes (AWS, GCP, etc.) or database internals, including query optimization.
- Contributions to open-source projects are preferred.
- Experience with Docker and Kubernetes is a plus.
What we’ll provide you
We offer colleagues the opportunity to drive impact in the region while they learn and grow. As a full time Careem colleague, you will be able to:
- Work and learn from great minds by joining a community of inspiring colleagues.
- Put your passion to work in a purposeful organisation dedicated to creating impact in a region with a lot of untapped potential.
- Explore new opportunities to learn and grow every day.
- Work 4 days a week in the office and 1 day from home, plus remotely from any country in the world for 30 days a year, with unlimited vacation days. (If you are in an individual contributor role in tech, you will have 2 office days a week and 3 days working from home.)
- Access to healthcare benefits and fitness reimbursements for health activities including gym, health club, and training classes.