
Company Description

HungerStation is part of the Delivery Hero Group, the world’s pioneering local delivery platform. Our mission is to deliver an amazing experience: fast, easy, and to your door. We operate in more than 70 countries worldwide. Delivery Hero is headquartered in Berlin, Germany, has been listed on the Frankfurt Stock Exchange since 2017, and is part of the MDAX stock market index.



Job Description

We are looking for a Data Engineer to join our team. You will contribute to the creation of multiple data products, helping HungerStation keep growing and supporting vendors all across the globe. If you're a creative problem solver who is eager to deliver solutions and hungry for a new adventure, an international workplace is waiting for you in the heart of Saudi Arabia!



Your Mission


  • You will design data architectures and processes and be responsible for end-to-end data pipelines


  • Work closely on data integrations, data quality, and data contracts


  • Communicate with different teams regarding data consistency and data availability


  • Perform ongoing reviews of our current applications and provide recommendations to improve them


  • Create proofs of concept with new technologies and drive innovation


  • You will develop monitoring and alerting tools to ensure the high quality of collected data


Your Skills


  • You have 3+ years of experience building data pipelines in a professional working environment


  • You are a pragmatic engineer who understands what is needed to get things done in a collaborative manner


  • You’re a self-organizing, proactive person


  • You’re eager to work in a fast-paced, fault-tolerant and agile environment


  • You have experience processing large amounts of structured and unstructured data


    • SQL, Python, Java.


  • Practical experience with databases, data modelling, and data architectures


    • Batch, Lambda, and Kappa architectures


    • Experience with OLTP and OLAP


  • Experience with GCP (or AWS or Azure) cloud services; a minimal example of this kind of pipeline work is sketched after this list


    • GCS, BigQuery, Pub/Sub, Kubernetes Engine, etc.


  • Practical experience with data ingestion/integration


  • Experience working with containerised applications (Docker and Kubernetes)


  • Knowledge of IaC (Infrastructure as Code; preference: Terraform)


  • Good communication skills and fluent English


Nice to Have:

  • Applied knowledge of Data Mesh


  • Experience with setting up and configuring CI/CD pipelines


  • Familiarity with common logging, monitoring, and alerting tools such as Datadog, Grafana, New Relic, Prometheus, Kibana, etc.
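
For illustration only, here is a minimal sketch of the kind of pipeline work described above: streaming events from a Pub/Sub subscription into a BigQuery table with a basic data-quality gate. It assumes a GCP project with the google-cloud-pubsub and google-cloud-bigquery client libraries installed, and every project, subscription, table, and field name below is hypothetical.

```python
# Illustrative sketch: Pub/Sub -> data-quality check -> BigQuery.
# All identifiers below are hypothetical placeholders.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "example-project"                 # hypothetical
SUBSCRIPTION_ID = "orders-sub"                 # hypothetical
TABLE_ID = "example-project.analytics.orders"  # hypothetical

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

# Basic data contract: these fields must be present in every event.
REQUIRED_FIELDS = {"order_id", "vendor_id", "amount"}


def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Validate one event and load it into BigQuery, or drop it."""
    row = json.loads(message.data.decode("utf-8"))

    # Data-quality gate: drop events missing required fields.
    if not REQUIRED_FIELDS.issubset(row):
        message.ack()  # acknowledge so the bad event is not redelivered forever
        return

    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        message.nack()  # let Pub/Sub redeliver on insert failures
    else:
        message.ack()


if __name__ == "__main__":
    streaming_future = subscriber.subscribe(subscription_path, callback=handle_message)
    print(f"Listening on {subscription_path} ...")
    try:
        streaming_future.result()  # block until cancelled or an error occurs
    except KeyboardInterrupt:
        streaming_future.cancel()
```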




