
Job Description

Yassir is the leading super app in the Maghreb region, set to change the way daily services are provided. It currently operates in 45 cities across Algeria, Morocco, and Tunisia, with recent expansions into France, Canada, and Sub-Saharan Africa. It is backed by approximately $200M in funding from VCs in Silicon Valley, Europe, and other parts of the world.
We offer on-demand services such as ride-hailing and last-mile delivery. Building on this infrastructure, we are now introducing financial services to help our users pay, save and borrow digitally.
We are helping usher the continent into the digital economy era. We’re not just about serving people; we’re about creating a marketplace that brings people what they need while promoting social values.

Responsibilities


  • Build a centralized data lake on GCP Data services by integrating diverse data sources throughout the enterprise.
  • Develop, maintain, and optimize Spark-powered batch and streaming data processing pipelines (a minimal sketch appears after this list). Leverage GCP data services for complex data engineering tasks and ensure smooth integration with other platform components.
  • Design and implement data validation and quality checks to ensure the accuracy, completeness, and consistency of data as it flows through the pipelines.
  • Work with the Data Science and Machine Learning teams to engage in advanced analytics.
  • Collaborate with cross-functional teams, including data analysts, business users, operational and marketing teams, to extract insights and value from data.
  • Collaborate with the product team to design, implement, and maintain the data models for analytical use cases.
  • Design, develop, and maintain data dashboards for various teams using Looker Studio.
  • Engage in technology exploration, research and development, and proofs of concept (POCs), and conduct deep investigations and troubleshooting.
  • Design and manage ETL/ELT processes, ensuring data integrity, availability, and performance.
  • Troubleshoot data issues and conduct root cause analysis when reporting data is in question.
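
To make the Spark pipeline responsibilities above concrete, here is a minimal PySpark sketch of the kind of batch job described: it reads raw JSON events from Cloud Storage, applies a basic quality filter, and appends the result to BigQuery. All project, bucket, dataset, and column names are illustrative placeholders rather than Yassir systems, and the write step assumes the spark-bigquery connector that ships on Dataproc clusters.

    # Minimal batch-pipeline sketch; names below are hypothetical examples.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-batch-pipeline").getOrCreate()

    # Read raw events landed in the data lake (GCS connector is preinstalled on Dataproc).
    raw = spark.read.json("gs://example-raw-bucket/orders/dt=2024-01-01/*.json")

    # Simple data quality gate: keep rows with a non-null id and a positive amount.
    clean = raw.filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))

    # Append curated rows to BigQuery via the spark-bigquery connector.
    (clean.write.format("bigquery")
        .option("table", "example-project.analytics.orders_clean")
        .option("temporaryGcsBucket", "example-staging-bucket")
        .mode("append")
        .save())

In a streaming variant, the same transformations would typically run over a structured streaming source instead of a one-off read, with the validation and write steps unchanged in spirit.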

Required Technical Skills


  • PySpark
  • GCP: BigQuery, Dataproc, Dataflow, Dataplex, Pub/Sub, and Cloud Storage
  • Advanced SQL knowledge
  • NoSQL (Preferably MongoDB)
  • Programming languages: Scala/Python
  • Great Expectations or a similar data quality (DQ) framework
  • Familiarity with workflow management tools like Airflow, Prefect, or Luigi (see the orchestration sketch after this list).
  • Understanding of data governance, data warehousing (DWH), and data modelling
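
As context for the workflow-management item above, the following is a minimal sketch of how an Airflow DAG could schedule the kind of Spark job outlined under Responsibilities, using the Google provider's DataprocSubmitJobOperator. Project, region, cluster, and file names are illustrative placeholders, not anything specified in this posting.

    # Hypothetical daily orchestration of a Dataproc PySpark job (Airflow 2.4+ style).
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    PYSPARK_JOB = {
        "reference": {"project_id": "example-project"},
        "placement": {"cluster_name": "example-cluster"},
        "pyspark_job": {"main_python_file_uri": "gs://example-code-bucket/jobs/orders_batch.py"},
    }

    with DAG(
        dag_id="orders_batch_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        DataprocSubmitJobOperator(
            task_id="submit_orders_batch",
            job=PYSPARK_JOB,
            region="europe-west1",
            project_id="example-project",
        )

Prefect or Luigi, also named above, would express the same dependency graph with their own primitives; the choice is usually driven by the rest of the platform stack.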

Good-to-have skills


  • Infrastructure as Code: Terraform
  • Docker and Kubernetes
  • Looker Studio
  • AI and ML engineering knowledge
