

Company Description

Over many years of pursuing investment diversification, the Ghobash Group has capitalized on opportunities in sectors with encouraging growth potential, either by buying out established operating companies or by founding new businesses to extend value into those markets. As the portfolio of these fully or majority-owned operating companies grew and became more diversified, the Group established ABAN INVESTMENT in 2008 to administer and facilitate its smooth operation. Each operating company's general manager reports to the CEO of Aban Investment, making it an actively managed portfolio of companies.


ABAN Investment currently has stakes in Technology, Oil & Gas, Pharmaceuticals, Industrial Chemicals and Supplies, Hospitality, Healthcare, and Consumer Services. It is committed to building high-value businesses, from start-ups and early-stage ventures to mergers and acquisitions for more mature businesses.



Job Description

The Data Architect is a specialist in Azure Data Factory, Databricks, and data pipelines, responsible for designing, implementing, and managing the company's data architecture. This includes overseeing data warehousing, managing data pipelines, and ensuring the efficient operation of Azure Data Factory and Databricks. The specialist plays a critical role in transforming raw data into valuable insights, ensuring data integrity, and supporting business intelligence activities across the organization.


  • Azure Database Management: Design, implement, and manage Azure SQL databases and related services.
  • Data Warehousing: Develop and maintain enterprise data warehouses to support business intelligence and analytics needs.
  • Azure Data Factory: Create, schedule, and manage data pipelines for ETL (Extract, Transform, Load) processes.
  • Data ETL: Design and implement ETL processes to ingest data from various sources, transform it as needed, and load it into target systems (a minimal illustrative sketch follows this list).
  • RDBMS Administration: Administer and maintain relational database management systems, ensuring high availability and performance.
  • Databricks: Utilize Databricks for big data processing, advanced analytics, and machine learning workflows.
  • Data Pipelines Management: Monitor and optimize data pipelines to ensure efficient data flow and minimal downtime.
  • Data Security: Implement robust security measures to protect sensitive data and ensure compliance with relevant regulations.
  • Documentation: Maintain detailed documentation of data architectures, ETL processes, and data management practices.
  • Collaboration: Work closely with data analysts, data scientists, and other stakeholders to understand data requirements and deliver solutions.
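
As a rough illustration of the ETL and pipeline work described above, the following is a minimal PySpark sketch of the sort that might run in a Databricks notebook: it loads raw files landed by an Azure Data Factory copy activity into a Delta table. It is a sketch only; the mount path, table name, and column names are hypothetical placeholders, not details taken from this posting.

# Minimal, illustrative ETL sketch for a Databricks notebook (PySpark).
# All paths, table names, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # Databricks provides this session automatically

# Extract: read raw CSV files landed by an Azure Data Factory copy activity
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/raw/sales/"))  # hypothetical mount point

# Transform: basic cleansing and typing
clean = (raw
         .dropDuplicates()
         .withColumn("order_date", F.to_date("order_date"))
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("amount").isNotNull()))

# Load: write to a Delta table consumed by the warehouse and BI reports
(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.sales_orders"))  # hypothetical target table; the schema must already exist

In practice, logic like this would typically be parameterized and triggered from an Azure Data Factory or Databricks Workflows pipeline rather than run ad hoc.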

IT Policies and IT Processes impacted/addressed by this position:


  • Database Management Process: Adherence to data governance processes and policies to ensure data accuracy and security.
  • Data Security Policy: Ensuring compliance with data protection and security guidelines.
  • Change Management Policy: Following procedures for managing changes in technology and processes.
  • Business Continuity Policy: Supporting business continuity planning and disaster recovery processes.
  • IT Incident Management: Participating in the resolution of IT incidents that involve RPA and digitization solutions.

Qualifications
  • Bachelor’s degree in Computer Science, Information Technology, Business Administration, or a related field.
  • Advanced certifications or training in Azure Data Engineering or similar are preferred.

Additional Information

Experience:


  • Minimum of 5 years of experience in data architecture, data warehousing, and ETL processes.
  • Proven experience with Azure Data Factory, Databricks, and RDBMS administration.
  • Experience in managing data pipelines and big data processing.
  • Experience in working with cross-functional teams and managing multiple stakeholders.
  • Experience in managing client presentations.
  • Experience in project management and delivering automation projects.

Skills & Abilities:


  • Strong know-how of automation methodologies, software, and tools:
    • Essential Software: Microsoft SQL Server, MySQL, PostgreSQL
    • Platforms: Azure Data Factory, Azure Synapse Analytics, and MS Power BI
    • Tech-Frameworks: Apache Spark and Hadoop
    • Tech-Stacks: Awareness of Microsoft .NET
  • Attention to detail and accuracy.
  • Strong organizational and time management skills.


Job Details

Job Location: Dubai, United Arab Emirates
Company Industry: Other Business Support Services
Company Type: Unspecified
Employment Type: Unspecified
Monthly Salary Range: Unspecified
Number of Vacancies: Unspecified
