Lead the design and implementation of processes to ingest data from various sources into the Databricks Lakehouse platform, ensuring alignment with architectural and engineering standards.
Oversee the development, maintenance, and optimization of data models and ETL pipelines that support the Medallion Architecture (Bronze, Silver, Gold layers) to enhance data processing efficiency and facilitate data transformation.
Utilize Databricks to integrate, consolidate, and cleanse data, ensuring accuracy and readiness for analysis, while leveraging Delta Lake for versioned data management.
Implement and manage Unity Catalog for centralized data governance, ensuring proper data access, security, and compliance with organizational policies and regulations.
Collaborate with business analysts, data scientists, and stakeholders to understand their data requirements and deliver tailored solutions that leverage the capabilities of the Databricks Lakehouse platform.
Promote available data and analytics capabilities to business stakeholders, educating them on how to use these tools and the Medallion Architecture effectively for their analytical needs.
Lead the migration of existing ETL processes from Informatica IICS and SSIS to cloud-based data pipelines within the Databricks environment, ensuring minimal disruption and maximum efficiency.
Apply expertise in scaling ETL pipelines for performance and cost-effectiveness.
Act as a technical lead during sprints, providing guidance and support to team members, and ensuring adherence to best practices in data engineering.
Engage with clients and stakeholders to support architectural designs, address technical queries, and provide strategic guidance on utilizing the Databricks Lakehouse platform effectively.
Stay updated on industry trends and emerging technologies in data engineering, particularly those related to Databricks, cloud data solutions, and ETL migration strategies, continuously enhancing your skills and knowledge.
Demonstrate excellent problem-solving skills, anticipating and resolving issues before they affect business productivity.
Participate in all aspects of the software development lifecycle for AWS solutions, including planning, requirements gathering, development, testing, and quality assurance.
Demonstrate thought leadership by contributing to the development of best practices, standards, and documentation for data engineering processes within the organization.
Apply a strong understanding of DevOps tools and best practices for data engineering, including CI/CD, unit and integration testing, automation, and orchestration.