
Job Description

The Data Engineer will lead the design and architecture of AWS-based data solutions, develop end-to-end data pipelines, manage data lakes, and optimize data platforms for performance and scalability. Responsibilities include writing and testing Python, SQL, and other code; conducting code reviews; implementing end-to-end ETL/ELT processes; enforcing data governance policies; and driving innovation in data engineering practices. The role also involves collaborating with cross-functional teams, aligning with stakeholders on technical requirements, providing technical leadership, and mentoring team members to improve overall team efficiency.


Key Responsibilities:


  • Design and implement scalable data solutions on AWS, including data lakes, warehouses, and streaming systems.
  • Develop, optimize, and maintain data pipelines using AWS services.
  • Implement robust ETL/ELT processes and event-driven data ingestion.
  • Establish and enforce data governance policies, ensuring data quality, security, and compliance.
  • Optimize cloud resources for performance, availability, and cost-efficiency.
  • Partner with cross-functional teams to gather requirements and deliver comprehensive cloud-based solutions.
  • Identify opportunities to enhance systems, processes, and technologies while troubleshooting complex technical challenges.
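As an illustration of the event-driven ingestion work listed above, the sketch below shows a minimal Lambda-style handler that extracts bucket/key pairs from an S3 put-event payload. The function name and event shape follow the standard AWS S3 event format, but this is an illustrative sketch, not code from the team's actual pipelines.

```python
def lambda_handler(event, context):
    """Extract (bucket, key) pairs from an S3 event payload.

    A typical first step in event-driven ingestion: the handler is
    triggered on object creation and hands the object references to
    a downstream ETL job.
    """
    ingested = []
    for rec in event.get("Records", []):
        s3 = rec.get("s3", {})
        ingested.append({
            "bucket": s3.get("bucket", {}).get("name"),
            "key": s3.get("object", {}).get("key"),
        })
    return {"ingested": ingested}
```

For example, an event with one record for `orders/2024.csv` in bucket `raw-data` yields `{"ingested": [{"bucket": "raw-data", "key": "orders/2024.csv"}]}`.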

Our current tech-stack:


  • AWS: Glue, Lambda, Step Functions, Batch, ECS, QuickSight, SageMaker, etc.
  • DevOps: CloudFormation, Terraform, Git, CodeBuild
  • Database: Redshift, PostgreSQL, DynamoDB, Athena
  • Language: Bash, Python, SQL
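To give a flavor of how Python and SQL combine in an ELT step, here is a minimal, self-contained sketch that loads raw rows and aggregates them with SQL. It uses the standard-library `sqlite3` purely as a stand-in for a warehouse such as Redshift or PostgreSQL; the table and column names are invented for illustration.

```python
import sqlite3

def run_elt(rows):
    """Load raw (region, amount) rows, then transform in SQL.

    sqlite3 stands in for a real warehouse here; in practice the
    same pattern applies against Redshift or PostgreSQL.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM orders "
        "GROUP BY region ORDER BY region"
    )
    result = cur.fetchall()
    conn.close()
    return result
```

Calling `run_elt([("EU", 10.0), ("US", 5.0), ("EU", 2.5)])` returns one aggregated row per region.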

Qualifications:


  • Bachelor's degree in Computer Science, Engineering, or related field. Master's degree preferred.
  • Expertise in AWS platforms, including data services. Basic knowledge of Azure is preferred.
  • Extensive experience in data and cloud engineering roles.
  • Strong competence in ETL processes, data warehousing, and big data technologies.
  • Advanced skills in scripting, Python, SQL, and infrastructure automation tools.
  • Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
  • Experience with data visualization tools (e.g., QuickSight) is a plus.
