Job Description
We are looking for an experienced data engineer to expand and optimize our big data pipelines. You will work alongside other engineers and developers on different layers of the infrastructure, so a commitment to collaborative problem solving, sophisticated design, and the creation of quality products is essential.
Responsibilities:
- Collaborate with Big Data Solution Architects to design, prototype, implement, and optimize data ingestion pipelines so that data is shared effectively across various business systems.
- Build ETL/ELT ingestion pipelines and analytics solutions using cloud and on-premises technologies.
- Ensure the design, code, and procedural aspects of the solution are production-ready with respect to operational, security, and compliance standards.
- Participate in day-to-day project and production delivery status meetings, and provide technical support for faster resolution of issues.
Requirements:
- Bachelor's degree from an accredited college/university or equivalent work experience.
- 5+ years of experience performing data engineering, warehousing, publishing, and visualization across the full data lifecycle.
- At least two (2) years of experience as an Azure Data Engineer, or in a similar role in a technology or IT consulting environment.
- 3 years of experience in data profiling, cataloguing, and mapping to enable the design and build of technical data flows.
- 3 years of experience defining ELT/ETL architecture and process design.
- 2 years of experience performing end-to-end implementation of data warehousing and analytics solutions built on Microsoft or Azure platforms.
- 2 years of experience working with Azure.
- Experience creating data pipelines using ETL tools such as Apache Airflow and Azure Data Factory (ADF).
- Experience with distributed computing technologies such as Apache Spark and Databricks.
- Experience with stream-processing systems such as Spark Streaming and Azure Event Hubs.
- Proficiency in at least one object-oriented or functional language, such as Python, Java, or Scala.
- Experience writing effective and maintainable unit and integration tests for ingestion pipelines.
- Experience using static analysis and code quality tools.
- Experience using version control systems such as Git.
- Experience with security best practices, including encryption of sensitive data both at rest and in transit.
- Experience building CI/CD pipelines with DevOps orchestration tools such as Jenkins.
- Familiarity with the Data Lakehouse architecture, Delta Lake, and Apache Hudi.
We have an amazing team of 700+ individuals working on highly innovative enterprise projects and products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintechs, and multiple Silicon Valley startups.
What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015 (QMS), ISO 27001:2022 (ISMS), ISO 20000-1:2018 (ITSM), and ISO 14001:2015 (EMS) certified. We have a vibrant culture of learning through collaboration and making the workplace fun.
People who work with us use cutting-edge technologies while contributing to the company's success as well as their own.
To know more about Confiz Limited, visit: https://www.linkedin.com/company/confiz-pakistan/