We are undergoing a significant data transformation to ensure that accurate and consistent data is available precisely when and where it is needed. Your role will involve participating in QA activities such as testing sprint tickets and new features, and providing progress updates on assigned tasks. You will be encouraged to deepen your domain knowledge and to think innovatively to improve your daily work, applying SMART (Specific, Measurable, Achievable, Relevant, and Time-bound) principles. A clear understanding of the domain is essential to meet testing requirements efficiently.
Responsibilities:
- Develop and execute test scripts to validate data pipelines, transformations, and integrations (a brief illustrative sketch follows this list).
- Formulate and maintain test strategies—including smoke, performance, functional, and regression testing—to ensure data processing and ETL jobs meet requirements.
- Collaborate with development teams to assess changes in data workflows and update test cases to preserve data integrity.
- Design and run tests for data validation, storage, and retrieval using Azure services like Data Lake, Synapse, and Data Factory, adhering to industry standards.
- Continuously enhance automated tests as new features are developed, ensuring timely delivery per defined quality standards.
- Participate in data reconciliation and verify Data Quality frameworks to maintain data accuracy, completeness, and consistency across the platform.
- Share knowledge and best practices by collaborating with business analysts and technology teams to document testing processes and findings.
- Communicate testing progress effectively with stakeholders, highlighting issues or blockers, and ensuring alignment with business objectives.
- Maintain a comprehensive understanding of the Azure Data Lake platform's data landscape to ensure thorough testing coverage.
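To give a sense of the kind of automated validation this role involves, here is a minimal sketch of a pipeline reconciliation test, assuming PySpark with pytest; the storage paths and column names (e.g. /data/curated/orders, order_id) are hypothetical placeholders rather than references to the actual platform:

    # Illustrative data validation checks for a curated dataset, run with pytest.
    import pytest
    from pyspark.sql import SparkSession

    @pytest.fixture(scope="session")
    def spark():
        # Local session for test runs; real suites would target the cluster.
        return SparkSession.builder.master("local[2]").appName("dq-checks").getOrCreate()

    def test_row_counts_reconcile(spark):
        # Hypothetical Data Lake paths for the raw extract and the curated output.
        source = spark.read.parquet("/data/raw/orders")
        target = spark.read.parquet("/data/curated/orders")
        assert source.count() == target.count(), "Row counts diverged after transformation"

    def test_business_key_is_unique_and_not_null(spark):
        target = spark.read.parquet("/data/curated/orders")
        # Completeness: no null business keys; consistency: no duplicate keys.
        assert target.filter(target.order_id.isNull()).count() == 0
        assert target.select("order_id").distinct().count() == target.count()

Checks of this kind would typically be added to the regression suite so they run automatically whenever a pipeline or transformation changes.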
Requirements:
- 3-6 years of QA experience with a strong focus on Big Data testing, particularly in Data Lake environments on the Azure cloud platform.
- Proficient in Azure Data Factory, Azure Synapse Analytics, and Databricks for big data processing and data quality checks at scale.
- Proficiency in SQL, capable of writing and optimizing both simple and complex queries for data validation and testing purposes.
- Proficient in PySpark, with experience in data manipulation and transformation, and a demonstrated ability to write and execute test scripts for data processing and validation.
- Hands-on experience with functional and system integration testing in big data environments, ensuring seamless data flow and accuracy across multiple systems.
- Knowledge and ability to design and execute test cases in a behaviour-driven development environment (see the sketch after this list).
- Fluency in Agile methodologies, with active participation in Scrum ceremonies and a strong understanding of Agile principles.
- Familiarity with tools like Jira, including experience with Xray or Zephyr for test case and defect management.
- Proven experience working on high-traffic and large-scale software products, ensuring data quality, reliability, and performance under demanding conditions.
- Analytical Problem-Solving: Identify and resolve issues in complex Big Data environments, ensuring data accuracy and system efficiency.
- Proactive Engagement: Initiate and manage tasks, delivering high-quality results within deadlines in a dynamic setting.
- Continuous Improvement: Confidently challenge existing assumptions and processes to enhance data engineering and QA practices.
- Systematic Approach: Maintain organized testing schedules and documentation, ensuring thorough coverage of all scenarios.
- Collaborative Teamwork: Exhibit strong interpersonal skills, working effectively in cross-functional, multicultural teams with minimal supervision.
- Agile Adaptability: Comfortable in Agile and Scrum environments, proficient with tools like Jira, and adaptable to changing priorities.
- Efficient Multitasking: Manage multiple projects simultaneously, demonstrating strong planning and organizational skills to ensure timely task completion.
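As an illustration of the behaviour-driven style referenced in the requirements above, a minimal sketch follows, assuming Gherkin feature files executed with the behave framework and PySpark; the scenario wording, step text, and data paths are hypothetical:

    # features/orders_quality.feature (Gherkin, shown here as comments)
    # Feature: Curated orders data quality
    #   Scenario: Order records survive the curation pipeline intact
    #     Given the raw orders extract has been loaded
    #     When the curated orders output is read
    #     Then every curated order has a non-null order_id

    # features/steps/orders_steps.py -- hypothetical behave step definitions.
    from behave import given, when, then
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[2]").appName("bdd-dq").getOrCreate()

    @given("the raw orders extract has been loaded")
    def step_load_raw(context):
        context.raw = spark.read.parquet("/data/raw/orders")  # hypothetical path

    @when("the curated orders output is read")
    def step_read_curated(context):
        context.curated = spark.read.parquet("/data/curated/orders")  # hypothetical path

    @then("every curated order has a non-null order_id")
    def step_check_not_null(context):
        assert context.curated.filter(context.curated.order_id.isNull()).count() == 0

Pairing scenarios written in business language with step definitions like these keeps test intent readable for business analysts while the checks themselves run against the data platform.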
We have an amazing team of 700+ individuals working on highly innovative enterprise projects & products. Our customer base includes Fortune 5 retail and CPG companies, leading store chains, fast-growth fintech, and multiple Silicon Valley startups.
What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015 certified, and we foster a vibrant culture of learning through collaboration while keeping the workplace fun.
People who work with us use cutting-edge technologies while contributing to the company's success as well as their own.
To know more about Confiz Limited, visit: https://web.facebook.com/lifeatconfiz