
Job Description

What if the work you did every day could impact the lives of people you know? Or all of humanity? At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients. Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world-changing projects, you will do more and become more than you ever thought possible.

Position Summary:
We are seeking a highly skilled Senior Data Engineer with 5+ years of experience to join our talented team in Bangalore. In this role, you will be responsible for designing, implementing, and optimizing data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies. Additionally, you will bring strong domain expertise in operations organizations, with a focus on supply chain and manufacturing functions.



If you're a seasoned data engineer with a proven track record of delivering impactful data solutions in operations contexts, we want to hear from you.



Responsibilities:


  • Lead the design, development, and optimization of data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies.
  • Apply strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing, to understand data requirements and deliver tailored solutions.
  • Utilize big data processing frameworks such as Apache Spark to process and analyze large volumes of operational data efficiently.
  • Implement data transformations, aggregations, and business logic to support analytics, reporting, and operational decision-making.
  • Leverage cloud-based data platforms such as Snowflake to store and manage structured and semi-structured operational data at scale.
  • Utilize dbt (Data Build Tool) for data modeling, transformation, and documentation to ensure data consistency, quality, and integrity.
  • Monitor and optimize data pipelines and ETL processes for performance, scalability, and reliability in operations contexts.
  • Conduct data profiling, cleansing, and validation to ensure data quality and integrity across different operational data sets.
  • Collaborate closely with cross-functional teams, including operations stakeholders, data scientists, and business analysts, to understand operational challenges and deliver actionable insights.
  • Stay updated on emerging technologies and best practices in data engineering and operations management, contributing to continuous improvement and innovation within the organization.

All listed requirements are deemed essential functions of this position; however, business conditions may require reasonable accommodations for additional tasks and responsibilities.



Preferred Experience/Education/Skills:


  • Bachelor's degree in Computer Science, Engineering, Operations Management, or related field.
  • 5+ years of experience in data engineering, with proficiency in Python, Spark, SQL, Snowflake, dbt, and other relevant technologies.
  • Strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing.
  • Strong domain expertise in life sciences manufacturing equipment, with a deep understanding of industry-specific challenges, processes, and technologies.
  • Experience with big data processing frameworks such as Apache Spark and cloud-based data platforms such as Snowflake.
  • Hands-on experience with data modeling, ETL development, and data integration in operations contexts.
  • Familiarity with dbt (Data Build Tool) for managing data transformation and modeling workflows.
  • Familiarity with reporting and visualization tools such as Tableau, Power BI, etc.
  • Good understanding of advanced data engineering and data science practices and technologies such as PySpark, SageMaker, Cloudera, MLflow, etc.
  • Experience with SAP, SAP HANA, and Teamcenter applications is a plus.
  • Excellent problem-solving skills, analytical thinking, and attention to detail.
  • Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and operations stakeholders.
  • Eagerness to learn and adapt to new technologies and tools in a fast-paced environment.

Illumina believes that everyone has the ability to make an impact, and we are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship status, and genetic information.