Job Description

Primary Responsibilities

  • Own the entire back end of the application, including the design, implementation, testing, and troubleshooting of the core application logic, databases, data ingestion and transformation, data processing, pipeline orchestration, APIs, CI/CD integration, and other processes
  • Fine-tune and optimize queries using Snowflake platform and database techniques
  • Optimize ETL/data pipelines to balance performance, functionality, and other operational requirements
  • Assess and resolve data pipeline issues to ensure performance and timeliness of execution
  • Assist with technical solution discovery to ensure technical feasibility
  • Assist in setting up and managing CI/CD pipelines and developing automated tests
  • Develop and manage microservices using Python
  • Conduct peer reviews for quality, consistency, and rigor of production-level solutions
  • Design application architecture for efficient concurrent user handling, ensuring optimal performance during high-usage periods
  • Promote best practices and standards for code management, automated testing, and deployments
  • Own all areas of the product lifecycle: design, development, test, deployment, operation, and support
  • Create detailed documentation on Confluence to support and maintain the codebase and its functionality
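To give a flavor of the ingestion-and-transformation work described above, here is a minimal, illustrative Python sketch of one transform step in a data pipeline. The record fields and the `clean_record` helper are hypothetical, not taken from this posting:

```python
from datetime import datetime, timezone

def clean_record(raw: dict) -> dict:
    """Normalize one raw ingestion record before loading.

    Casts the id, trims and lowercases the name, and parses the
    event timestamp into UTC -- a typical transform step in an
    ETL/data pipeline.
    """
    return {
        "id": int(raw["id"]),
        "name": raw.get("name", "").strip().lower(),
        "event_ts": datetime.fromisoformat(raw["event_ts"]).astimezone(timezone.utc),
    }

def run_pipeline(records: list[dict]) -> list[dict]:
    """Transform a batch, skipping records that fail validation."""
    out = []
    for raw in records:
        try:
            out.append(clean_record(raw))
        except (KeyError, ValueError):
            continue  # in production, bad records would be logged or quarantined
    return out
```

In a real pipeline this step would sit between an ingestion source and a warehouse load, with the skipped records routed to a dead-letter table rather than silently dropped.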



About you

Qualifications

  • 3+ years of relevant experience developing backend, integration, data-pipelining, and infrastructure solutions
  • Bachelor’s degree in computer science, engineering, or a similar quantitative field of study
  • Expertise in database optimization and performance improvement
  • Expertise in Python, PySpark, and Snowpark
  • Experience with data warehouses and object-relational databases (Snowflake and PostgreSQL) and writing efficient SQL queries
  • Experience with cloud-based data platforms (Snowflake, AWS)
  • Proficiency in developing robust, reliable APIs using Python and the FastAPI framework
  • Understanding of data structures and algorithms
  • Experience with modern testing frameworks (SonarQube; K6 is a plus)
  • Strong collaboration skills and willingness to work with others to ensure seamless integration of the server side and client side
  • Knowledge of DevOps best practices and associated tools is a plus, especially in the setup, configuration, maintenance, and troubleshooting of:
    • Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
    • Infrastructure as code (Terraform)
    • Monitoring and logging (CloudWatch, Grafana)
    • CI/CD pipelines (JFrog Artifactory)
    • Scripting and automation (Python, GitHub, GitHub Actions)
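As an illustration of the automated-testing experience the qualifications above call for, the following is a minimal sketch using Python's standard `unittest` module. The `slugify` function under test is hypothetical, invented only for this example:

```python
import re
import unittest

def slugify(title: str) -> str:
    """Hypothetical helper under test: turn a title into a URL-safe slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

class SlugifyTests(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Data Pipelines 101"), "data-pipelines-101")

    def test_strips_edges(self):
        self.assertEqual(slugify("  Hello!  "), "hello")

if __name__ == "__main__":
    # exit=False lets the test run finish without raising SystemExit,
    # so the module can also be used inside notebooks or larger scripts
    unittest.main(exit=False)
```

In a CI/CD pipeline such tests would typically run on every push (e.g. via GitHub Actions), gating the deployment stage on a passing suite.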




Pursue progress, discover extraordinary

Better is out there. Better medications, better outcomes, better science. But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people.


At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, ability or gender identity.


Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!


Job Details

Job Location: India
Company Industry: Other Business Support Services
Company Type: Unspecified
Employment Type: Unspecified
Monthly Salary Range: Unspecified
Number of Vacancies: Unspecified
