At Elanco (NYSE: ELAN) – it all starts with animals!
As a global leader in animal health, we are dedicated to innovation and delivering products and services to prevent and treat disease in farm animals and pets. We’re driven by our vision of ‘Food and Companionship Enriching Life’ and our approach to sustainability – the Elanco Healthy Purpose™ – to advance the health of animals, people, the planet and our enterprise.
At Elanco, we pride ourselves on fostering a diverse and inclusive work environment. We believe that diversity is the driving force behind innovation, creativity, and overall business success. Here, you’ll be part of a company that values and champions new ways of thinking, work with dynamic individuals, and acquire new skills and experiences that will propel your career to new heights.
Making animals’ lives better makes life better – join our team today!
Role & Responsibilities
- Provide data engineering subject matter expertise and hands-on experience in data capture, ingestion, curation, and pipeline development on Azure to deliver cloud-optimized data solutions.
- Provide expert guidance on Azure data PaaS: Azure Storage, big data platform services, serverless architectures, Azure SQL Database, NoSQL databases, and secure, automated data pipelines.
- Participate in data and data-pipeline architectural discussions to help build cloud-native solutions or migrate existing data applications from on-premises systems to the Azure platform. Perform current-state ("as-is") and future-state ("to-be") analysis.
- Participate in and help develop the data engineering community of practice as a global go-to expert panel/resource.
- Develop and evolve new or existing data engineering methods and procedures to create alternative, agile solutions to moderately complex problems.
- Stay abreast of new and emerging data engineering technologies, tools, methodologies, and patterns on Azure and other major public clouds.
- Demonstrate ownership in understanding the organization’s strategic direction as it relates to your team and individual goals.
- Work collaboratively and use sound judgment in developing robust solutions while seeking guidance on complex problems.
Basic Qualifications (Must have)
- Bachelor's or higher degree in Computer Science or a related discipline.
- At least 3 years of experience in data pipeline and data product design, development, and delivery, including deploying ETL/ELT solutions on Azure Data Factory.
- Experience with Azure-native data and big data tools, technologies, and services, including Blob Storage, ADLS, Azure SQL Database, Cosmos DB, NoSQL, and SQL Data Warehouse.
- Sound problem-solving skills in developing data pipelines using Databricks, Stream Analytics, and Power BI.
- Minimum of 2 years of hands-on experience with programming languages and Azure/big data technologies such as PowerShell, C#, Java, Python, Scala, SQL, ADLS/Blob, Hadoop, Spark/Spark SQL, Hive, and streaming technologies such as Kafka and Event Hubs.
- Knowledge of distributed systems.
Elanco is an EEO/Affirmative Action Employer and does not discriminate on the basis of age, race, color, religion, gender, sexual orientation, gender identity, gender expression, national origin, protected veteran status, disability or any other legally protected status