Job Description
At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We enable your best to power our future.
About The Job
Around here we’re all numbers people, but it’s the 1s and 0s behind our data that make what we do possible. Software engineers strike a balance between precision and disruption, between reliability and innovation. Nielsen is a tech company backed by nearly a century of forward momentum to show the world what’s next, and we couldn’t have done it without our engineers.

In this role, you will work alongside data scientists and engineers within Nielsen’s Market Mix Modeling (MMM) business to build a data platform on AWS that ingests, combines and processes data from external sources. The role requires experience working with large datasets and complex schemas, a can-do approach to automation, and an emphasis on implementing cloud security best practices.
Responsibilities
- Collaborate with product and data science to clarify requirements and design processing flows
- Collaborate in cross-functional teams to implement, test and deploy features
- Implement efficient, scalable, robust data pipelines along with automated data validation
- Identify potential process improvements and recommend suitable tools and changes to address them
- Troubleshoot and optimize data pipelines in a distributed environment
- Develop and maintain good programming standards and practices across the ecosystem
- Perform code reviews
Qualifications
- Bachelor’s in Computer Engineering or a related field (Master’s preferred)
- 4+ years of experience programming in Python (C++ or Java experts aspiring to work in Python can also be considered)
- 4+ years of experience with SQL and advanced data systems (Spark, Presto, EMR)
- 2+ years of experience with public cloud technologies (AWS, GCP, Azure)
- Experience building and optimizing data pipelines in a distributed environment
- Ability to implement application components without detailed guidance
- Experience with common developer tools and practices (GitLab, Jira, unit testing)
- Proficiency working in Linux environments
- Excellent communication skills and the ability to present complex ideas clearly and concisely to a variety of audiences
Job Details
- Job Location: India
- Company Industry: General Engineering Consultancy
- Company Type: Unspecified
- Employment Type: Unspecified
- Monthly Salary Range: Unspecified
- Number of Vacancies: Unspecified