At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

At Nielsen, we believe that career growth is a partnership: you ultimately own, fuel, and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you succeed. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve.

As a Data Engineer, you will work alongside data scientists and engineers to build a data platform on AWS that ingests data from external sources and performs ETL tasks. The role requires experience working with large datasets with complex schemas, a can-do approach to automation, and an emphasis on implementing cloud security best practices. Our culture is inclusive, and we have a healthy work-life balance. If you're passionate about problem solving, enjoy continuous learning, and like building new things, we want to hear from you!
Responsibilities
Collaborate with product owners to understand requirements and design new components.
Collaborate in cross-functional teams to implement, test, and deploy features.
Perform code reviews.
Build and maintain development environments and CI/CD workflows.
Preferred Qualifications
BS or MS in Computer Science.
3+ years of experience with mainstream programming languages (Python, C++, C, Java, etc.).
Strong knowledge of programming concepts and paradigms.
Basic understanding of data lakes / data warehousing.
Experience with SQL and relational database systems.
Experience writing automated tests.
Excellent communication skills.
Experience working in Cloud environments.
Experience working in an agile environment.
Experience with popular AWS services (S3, RDS, EC2, etc.).
Experience with popular developer tools (Git, Docker, GitHub/GitLab).
Familiarity with orchestration tools such as Apache Airflow.
Familiarity with big data processing tools such as Spark, EMR, Presto.