At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

As a Software Data Engineer, you will be responsible for building and maintaining data pipelines that serve both scheduled and real-time use cases. Scheduled pipelines typically run in Airflow, while real-time processing may run in Airflow or as integrated querying within backend services.
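To give a flavor of the scheduled side of this work, below is a minimal sketch of an Airflow batch pipeline (assuming Airflow 2.x); the DAG id, task names, and the extract/transform/load callables are hypothetical placeholders, not Nielsen code.

# Minimal sketch of a scheduled Airflow pipeline (Airflow 2.x assumed).
# All names here are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (e.g., an RDS table or S3 object).
    return [{"id": 1, "value": 10}]


def transform(ti, **context):
    # Apply business rules / data-quality checks to the extracted records.
    rows = ti.xcom_pull(task_ids="extract")
    return [r for r in rows if r["value"] is not None]


def load(ti, **context):
    # Write the cleaned records to the analytical store.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="daily_audience_metrics",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # scheduled (batch) use case
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task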
Responsibilities
Work closely with team leads and backend developers to design and develop functional, robust data pipelines to support internal and customer needs
Write unit, integration, and data quality assurance (DQA) tests, and develop automation tools for daily tasks
Develop high-quality, well-documented, and efficient code
Manage and optimize scalable pipelines in the cloud
Optimize internal and external applications for performance and scalability
Communicate regularly with project managers, quality engineers, and other developers regarding progress on the long-term technology roadmap
Recommend systems solutions by comparing advantages and disadvantages of custom development and purchased alternatives
Key Skills
Domain Expertise
4-8 years of experience as a software/data engineer
Bachelor’s degree in Computer Science, MIS, or Engineering
Technical Skills
Advanced experience with relational database management systems (RDBMS) and advanced proficiency in SQL for data querying, manipulation, complex data analysis, optimization, and performance tuning (MySQL, PostgreSQL, Amazon RDS)
Advanced proficiency with data integration and ETL (Extract, Transform, Load) processes to move and transform data between systems.
Experience (2+ years) with data pipeline orchestration tools like Apache Airflow for workflow management and automation.
In-depth knowledge of data warehouse design principles and best practices, including schema design, indexing strategies, and partitioning techniques.
Experience with analytical data stores and strong understanding of distributed computing principles (e.g., Spark, Presto, Pandas)
Advanced proficiency with cloud service technologies, e.g., AWS (RDS, EC2, S3, Athena, Lambda)
Expertise in designing and optimizing data architectures for scalability, performance, and cost efficiency in cloud environments.
Experience with Docker container technology; familiarity with Kubernetes
Experience with Git source control systems (GitLab preferred)
Familiarity with data governance, security, and compliance standards in cloud environments.
Experience with data visualization tools such as Tableau, Power BI, or Looker for creating interactive dashboards and reports
Mindset and Attributes
Leadership and mentoring skills to assist junior members of the data engineering team.
Proven track record of successfully completing end-to-end data projects from conception to implementation, including requirements gathering, design, development, testing, and deployment.
Strong problem-solving abilities to identify and resolve complex technical challenges in data processing and analytics.
Excellent project management skills to prioritize tasks, manage timelines, and deliver results within scope and budget constraints.
Effective communication skills to collaborate with stakeholders across different departments and levels of the organization.
Strategic thinking and ability to align data engineering initiatives with business objectives and organizational goals.
Continuous learning mindset to stay updated with emerging technologies, industry trends, and best practices in data engineering and analytics.