
Job Description

At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.
About the Role:
We are seeking a highly motivated and versatile Platform Engineer to contribute to all facets of our Data Platform. This role encompasses both Data Engineering responsibilities, including database querying, query optimization, and ETL pipeline development, and Platform Tooling development to enhance the overall efficiency and scalability of our data infrastructure. We are looking for a candidate with a solid grasp of Data Engineering concepts, experience with tools such as Apache Airflow, AWS Glue, and related technologies, and both an understanding of backend coding and the willingness to keep learning it. You will play a critical role in building and maintaining a robust and performant data platform that empowers data-driven decision-making across the organization.

Responsibilities

API Development


  • Design, develop, and maintain RESTful APIs using languages such as Java, Go, NodeJS, or Python (experience with at least one required).
  • Write clean, well-documented, and testable code.
  • Implement authentication, authorization, and security best practices.
  • Optimize API performance and scalability.
  • Participate in code reviews and provide constructive feedback.
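To give a flavor of the RESTful design work involved, here is a minimal sketch in Python. It models a hypothetical in-memory "users" resource with handlers returning HTTP-style status codes; the resource name, field names, and handlers are all illustrative assumptions, and a real service would add a framework, authentication, and input validation.

```python
# Minimal RESTful-handler sketch for a hypothetical "users" resource.
# Illustrative only: a production API would use a framework plus
# authentication, authorization, and validation as listed above.

users = {}      # in-memory store: id -> record
next_id = 1

def create_user(payload):
    """POST /users — store a new record and return (status, body)."""
    global next_id
    user = {"id": next_id, **payload}
    users[next_id] = user
    next_id += 1
    return 201, user

def get_user(user_id):
    """GET /users/<id> — return the record, or 404 if it is missing."""
    user = users.get(user_id)
    return (200, user) if user else (404, {"error": "not found"})

status, body = create_user({"name": "Ada"})
```

The status-code conventions (201 for creation, 404 for a missing resource) are the part that carries over regardless of which of the listed languages is used.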

Data Engineering


  • Design, build, and maintain data pipelines for collecting, processing, and storing large datasets.
  • Work with databases (SQL and NoSQL) to ensure data integrity and performance.
  • Implement data quality checks and monitoring.
  • Develop and maintain ETL (Extract, Transform, Load) processes.
  • Explore and implement new data technologies to improve our data infrastructure.
  • Contribute to the design and maintenance of our overall data architecture, ensuring efficient data flow between the data lake and data warehouse.
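The ETL and data-quality responsibilities above can be sketched in plain Python. This is illustrative only — the data shape is invented, and in practice the same steps would be expressed as an Airflow DAG or an AWS Glue job — but it shows the extract → transform → quality-check → load flow with a gate that fails rather than loading bad data.

```python
# Minimal ETL sketch with a data-quality gate.
# Illustrative only: source, schema, and thresholds are assumptions.

def extract():
    # Stand-in for reading raw rows from a source database or data lake.
    return [{"id": 1, "amount": "10.5"},
            {"id": 2, "amount": None},
            {"id": 3, "amount": "7.0"}]

def transform(rows):
    # Cast amounts to float, dropping rows with missing values.
    clean = []
    for row in rows:
        if row["amount"] is not None:
            clean.append({"id": row["id"], "amount": float(row["amount"])})
    return clean

def quality_check(rows, min_rows=1):
    # Gate the load: abort the pipeline instead of writing bad data.
    if len(rows) < min_rows:
        raise ValueError("quality check failed: too few rows")
    return rows

def load(rows, warehouse):
    # Stand-in for writing the cleaned rows to the warehouse.
    warehouse.extend(rows)

warehouse = []
load(quality_check(transform(extract())), warehouse)
```

In an orchestrator such as Airflow, each function would become a task and the quality check would fail the run before the load task executes.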

Platform Maintenance & Improvement


  • Monitor and troubleshoot production systems.
  • Identify and resolve performance bottlenecks.
  • Contribute to the development of platform standards and best practices.
  • Participate in on-call rotation (if applicable).
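Identifying a performance bottleneck often starts with tail latency rather than averages. A minimal sketch, assuming a hypothetical SLO threshold of 100 ms — real monitoring would come from tools like Prometheus or Grafana:

```python
# Sketch: flagging a latency bottleneck from raw request timings.
# The sample values and 100 ms SLO threshold are assumptions.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

latencies_ms = [12, 15, 14, 13, 220, 16, 11, 14, 13, 15]
p95 = percentile(latencies_ms, 95)
slow = p95 > 100   # a single slow outlier dominates the tail
```

Here the mean is under 35 ms, yet the p95 exposes the 220 ms outlier — which is why tail percentiles, not averages, are the usual trigger for bottleneck investigation.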

Qualifications


  • Bachelor's degree in Computer Science or a related field.
  • Minimum of 2 years of experience in backend API development and data engineering.
  • Proficiency in at least one of the following programming languages: Java, Go, NodeJS, or Python.
  • Experience with RESTful API design and development.
  • Strong understanding of data structures, algorithms, and database concepts (SQL and NoSQL).
  • Experience with data pipeline technologies (e.g., Apache Kafka, Apache Spark, Apache Airflow, AWS Glue).
  • Experience with cloud platforms such as AWS, Azure, or GCP.
  • Experience with version control systems (e.g., Git).
  • Experience with testing frameworks and methodologies.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Experience working with data lakes and data warehouses.

Bonus Points


  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
  • Experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack).
  • Contributions to open-source projects.
  • Experience with Agile development methodologies.

