
Job Description

About us


dmg events is an international exhibitions and publishing company. We attract more than 425,000 visitors to our global portfolio of 80 exhibitions each year. Our aim through these events is simple: we want to accelerate business through face-to-face events, which is why we work so hard to bring people together, creating opportunities for them to network, learn, and do business. dmg events is part of the Daily Mail Group Trust (DMGT). DMGT manages a diverse, multinational portfolio of companies with total revenues of £2bn that provide businesses and consumers with compelling information, analysis, insight, events, news, and entertainment.


Introduction


We are looking for an experienced AWS Data Engineer to join our team and support the development of the Customer Data Platform (CDP) project. As a Data Engineer, you will work closely with the Lead Engineer, Business Analysts, and third-party professional services to lay the groundwork that enables the business to become truly data-driven, creating a single source of truth for our customer data.


Responsibilities


  • Designing, building, and maintaining the data lake solution and its associated pipelines
  • Developing and owning the data strategy and coding best practices
  • Contributing to the overall architecture by identifying gaps and efficiencies in the design
  • Ensuring that data quality is considered at every point of the data journey, working closely with the business to ensure the correct rules and identifiers are in place
  • Coaching junior members of the Data team to become more cloud-engineering focused
  • Maintaining, testing, and implementing disaster recovery procedures

Skills/Qualifications


  • 5+ years as a Data Engineer within the AWS cloud environment
  • Hands-on experience with the following:
    • S3/Redshift
    • AWS Glue
    • AWS Lambda
    • API Gateway
    • Amazon AppFlow (Desirable)
    • FindMatches (Desirable)
  • Expertise in moving and transforming data using Python, Spark & Scala
  • Experience in using REST APIs for data transfer
  • Solid understanding of master data management and its integration into the broader infrastructure
  • Knowledge of and experience with testing, releasing, and CI/CD pipeline deployments into AWS using tools such as Bitbucket, Jenkins, and ServiceNow
  • Understanding of Agile methodologies, processes and procedures
  • Demonstrated ability to be a self-starter and take ownership of your work
  • Experience in using Salesforce would be considered an advantage

Please take the time to read the job description; you must meet all the criteria set out above for your application to be considered. We check all applications, and suitable candidates will be contacted within 5 days. If you are not contacted by us within 5 working days, please consider your application unsuccessful at this time.


#LI-DNI


