Job Description

As a full-spectrum GCP integrator, we help hundreds of companies realize the value, efficiency, and productivity of the cloud. We take customers on their journey to enable, operate, and innovate using cloud technologies – from migration strategy to operational excellence and immersive transformation.
If you like a challenge, you’ll love it here, because we solve complex business problems every day, building and promoting great technology solutions that impact our customers’ success. The best part is, we’re committed to you and your growth, both professionally and personally. 
Overview
Our Big Data Engineers are experienced technologists with technical depth and breadth, along with strong interpersonal skills. In this role, you will work directly with customers and our team to enable innovation through continuous, hands-on deployment across technology stacks. You will build data pipelines by developing data engineering code, as well as writing complex data queries and algorithms.
If you get a thrill working with cutting-edge technology and love to help solve customers’ problems, we’d love to hear from you. It’s time to rethink the possible. Are you ready? 

What You’ll Be Doing:

  • Build complex ETL code.
  • Build complex SQL queries using Oracle, SQL Server, MariaDB, and MySQL, and work with NoSQL data stores such as MongoDB.
  • Work on data and analytics tools in the cloud.
  • Develop code in Python, Scala, and R.
  • Work with technologies such as Spark, Hadoop, Kafka, etc.
  • Build complex Data Engineering workflows.
  • Create complex data solutions and build data pipelines (a minimal illustrative sketch follows this list).
  • Establish credibility and build impactful relationships with our customers to enable them to be cloud advocates.
  • Capture and share industry best practices with the community.
  • Attend and present at industry events.
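
For a concrete, if simplified, flavor of the pipeline work described above, here is a minimal sketch of a Spark-based ETL job in Python. The file paths, column names, and filter logic are hypothetical placeholders, not part of any actual customer codebase.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Start a Spark session (cluster configuration omitted for brevity).
    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: read raw order data; path and schema are hypothetical.
    orders = spark.read.csv("data/raw/orders.csv", header=True, inferSchema=True)

    # Transform: drop malformed rows and derive a revenue column.
    cleaned = (
        orders
        .filter(F.col("order_id").isNotNull())
        .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
    )

    # Load: write partitioned Parquet for downstream analytics.
    cleaned.write.mode("overwrite").partitionBy("order_date").parquet("data/curated/orders")

    spark.stop()

In practice, a job like this would typically run on Dataproc or Dataflow and be orchestrated with Composer, per the GCP services listed under Qualifications below.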

Qualifications & Experience:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3+ years of design and implementation experience with distributed applications.
  • 2+ years of experience in database architecture and data pipeline development.
  • Strong understanding of data modeling, ELT, ETL processes, data lakes, and data warehousing concepts.
  • Proficiency with GCP services such as BigQuery, Dataflow, Pub/Sub, GCS, Composer, Dataproc, and Cloud Functions (a brief sketch follows this list).
  • Knowledge of big data technologies and frameworks (e.g., Hadoop, Spark).
  • Experience with SQL and with programming languages such as Python or Scala.
  • Presentation skills with a high degree of comfort speaking with executives, IT management, and developers.
  • Excellent communication skills, with the ability to hold conversations at the right level for the audience.
  • Demonstrated ability to adapt to modern technologies and learn quickly.
  • Certification in GCP (e.g., Professional Data Engineer) is preferred.
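
As a small point of reference for the GCP services named above, here is a minimal sketch of querying BigQuery from Python using the google-cloud-bigquery client. It assumes credentials are already configured (e.g., Application Default Credentials); the project, dataset, and table names are hypothetical placeholders.

    from google.cloud import bigquery

    # The client picks up credentials from the environment;
    # the project name here is a hypothetical placeholder.
    client = bigquery.Client(project="my-hypothetical-project")

    # A parameterized query against a hypothetical events table.
    query = """
        SELECT user_id, COUNT(*) AS event_count
        FROM `my-hypothetical-project.analytics.events`
        WHERE event_date >= @start_date
        GROUP BY user_id
        ORDER BY event_count DESC
        LIMIT 10
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
        ]
    )

    # Run the query and iterate over the result rows.
    for row in client.query(query, job_config=job_config).result():
        print(row.user_id, row.event_count)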
