
Job Description

Platform DevOps Engineers for a Databricks on AWS based data and analytics platform


Join the team developing and operating a modern Databricks on AWS based data and analytics platform in Pune, India. The team's purpose is to develop and operate a scalable and secure cloud-based data and analytics platform on which other data and analytics teams can easily develop and run their workloads. Our data and analytics platform is a key enabler of digital transformation, creating the ability to develop new scalable analytics, AI and digital use cases by leveraging data across the whole KONE organization.


In our work we design, architect, engineer and operate the cloud foundation for data and analytics domains, develop guardrails for the secure use of cloud-native services, and execute technology management to align our cloud foundation with business goals and objectives, maximizing the efficiency and effectiveness of technological resources. Our platform stack is fully cloud native, built on AWS and Databricks serverless components and highly automated using infrastructure-as-code (IaC) services.
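To make the infrastructure-as-code angle concrete, below is a minimal sketch of the kind of guardrail such a platform team might codify, assuming AWS CDK v2 for Python. The stack and bucket names are illustrative assumptions, not KONE's actual code.

# A minimal IaC guardrail sketch, assuming AWS CDK v2 for Python.
# Stack and resource names are hypothetical.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct


class DataPlatformBaselineStack(cdk.Stack):
    """Hypothetical baseline stack: data buckets are secure by construction."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Guardrail: encryption, versioning, TLS-only access and blocked
        # public access are enforced at definition time, not by review.
        s3.Bucket(
            self,
            "LandingBucket",
            encryption=s3.BucketEncryption.S3_MANAGED,
            versioned=True,
            enforce_ssl=True,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )


app = cdk.App()
DataPlatformBaselineStack(app, "DataPlatformBaselineStack")
app.synth()

Encoding such rules in a reusable construct is one common way a platform team lets other teams consume cloud services "with ease" while staying within security boundaries.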


The Platform DevOps Engineer role is a hands-on, roll-up-your-sleeves position that requires a passion for technology, cloud infrastructure, security, engineering and DevOps. In this role you will carry out a variety of cloud platform engineering tasks to ensure a cyber-secure, stable and future-proof cloud infrastructure for our AWS and Databricks based data & analytics platforms. The scope of work includes new development, optimization, support and maintenance, focusing on Databricks and AWS platform core capabilities. In the Platform DevOps Engineer role you need to quickly master complex technical documentation, keep up to date with new developments and features added to Databricks and AWS, and suggest changes that could enhance the developer experience and improve cloud platform capabilities.


We offer a chance to work hands-on with state-of-the-art cloud technology in a global, multicultural work environment. We expect you to have solid engineering experience and a DevOps mindset, continuously striving to improve team performance. An ideal candidate has a strong background in cloud infrastructure and connectivity, software development, data engineering or analytics, and above all the will to commit to a DevOps mindset, agile ways of working and reaching goals together.


We are searching for an enthusiastic person to join the team: someone who is excited about developing their professional skills further, learning new things and contributing to team success. We expect you to take a self-driven, proactive approach to your work, find and implement solutions, solve problems, and share learnings with colleagues. We want to work with people who enjoy teamwork, are not afraid to step out of their comfort zone, and want to help others and share information.


To succeed in this role, the following professional experience will play a key role:


Bachelor's or Master's degree in engineering, computer science, information systems, or a related field.

The position is based in Pune, India.

Platform DevOps Engineer professional experience:


  • Deep knowledge of AWS services, AWS cloud infrastructure, networking, identity and access management (IAM) and cybersecurity. Knowledge of Azure, Fabric and Power BI is a plus.
  • Working experience with S3, Lambda, SQS, SNS, CloudWatch, Glue, DynamoDB, EC2, ECS, ELB, KMS, VPC, Patch Manager, SSM, API Gateway, Lake Formation, Cost Explorer and the AWS CLI.
  • Working knowledge of cybersecurity vulnerability detection tools such as Amazon Inspector, AWS Security Hub, Microsoft Defender, Black Duck, Coverity, Detectify or similar is a plus.
  • Previous experience in Databricks platform administration, including a solid understanding of data engineering, machine learning and AI:
    • Databricks platform administration – Unity Catalog administration, the Databricks CLI, notebooks, REST APIs (see the sketch after this list)
    • Data engineering – Spark, dbt, DataOps
    • Machine learning – ML engineering, MLflow, MLOps, Model Serving
    • Open big-data file formats – Delta Lake, Iceberg, Parquet
  • Previous experience with programming languages such as SQL, Python and shell scripting; experience with configuration languages such as YAML and JSON is a big plus.
  • Proficiency with deploying and managing common AWS infrastructure and services.
  • Proficiency in infrastructure as code, with expertise in AWS CDK for Python and Terraform; deep knowledge of AWS CloudFormation is a plus.
  • Fluent in industry-standard DevOps practices (code reviews, version control, trunk-based development, CI/CD automation, testing).

Way of working professional experience:

  • Passion for agile development methodologies and tools (Jira, Confluence).
  • Inbuilt cybersecurity awareness.
  • Ability to work in a global, multicultural team and to collaborate effectively within it.
  • Ability to self-organize and be proactive, seek feedback, be courageous and resilient, and have excellent problem-solving skills.
  • Proficiency in spoken and written English, and strong facilitation and communication skills.
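As one illustration of the Databricks platform administration work referenced in the list above, the sketch below lists Unity Catalog catalogs through the Databricks REST API. The DATABRICKS_HOST and DATABRICKS_TOKEN environment variables are assumptions for the example; a real workspace would supply its own URL and access token.

# A minimal sketch of Unity Catalog administration via the Databricks REST API.
# DATABRICKS_HOST / DATABRICKS_TOKEN are assumed environment variables, e.g.
# https://<workspace>.cloud.databricks.com and a personal access token.
import os

import requests


def list_catalogs(host: str, token: str) -> list[dict]:
    # GET /api/2.1/unity-catalog/catalogs returns the catalogs visible
    # to the caller, subject to Unity Catalog permissions.
    resp = requests.get(
        f"{host}/api/2.1/unity-catalog/catalogs",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("catalogs", [])


if __name__ == "__main__":
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    for catalog in list_catalogs(host, token):
        print(catalog["name"])

The same call is available through the Databricks CLI (databricks catalogs list); the raw REST form is shown here only to make the API surface explicit.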

At KONE, we are focused on creating an innovative and collaborative working culture where we value the contribution of each individual. Employee engagement is a key focus area for us, and we encourage participation and the sharing of information and ideas. Sustainability is an integral part of our culture and our daily practice. We follow ethical business practices and seek to develop a culture of working together, where co-workers trust and respect each other and good performance is recognized. In being a great place to work, we are proud to offer a range of experiences and opportunities that will help you achieve your career and personal goals and enable you to live a healthy and balanced life.


Read more at www.kone.com/careers


