Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT, and Business Solutions. With over 22,700 associates, it is the largest software development center of Bosch outside Germany, making it the Technology Powerhouse of Bosch in India, with a global footprint and presence in the US, Europe, and the Asia Pacific region.
The Digital Data Platform is a main pillar of the Bosch Rexroth Digital Strategy, serving as the data backbone for several digital initiatives. Its key component is an Azure cloud-based data lake platform enabling cross-domain and advanced data analytics use cases as well as governed self-service by distributed teams.
Experience and Skills Required:
8-12 years in data engineering with significant experience in Cloud Lakehouse technology.
Hands-on Azure data engineering experience with Azure Synapse Analytics, Azure Data Factory, Microsoft Entra ID, Azure SQL Database, Azure DevOps, and the Power Platform (Power Apps/Power Automate).
Experience with Microsoft Fabric is an added advantage.
Programming Expertise: Demonstrated expertise in one or more programming languages, such as Python or JavaScript, including knowledge of their syntax, libraries, frameworks, and tools.
Database Engineering: Proficiency in designing, building, operationalizing, securing, and monitoring data pipelines and data stores.
DataOps Experience: Experience with DataOps practices, emphasizing efficient and reliable data operations, as well as CI/CD.
Responsibilities:
Data Pipeline Development: Apply data engineering standards and tools to create and maintain data pipelines. Perform extract, transform, and load (ETL) processes to ensure the security, compliance, scalability, efficiency, reliability, flexibility, and portability of maintained data. Conduct data quality checks and remediations.
Data Solutions Implementation: Evaluate, design, develop, and implement data pipelines, data flows, and data stores. Acquire and prepare data using on-premises, Azure cloud-based, and hybrid data engineering solutions.
Data Integration: Structure, store, and maintain data to facilitate connections within and between data stores, applications, and organizations. Integrate data from various sources, including databases, APIs, third-party services, and external data providers.
Specialist Knowledge Development: Develop and maintain specialist knowledge of database and data warehouse concepts, design principles, architectures, software, and facilities.
Collaboration: Collaborate with the Tech Lead and customers to understand data requirements and provide appropriate solutions.
Impact Analysis: Understand the DC Analytics platform and perform impact analysis based on functional and technical requirements.
Code Review: Perform peer reviews of code developed by other team members to ensure quality and compliance with standards.
B.E./B.Tech., MCA.
8-12 years