
Job Description

Introduction
At IBM, work is more than a job – it’s a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you’ve never thought possible. Are you ready to lead in this new era of technology and solve some of the world’s most challenging problems? If so, let’s talk.

Your Role and Responsibilities
As a Data Engineer in IBM’s CIO organization, you will support a data warehouse that consolidates IBM’s global real estate data into IBM’s Cognitive Enterprise Data Platform (CEDP), including integrations with enterprise systems such as TRIRIGA, Maximo, and Envizi. The data covers IBM’s global internal real estate portfolio, including properties, space, leases, energy consumption, construction/renovation projects, and environmental compliance.

Responsibilities:


  • Manage integrations for data ingestion from multiple source systems via APIs, queries, and Apache Spark and Airflow workflows. Launch Spark jobs in Airflow as needed.
  • Support data transformations and aggregations in a Cloud Object Storage (COS) integration zone, and subsequent data feeds to a DB2 warehouse.
  • Maintain a Cirrus platform hosting Cloud Object Storage, along with inbound and outbound data flows. Initiate activities such as opening firewall flows, defining entitlements, and managing roles and user access as needed. Identify and perform other activities required to ensure reliable operation of the cluster.
  • Identify and promptly address issues with data and integrations. Implement and optimize monitoring, and troubleshoot errors through in-depth reviews of logs, code, data, and integration components.
  • Document and share data architectures and flows (including schemas, tables, queries, and scheduled activities) with data analysts and Cognos developers.
  • Develop new capabilities for data ingestion and transformation as needed. Create algorithms and develop new transfers using Spark/Airflow, APIs, SQL, etc. Perform comprehensive testing of individual components as well as the end-to-end solution.
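The transformation-and-aggregation work described above can be sketched in SQL. Below is a minimal illustration using Python’s built-in sqlite3 as a stand-in for the DB2 warehouse; the `properties` and `energy_readings` tables and their columns are hypothetical examples, not part of the actual CEDP schema:

```python
import sqlite3

# In-memory database as a stand-in for the DB2 warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema: a property dimension table and raw energy readings.
cur.execute("CREATE TABLE properties (property_id INTEGER PRIMARY KEY, city TEXT)")
cur.execute("CREATE TABLE energy_readings (property_id INTEGER, kwh REAL)")

cur.executemany("INSERT INTO properties VALUES (?, ?)",
                [(1, "Bengaluru"), (2, "Armonk")])
cur.executemany("INSERT INTO energy_readings VALUES (?, ?)",
                [(1, 120.0), (1, 80.0), (2, 50.0)])

# Aggregate raw readings per city -- the kind of rollup that would feed a
# warehouse table consumed downstream by analysts and Cognos developers.
cur.execute("""
    SELECT p.city, SUM(e.kwh) AS total_kwh
    FROM energy_readings e
    JOIN properties p ON p.property_id = e.property_id
    GROUP BY p.city
    ORDER BY p.city
""")
rows = cur.fetchall()
print(rows)  # [('Armonk', 50.0), ('Bengaluru', 200.0)]
conn.close()
```

In practice such aggregations would run as Spark jobs orchestrated by Airflow against Cloud Object Storage, with results loaded into DB2 rather than sqlite.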


Required Technical and Professional Expertise


  • 5+ years of experience with managing data warehouses and integration solutions
  • API integration experience
  • Apache Spark and Airflow experience
  • SQL development, relational database table structure, and database design
  • Expertise in working with structured and unstructured data
  • Excellent communication skills (written and verbal). Able to clearly communicate with leadership and colleagues about technical capabilities, limitations, issues, and recommendations.
  • Highly organized, detail oriented, independent, and resourceful.
  • Able to manage complex technical projects with diverse global stakeholders and detailed, interdependent requirements
  • Experience with Agile practices and associated tools, including Jira


Preferred Technical and Professional Expertise


  • Cognos Analytics experience, including creating/maintaining data modules and developing reports & dashboards
  • JavaScript development experience
  • BS or BA in a related discipline

Job Details

Job Location
Bengaluru, India
Company Industry
Other Business Support Services
Company Type
Employer (Private Sector)
Employment Type
Unspecified
Monthly Salary Range
Unspecified
Number of Vacancies
Unspecified