
Job Description

Job Summary
• This role is responsible for analyzing and interpreting complex data sets using statistical techniques to identify trends, patterns, and insights that can be used to inform business decisions and improve organizational performance. The role designs and develops methods, processes, and systems to consolidate and analyze unstructured, diverse big data sources to generate actionable insights and solutions. The role develops programs, algorithms, and automated processes to cleanse, integrate, and evaluate large datasets from multiple disparate sources.
 


Description


Responsible for all stages of design and development for complex products and platforms, including solution design, analysis, coding, testing, and integration. Exercises independent judgment within generally defined policies and practices to identify and select a solution, and is able to handle most unique situations. Designs and establishes secure and performant data architectures, enhancements, updates, and programming changes for portions and subsystems of data product pipelines, repositories, or models for structured/unstructured data.


Responsibilities


• High level of initiative, with the ability to plan and manage tasks and to work collaboratively with a group of peers, both within and outside one's own group.
• Designs, analyzes, programs, debugs, troubleshoots, and modifies software applications for enhancements and new products.


Education & Experience Recommended
• Four-year or Graduate Degree in Mathematics, Statistics, Economics, Computer Science, Data Science, or any other related discipline or commensurate work experience or demonstrated competence.
• Typically has 4-7 years of work experience, preferably in data analytics, statistical modeling, machine learning, or a related field or an advanced degree with 3-5 years of work experience.
Preferred Certifications
• Programming Language/s Certification (SQL, Python, or similar)
Knowledge & Skills
• Strong knowledge of Databricks
• Strong in PySpark and Python
• Strong in SQL/MySQL and relational data modeling
• Strong notebook-environment experience (Jupyter, Databricks)
• Fluent in complex, distributed, and massively parallel cloud systems (AWS, GCP, Azure)
• Experience with workflow orchestration tools (Airflow, Jenkins)
• Basics of AWS/Azure technologies
• Proficient in using Git and Terraform
• Experience collecting requirements from partners and choosing the right technologies to meet end-to-end data flow requirements
• Strong analytical and problem-solving skills



Cross-Org Skills
• Effective Communication
• Results Orientation
• Learning Agility
• Digital Fluency
• Customer Centricity
Impact & Scope
• Impacts multiple teams and may act as a team or project leader, providing direction to team activities and facilitating information validation and the team decision-making process.
Complexity
• Responds to moderately complex issues within established guidelines.
Disclaimer
• This job description describes the general nature and level of work performed in this role. It is not intended to be an exhaustive list of all duties, skills, responsibilities, or knowledge. These may be subject to change, and additional functions may be assigned as needed by management.



Job Details

Job Location
Bengaluru, India
Company Industry
Other Business Support Services
Company Type
Unspecified
Employment Type
Unspecified
Monthly Salary Range
Unspecified
Number of Vacancies
Unspecified