Advanced Data Scientist (B3) - (ML/AI)
Position Description:
Honeywell's Value Engineering (VE) and Component Engineering (CE) Center of Excellence (COE) is a dynamic collective of professionals dedicated to refining product development through innovative engineering and strategic component selection.
You will be part of Honeywell's VE/CE COE Advanced Tech team, with a focus on developing and deploying Gen AI, data science, and visualization offerings to exceed VE/CE revenue targets.
Key Responsibilities:
• Understand business problems and formulate them into appropriate AI/ML models
• Build visualizations that develop compelling stories and insights from data
• Clean and preprocess large volumes of data
• Implement analytics and dashboard solutions for the VE/CE COE
• Develop ML/AI models to analyze and mine large amounts of data
• Develop scripts to automate and deploy production-ready ML solutions
• Troubleshoot, package, and maintain ML solutions
YOU MUST HAVE:
• Bachelor's or master's degree in Engineering, Applied Mathematics, or a related field
• 4+ years of relevant experience in data science and analytics
• Experience delivering 4+ production-grade data analytics applications
• Strong expertise in scripting and querying languages such as Python, R, and SQL
• Demonstrated experience with popular Python and R packages such as scikit-learn, TensorFlow, PyTorch, Dash, and Shiny
• Experience building AI agents with frameworks such as LangChain and LangGraph, and with LLMs such as Gemini, GPT-4, and Llama
• Strong knowledge of one or two SQL databases and of NoSQL databases
• Working knowledge of data warehouse platforms (e.g., Snowflake)
• Strong knowledge of data structures, algorithms, and software engineering principles
• Strong analytical skills: ability to assess data, drive insights, and make recommendations
• Experience using data integration tools for ETL processes
• Experience with CI/CD, DevOps, and MLOps processes
• Experience with both structured and unstructured data
• Experience in deep learning and classical machine learning
NICE TO HAVE:
• Experience with Google Vertex AI and GCP services
• Experience working with Apache Spark, Azure, Hadoop, etc.
• Knowledge of Agile development methodology
• Working knowledge of visualization and reporting tools such as Tableau and Power BI
• Exposure to database schema design and ETL tools
• Exposure to enterprise tools such as SAP, SFDC, Teamcenter, etc.