Predicting knowledge in an ontology stream
Freddy Lécué, Jeff Z. Pan
IJCAI 2013
In the realm of business automation, conversational assistants are emerging as the primary method for making automation software accessible to users in various business sectors. Access to automation primarily occurs through application programming interfaces (APIs) and robotic process automations (RPAs). To effectively convert APIs and RPAs into chatbots at scale, it is crucial to establish an automated process for generating data and training models that can recognize user intentions, identify questions for conversational slot filling, and provide recommendations for subsequent actions. In this paper, we present a technique for enhancing and generating natural language conversational artifacts from API specifications using large language models (LLMs). The goal is to utilize LLMs in the “build” phase to assist humans in creating skills for digital assistants. As a result, the system does not need to rely on LLMs during conversations with business users, leading to efficient deployment. Along with enabling digital assistants, our system employs LLMs as proxies to simulate human interaction and automatically evaluate the digital assistant's performance. Experimental results highlight the effectiveness of our proposed approach. Our system is deployed in the IBM Watson Orchestrate product for general availability.
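To make the build-time idea in the abstract concrete, the sketch below shows one way an LLM could be prompted, offline, to turn a single API operation into conversational artifacts (trigger utterances and slot-filling questions). It is a minimal, hypothetical illustration: the spec fields, prompt wording, and the call_llm helper are assumptions for this example, not the paper's actual implementation.

"""Illustrative sketch: build-time generation of conversational artifacts
from an API operation description using an LLM. All names are hypothetical."""

import json
from typing import Callable

# Example API operation, in the spirit of an OpenAPI description (hypothetical).
operation = {
    "operation_id": "create_expense_report",
    "description": "Create an expense report for an employee.",
    "parameters": [
        {"name": "employee_id", "description": "ID of the employee", "required": True},
        {"name": "amount", "description": "Total amount of the expense", "required": True},
        {"name": "currency", "description": "Currency code, e.g. USD", "required": False},
    ],
}

PROMPT_TEMPLATE = """You are helping build a digital assistant skill.
API operation: {operation_id}
Description: {description}
Parameters: {parameters}

Return JSON with two keys:
  "utterances": 5 varied user requests that should trigger this operation,
  "slot_questions": one clarifying question per required parameter.
"""


def generate_artifacts(op: dict, call_llm: Callable[[str], str]) -> dict:
    """Ask the LLM (at build time, not at chat time) for conversational artifacts."""
    prompt = PROMPT_TEMPLATE.format(
        operation_id=op["operation_id"],
        description=op["description"],
        parameters=json.dumps(op["parameters"]),
    )
    raw = call_llm(prompt)  # call_llm is any text-in/text-out LLM client
    return json.loads(raw)  # expected: {"utterances": [...], "slot_questions": {...}}


if __name__ == "__main__":
    # Stub LLM so the sketch runs without any external service.
    def fake_llm(prompt: str) -> str:
        return json.dumps({
            "utterances": ["File an expense report for me"],
            "slot_questions": {"employee_id": "Which employee is this report for?",
                               "amount": "What is the total amount?"},
        })

    print(generate_artifacts(operation, fake_llm))

Because the artifacts are generated once during skill creation, the deployed assistant can serve business users without invoking an LLM at conversation time, which is the efficiency point the abstract makes.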