Using Deep Learning for Implementing Paraphrasing in a Social Robot


Description

Social robotics aims at interacting and communicating with people. This communication needs to be natural and fluid, similar to a human conversation. For instance, people do not always use exactly the same words to convey the same message in different situations. One way to imitate this behaviour is paraphrase generation, which allows the robot to use different sentences while keeping the same meaning. In this paper we propose applying deep learning models to generate these paraphrases in a social robot that speaks Spanish. The application has been integrated into the Mini robot in both English and Spanish, and can run locally or on an external server. We integrated and evaluated three models finetuned for paraphrase generation, taking the T5 and Pegasus Transformers as a base: one model is based on Pegasus and the other two on T5 (Finetuned T5 and Parrot). Compared with the Parrot and Pegasus models, which provide similar results, the Finetuned T5 model generates paraphrases that deviate less from the original sentence but preserve more of its meaning. The main advantage of the Finetuned T5 model, however, is its average execution time, around one second faster than that of the other models considered.
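As a rough illustration of the approach described above, the Python sketch below queries a T5-style seq2seq model finetuned for paraphrasing through the Hugging Face transformers library. The checkpoint name, task prefix, and generation settings are placeholders and assumptions for illustration only, not the exact models or configuration evaluated in the paper.

```python
# Minimal sketch: generating paraphrase candidates with a seq2seq Transformer.
# Assumes a T5-style checkpoint finetuned for paraphrasing is available under
# the placeholder name below (hypothetical, replace with a real checkpoint).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "t5-paraphrase-checkpoint"  # placeholder, not a real model id

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)


def paraphrase(sentence: str, num_return_sequences: int = 3) -> list[str]:
    """Return several candidate paraphrases of the input sentence."""
    # Many T5 paraphrase checkpoints expect a task prefix such as "paraphrase: ";
    # this prefix is an assumption and depends on the chosen checkpoint.
    inputs = tokenizer("paraphrase: " + sentence, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_length=64,
        num_beams=5,                                # beam search for diversity
        num_return_sequences=num_return_sequences,  # must not exceed num_beams
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]


if __name__ == "__main__":
    for candidate in paraphrase("Could you please bring me a glass of water?"):
        print(candidate)
```

In a robot integration such as the one described, a wrapper like this could run either on the robot itself or behind an HTTP endpoint on an external server, with the robot selecting one of the returned candidates before speaking.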
