
Publications by Nuno Escudeiro

2015

Real time bidirectional translator of Portuguese sign language

Authors
Escudeiro, P; Escudeiro, N; Reis, R; Rodrigues, P; Lopes, J; Norberto, M; Baltasar, AB; Barbosa, M; Bidarra, J;

Publication
WEBIST 2015 - 11th International Conference on Web Information Systems and Technologies, Proceedings

Abstract
Communication with the deaf by means of written text is not as efficient as it might seem; there is a deep gap between sign language and spoken/written language. Tools that assist the daily communication between deaf people and the hearing can therefore make a significant contribution to the social inclusion of the deaf community. The work described in this paper addresses the development of a bidirectional translator between Portuguese Sign Language and Portuguese text, together with a serious game that promotes the learning of Portuguese Sign Language. The translator from sign language to text employs two devices, the Microsoft Kinect and 5DT Sensor Gloves, to gather data about the motion and shape of the hands. Hand configurations are classified using Support Vector Machines, while the movement and orientation of the hands are classified with the Dynamic Time Warping algorithm. The translator achieves a precision higher than 90%. In the other direction, the translation of Portuguese text to Portuguese Sign Language is supported by a 3D avatar that interprets the entered text and performs the corresponding animations. As a complement, we also present a serious game designed to assist in the difficult task of learning Portuguese Sign Language.
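The two-stage classification described in the abstract (SVM for hand shape, Dynamic Time Warping for hand movement) can be sketched as follows. This is an illustrative reconstruction with synthetic data, not the authors' code: the 14-value glove feature vector, the RBF kernel, and the toy trajectories are assumptions, and scikit-learn stands in for whatever SVM implementation was actually used.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical glove readings: one 14-value bend-sensor vector per hand shape.
X_train = rng.random((40, 14))
y_train = rng.integers(0, 4, size=40)  # 4 example hand-shape classes

# Stage 1: hand configurations classified with a Support Vector Machine.
clf = SVC(kernel="rbf").fit(X_train, y_train)
pred = clf.predict(rng.random((1, 14)))  # predicted class for a new reading

# Stage 2: Dynamic Time Warping distance between two movement trajectories
# (sequences of 3D hand positions), used to match hand movement/orientation
# against reference gestures.
def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # pointwise distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])
```

A gesture would then be labelled with the reference trajectory of minimal DTW distance; DTW tolerates signs being performed faster or slower than the references.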

2017

Recognition of hand configuration: a critical factor in automatic sign language translation

Authors
Escudeiro, N; Escudeiro, P; Soares, F; Litos, O; Norberto, M; Lopes, J;

Publication
2017 12TH IBERIAN CONFERENCE ON INFORMATION SYSTEMS AND TECHNOLOGIES (CISTI)

Abstract
Identifying the hand configuration is a critical step in sign language translation. In this paper, we describe our approach to recognizing hand configurations in real time, with the purpose of providing accurate predictions for automatic sign language translation. To capture the hand configuration we rely on data gloves with 14 sensors that measure the bending of the finger joints. These inputs are sampled at a frequency of 100 Hz and fed to a classifier that predicts the current hand configuration. The classification model is trained on a previously acquired, annotated sample of hand configurations. We expect this approach to be accurate and robust, in the sense that the performance of the classification model should not vary significantly from one user to another. The results of our experimental evaluation show very high accuracy, meaning that data gloves are a good means of capturing the descriptive features of hand configurations. However, the robustness of this approach is not as good as desirable, since the accuracy of the classifier depends on the user: accuracy is high when the classifier is used by the user who trained it, but decreases for other users.
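The user-dependence effect reported above can be illustrated with a small sketch. This is not the authors' experiment: the data are synthetic (a constant per-user sensor offset stands in for real calibration differences between hands), and a k-nearest-neighbours classifier stands in for the unspecified classifier in the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
N_SENSORS = 14   # finger-joint bend sensors per glove
CLASSES = 5      # example hand configurations

def user_samples(offset, n_per_class=30):
    """Synthetic 14-sensor readings: each class clusters around its own
    centre; `offset` models a systematic per-user calibration shift."""
    X, y = [], []
    for k in range(CLASSES):
        centre = np.full(N_SENSORS, float(k)) + offset
        X.append(centre + rng.normal(0.0, 0.05, size=(n_per_class, N_SENSORS)))
        y.append(np.full(n_per_class, k))
    return np.vstack(X), np.concatenate(y)

X_a, y_a = user_samples(offset=0.0)   # the user who trained the model
X_b, y_b = user_samples(offset=0.6)   # a different user, shifted readings

clf = KNeighborsClassifier(n_neighbors=3).fit(X_a, y_a)

X_a_test, y_a_test = user_samples(offset=0.0)  # fresh samples, same user
acc_same = clf.score(X_a_test, y_a_test)       # near-perfect
acc_cross = clf.score(X_b, y_b)                # degrades with the offset
```

With these parameters, same-user accuracy stays near 1.0 while cross-user accuracy collapses, mirroring the paper's finding that accuracy is high for the training user and lower for others; per-user calibration or normalization of the sensor readings is the usual mitigation.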

2019

Automatic Sign Language Translation to Improve Communication

Authors
Oliveira, T; Escudeiro, P; Escudeiro, N; Rocha, E; Barbosa, FM;

Publication
PROCEEDINGS OF 2019 IEEE GLOBAL ENGINEERING EDUCATION CONFERENCE (EDUCON)

Abstract
In recent years there has been an increase in the number of hearing-impaired students, who use sign language as their main form of communication, attending higher education institutions around the world. Because differences in sentence structure reduce their comprehension of written texts, solutions are needed to improve communication and support these students in settings where they cannot be accompanied by sign language interpreters. This article details the improvements and current structure of the VirtualSign platform, a bidirectional sign language-to-text translation tool that has been in development since 2015. The platform is divided into two main parts, sign-to-text and text-to-sign, and both components are described and explained. The solution has received positive feedback in several tests and a pilot experiment, and is being developed in partnership with sign language interpreters from six European countries. Some planned improvements and future functionalities of the tool are also detailed.

2019

The VirtualSign Channel for the Communication Between Deaf and Hearing Users

Authors
Oliveira, T; Escudeiro, N; Escudeiro, P; Rocha, E; Barbosa, FM;

Publication
IEEE REVISTA IBEROAMERICANA DE TECNOLOGIAS DEL APRENDIZAJE-IEEE RITA

Abstract
Deaf students, who use sign language as their mother tongue, continuously experience difficulties communicating with the hearing in their daily lives. This is a severe handicap in educational settings, seriously jeopardizing deaf people's chances of progressing in their professional careers. Deaf people's comprehension of written texts is limited by the grammatical differences between sign and oral languages. There is a need to improve communication between the deaf and the hearing and to support deaf students in settings where they cannot be accompanied by sign language interpreters. This article details the improvements and current structure of the VirtualSign platform, a bidirectional sign language-to-text translation tool in development since 2015. The platform has two main components, sign-to-text and text-to-sign, both of which are described. Translation from text to sign relies on a 3D avatar; translation from sign to text relies on a set of data gloves and a Kinect sensor. In this paper we discuss the relevance of different types of data gloves. VirtualSign is being developed in cooperation with the deaf communities of six European countries and Brazil. This solution for supporting deaf students in educational settings has received positive feedback in several tests and pilot experiments. Some planned improvements and future functionalities of the tool are also detailed.

2018

Blind/Deaf Communication API for Assisted Translated Educational Digital Content

Authors
Ulisses, J; Oliveira, T; Rocha, E; Escudeiro, PM; Escudeiro, N; Barbosa, FM;

Publication
2018 28TH EAEEIE ANNUAL CONFERENCE (EAEEIE)

Abstract
With the rising use of digital content in education, the deaf and blind communities face communication barriers that make education less inclusive. These barriers prevent them from integrating into the larger scholarly community, as most tools used for information dissemination remain inaccessible to them. This paper presents the BDC-API (Blind/Deaf Communication API), a free-to-use modular toolkit that eases access to digital educational content for the blind and deaf communities. Its use cases include Massive Open Online Courses and serious games used in education. The BDC-API incorporates state-of-the-art technologies such as a 3D sign language translator, grammar translation, voice recognition, and text-to-speech. This paper demonstrates in detail how these technologies culminate in an API ready to use with any educational digital content, and how the BDC-API can ensure higher quality of digital content.

2015

Virtual Sign-A Serious Game for Deaf People

Authors
Escudeiro, P; Escudeiro, N; Lopes, J; Norberto, M;

Publication
2015 INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND ARTIFICIAL INTELLIGENCE (CAAI 2015)

Abstract
The work described in this paper addresses the development of a serious game that promotes the learning of Portuguese Sign Language, supported by an automatic bidirectional translator between Portuguese Sign Language and written Portuguese text. The translator from sign language to text relies on two devices, the Microsoft Kinect and 5DT Sensor Gloves, to gather data about the motion and shape of the hands. Hand configurations are classified using Support Vector Machines, while the movement and orientation of the hands are classified with the Dynamic Time Warping algorithm. The translation of Portuguese text to Portuguese Sign Language is supported by a 3D avatar that interprets the entered text and performs the corresponding animations. A serious game designed to assist in the difficult task of learning Portuguese Sign Language is presented.
