About

Luís Coelho holds a degree and an MSc in Electronics Engineering from the Faculty of Engineering of Porto University, obtained in 2000 and 2005 respectively. In 2012 he was awarded an international PhD degree in Telecommunications and Signal Processing by the University of Vigo, Spain. In 2001 he began teaching at the Polytechnic Institute of Setubal, where he was in charge of the algorithms, data structures, and computer programming courses for desktop and web. In 2004 he moved to the Polytechnic Institute of Porto, the largest in Portugal, where he mainly teaches signal and image processing courses. He has been involved in the coordination of the Biomedical Engineering bachelor's and master's programmes and of the Healthcare Management course. He has participated in several national and international projects and has supervised more than 200 internships with private companies in national and international contexts. He has also worked as a consultant at Microsoft Portugal, contributing knowledge and experience to signal-processing-related projects. As a researcher he has published more than 90 scientific articles in conferences and journals. He actively collaborates with the scientific community as a participant, reviewer, and organizer of scientific conferences, and as a journal editor. His research interests include image and signal processing, human-machine interaction, and healthcare management.

Interest Topics
Details

  • Name: Luis Coelho
  • Role: Senior Researcher
  • Since: 10th February 2023
Publications

2023

Development of a Collaborative Robotic Platform for Autonomous Auscultation

Authors
Lopes, D; Coelho, L; Silva, MF;

Publication
APPLIED SCIENCES-BASEL

Abstract
Listening to internal body sounds, or auscultation, is one of the most popular diagnostic techniques in medicine. In addition to being simple, non-invasive, and low-cost, the information it offers, in real time, is essential for clinical decision-making. This process, usually done by a doctor in the presence of the patient, currently presents three challenges: procedure duration, participants' safety, and the patient's privacy. In this article we tackle these by proposing a new autonomous robotic auscultation system. With the patient prepared for the examination, a 3D computer vision sub-system identifies the auscultation points and translates them into spatial coordinates. The robotic arm is then responsible for bringing the stethoscope into contact with the patient's skin at the various auscultation points. The proposed solution was evaluated by performing a simulated pulmonary auscultation on six patients (with distinct height, weight, and skin color). The obtained results showed that the vision subsystem was able to correctly identify 100% of the auscultation points under uncontrolled lighting conditions, and that the positioning subsystem was able to accurately position the gripper at the corresponding locations on the human body. Patients reported no discomfort during auscultation using the described automated procedure.

2023

Working on empathy with the use of extended reality scenarios: the Mr. UD project

Authors
Laska-Lesniewicz, A; Kaminska, D; Zwolinski, G; Coelho, L; Raposo, R; Vairinhos, M; Haamer, E;

Publication
INTERNATIONAL JOURNAL OF COMPUTER APPLICATIONS IN TECHNOLOGY

Abstract
Empathy has become a central part of design and is prominently manifested in several frameworks such as universal design, inclusive design, and human-centred design. This paper presents five independent Extended Reality (XR) scenarios that put potential users in the shoes of people with special needs, such as vision impairments, autism spectrum disorder, mobility impairments, pregnancy, and some problems of the elderly. All exercises take place in a supermarket environment; the application is built for the Oculus Quest 2 platform and is supported in some cases by tangible equipment (geriatric suit, pregnancy belly simulator, wheelchair). The proposed simulations were validated by experts who evaluated the quality of the proposed tasks and the feasibility of simulating the selected limitations or issues in XR. Ongoing development and testing of the XR application will provide further in-depth views on its usefulness, acceptance, and impact in increasing empathy towards the challenges faced by the personas portrayed.

2023

How Artificial Intelligence Is Shaping Medical Imaging Technology: A Survey of Innovations and Applications

Authors
Pinto Coelho, L;

Publication
BIOENGINEERING-BASEL

Abstract
The integration of artificial intelligence (AI) into medical imaging has ushered in an era of transformation in healthcare. This literature review explores the latest innovations and applications of AI in the field, highlighting its profound impact on medical diagnosis and patient care. The innovation segment explores cutting-edge developments in AI, such as deep learning algorithms, convolutional neural networks, and generative adversarial networks, which have significantly improved the accuracy and efficiency of medical image analysis. These innovations have enabled rapid and accurate detection of abnormalities, from identifying tumors during radiological examinations to detecting early signs of eye disease in retinal images. The article also covers various applications of AI in medical imaging, including radiology, pathology, cardiology, and more. AI-based diagnostic tools not only speed up the interpretation of complex images but also improve early detection of disease, ultimately delivering better outcomes for patients. Additionally, AI-based image processing facilitates personalized treatment plans, thereby optimizing healthcare delivery. This literature review underscores the paradigm shift that AI has brought to medical imaging and its role in revolutionizing diagnosis and patient care. By combining cutting-edge AI techniques and their practical applications, it is clear that AI will continue shaping the future of healthcare in profound and positive ways.

2023

Enhancing learning experiences through artificial intelligence: Classroom 5.0

Authors
Coelho, L; Reis, S;

Publication
Fostering Pedagogy Through Micro and Adaptive Learning in Higher Education: Trends, Tools, and Applications

Abstract
Artificial Intelligence (AI) has evolved rapidly since its inception in the 1950s, from simple rule-based systems to today's advanced deep learning models. AI has impacted society in many ways, ranging from revolutionizing the way we live, work, and interact with technology, to creating new job opportunities, improving decision-making, automating tasks, and solving complex problems in fields like healthcare, finance, and transportation. However, it has also raised concerns about job displacement, privacy and security, and ethical considerations. The evolution of AI is ongoing, and it is expected to continue to shape and transform society in new and profound ways. The impact of AI in education has also been substantial, offering new and innovative ways to personalize learning, enhance educational resources, and improve educational outcomes. In this chapter we cover the most important aspects related to the teaching-learning process, from a physiological perspective to the different strategies. © 2023, IGI Global. All rights reserved.

2023

INCLUSION AND ADAPTATION BEYOND DISABILITY: USING VIRTUAL REALITY TO FOSTER EMPATHY

Authors
Pinto Coelho, L; Laska-Lesniewicz, A; Pereira, ET; Sztobryn-Giercuszkiewicz, J;

Publication
MEDYCYNA PRACY

Abstract
Background: Virtual reality (VR) has the potential to be a powerful tool in promoting empathy towards inclusion, particularly for individuals with impairments such as mobility difficulties, vision deficits, or autism, but also for conditions such as pregnancy, which can create temporary difficulties. By immersing users in simulated environments that replicate the experiences of those with different abilities, VR can create a sense of understanding and empathy for those who face challenges in their daily lives. For example, VR experiences can simulate the experience of navigating space as someone with a mobility impairment, providing a new perspective on and appreciation for the difficulties that others face. Similarly, VR experiences can simulate the experience of vision impairment, pregnancy, or autism, providing a window into the challenges faced by those with these conditions and fostering empathy and understanding. Material and Methods: During the development of this study, field experts were consulted to ensure the robustness of the methods employed. Then, questionnaires were specifically developed to explore disabilities and challenges related to inclusion and were administered to a large population. Additionally, guided interviews were conducted with individuals who have specific impairments to gather first-hand insights. Results: The results obtained from the questionnaires and interviews provide a comprehensive overview of the inclusion challenges that necessitate attention and resolution. By drawing on the expertise of both experts and individuals with lived experience, a holistic landscape of inclusion challenges has been established. Conclusions: VR emerges as a powerful tool for promoting inclusion and fostering understanding among individuals. Its capacity to create immersive experiences that facilitate empathy has the potential to reshape society into a more compassionate and empathetic one. By leveraging the unique capabilities of VR, we can bridge the gap between different perspectives, fostering greater understanding, acceptance, and inclusivity.