2025
Authors
Vaz, CB; Galvao, A; Pais, C; Pinheiro, M;
Publication
ADVANCED RESEARCH IN TECHNOLOGIES, INFORMATION, INNOVATION AND SUSTAINABILITY, ARTIIS 2024 INTERNATIONAL WORKSHOPS, PT I
Abstract
This paper presents the development process of the mobile app D.R.E.A.M. (Design-thinking to Reach-out, Embrace and Acknowledge Mental health), a tool for self-assessment and self-care that promotes the mental health of higher education students. In Portugal, the program for promoting mental health in higher education advocates the development and use of digital tools, such as apps, social networks, and platforms, aimed at promoting wellbeing and making support more accessible to higher education students. The objective of this app is to promote the mental health and wellbeing of higher education students. Design Thinking was used as the methodology for building the app, which was developed using a combination of low-code/no-code tools, Flutter/Dart coding, and Google's Firebase capabilities and database functionalities. In the first semester of the 2023/2024 academic year, 484 students downloaded the app, and 22 emails requesting psychological consultations were received. A dynamic update of the app is required, with modules on time management and study organization, structured physical activity programs, development of socio-entrepreneurial skills, and vocational guidance.
2025
Authors
Almeida, F; Deutsch, N;
Publication
Urban Governance
Abstract
2025
Authors
Sousa, RB; Sobreira, HM; Martins, JG; Costa, PG; Silva, MF; Moreira, AP;
Publication
2025 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC
Abstract
Multimodal perception systems enhance the robustness and adaptability of autonomous mobile robots by integrating heterogeneous sensor modalities, improving long-term localisation and mapping in dynamic environments as well as human-robot interaction. Current mobile platforms often focus on specific sensor configurations and prioritise cost-effectiveness, which can limit users' flexibility to extend the original robots further. This paper presents a methodology to integrate multimodal perception into a ground mobile platform, incorporating wheel odometry, 2D laser scanners, 3D Light Detection and Ranging (LiDAR), and RGB-D cameras. The methodology covers the electronics design for powering the devices, firmware, computation and networking architecture, and the mechanical mounting of the sensory system based on 3D printing, laser cutting, and sheet-metal bending processes. Experiments demonstrate the use of the revised platform in 2D and 3D localisation and mapping and in pallet pocket estimation applications. All documentation and designs are accessible in a public repository.
2025
Authors
Cunha, A; Macedo, N;
Publication
CoRR
Abstract
2025
Authors
Leite, D; Marques, P; Pádua, L; Sousa, JJ; Morais, R; Cunha, A;
Publication
PFG-JOURNAL OF PHOTOGRAMMETRY REMOTE SENSING AND GEOINFORMATION SCIENCE
Abstract
Accurate segmentation of grapevines in imagery acquired from unmanned aerial vehicles (UAVs) is important for precision viticulture, as it supports vineyard management by monitoring grapevine health, growth, and environmental stress. However, the structural diversity of vineyards, including differences in training systems, row curvatures, and foliage density, presents challenges for grapevine segmentation methods. This study evaluates the performance of deep learning (DL) models, namely Feature Pyramid Network (FPN), Pyramid Scene Parsing Network (PSPNet), and U-Net, each combined with different backbones, for grapevine segmentation in UAV-based RGB orthophoto mosaics. Data were collected under a range of vineyard conditions and scenarios from Portugal's Douro and Vinhos Verdes regions, providing a representative dataset across multiple vineyard configurations. The DL models were trained, tested, and evaluated using orthorectified RGB imagery, and their segmentation accuracy was compared to thresholding techniques. The results show that DL models, particularly U-Net, achieved accurate grapevine segmentation and reduced the over-segmentation and false detections that are common in thresholding methods. FPN models with Inception-v4 and Xception backbones performed well in vineyards with inter-row vegetation, while PSPNet models showed segmentation limitations. Overall, DL-based segmentation models demonstrated advantages over thresholding approaches, confirming their suitability for UAV-based grapevine segmentation in diverse and challenging vineyard environments. These results support the scalability of DL-based segmentation for vineyard monitoring applications and indicate that improved segmentation accuracy can contribute to decision support in precision viticulture.
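The thresholding baselines this abstract compares against are not specified, but a common choice for vegetation segmentation in RGB imagery is thresholding the Excess Green (ExG) index. The sketch below is purely illustrative (the function name and threshold value are assumptions, not from the paper) and shows why such methods tend to over-segment: any sufficiently green pixel, including inter-row vegetation, passes the threshold.

```python
import numpy as np

def exg_threshold_mask(rgb, threshold=0.1):
    """Segment vegetation via the Excess Green index, ExG = 2g - r - b,
    where r, g, b are per-pixel chromatic coordinates.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns a boolean mask that is True where ExG exceeds `threshold`.
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, 2, 0)
    exg = 2.0 * g - r - b
    return exg > threshold

# Toy example: one green (vegetation-like) pixel, one grey (soil-like) pixel.
img = np.array([[[0.1, 0.8, 0.1],
                 [0.5, 0.5, 0.5]]])
print(exg_threshold_mask(img))  # [[ True False]]
```

A grey pixel has equal chromatic coordinates, so its ExG is exactly 0 and it falls below any positive threshold; a green canopy pixel scores well above it. The learned models in the study avoid relying on this single colour cue.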
2025
Authors
Cumbane, SP; Gidófalvi, G; Cossa, OF; Madivadua, AM; Sousa, N; Branco, F;
Publication
BIG DATA AND COGNITIVE COMPUTING
Abstract
Understanding people's face-to-face interactions is crucial for effective infectious disease management. Traditional contact tracing, often relying on interviews or smartphone applications, faces limitations such as incomplete recall, low adoption rates, and privacy concerns. This study proposes utilizing anonymized Call Detail Records (CDRs) as a proxy for in-person meetings. We assume that when two individuals engage in a phone call connected to the same cell tower, they are likely to meet shortly thereafter. To test this assumption, we evaluated two hypotheses. The first hypothesis, that such co-located interactions occur in a workplace setting, achieved 83% agreement, which is considered a strong indication of reliability. The second hypothesis, that calls made during these co-location events are shorter than usual, achieved 86% agreement, suggesting an almost perfect reliability level. These results demonstrate that CDR-based co-location events can serve as a reliable substitute for in-person interactions and thus hold significant potential for enhancing contact tracing and supporting public health efforts.
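The co-location criterion described above (same serving tower for both parties, with a shorter-than-usual call) can be sketched as a simple filter over CDR rows. This is a minimal illustration only; the record schema, field names, and the 60-second cutoff are assumptions for the example, not the paper's actual parameters.

```python
from dataclasses import dataclass

@dataclass
class CallRecord:
    caller: str      # anonymised caller ID
    callee: str      # anonymised callee ID
    cell_a: str      # tower serving the caller
    cell_b: str      # tower serving the callee
    duration_s: int  # call duration in seconds

def co_location_events(records, short_call_s=60):
    """Return calls where both parties are served by the same tower
    (a co-location candidate under the paper's assumption) and the call
    is short, suggesting the parties may meet in person shortly after."""
    return [
        r for r in records
        if r.cell_a == r.cell_b and r.duration_s <= short_call_s
    ]

calls = [
    CallRecord("u1", "u2", "T17", "T17", 25),   # same tower, short call
    CallRecord("u3", "u4", "T17", "T42", 25),   # different towers
    CallRecord("u5", "u6", "T03", "T03", 600),  # same tower, long call
]
print([(r.caller, r.callee) for r in co_location_events(calls)])
# [('u1', 'u2')]
```

In practice the pairs flagged this way would feed a contact-tracing graph, with the two hypotheses above serving as checks on how often flagged pairs genuinely meet.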