2025
Authors
Tosin, R; Rodrigues, L; Santos Campos, M; Gonçalves, I; Barbosa, C; Santos, F; Martins, R; Cunha, M;
Publication
Smart Agricultural Technology
Abstract
This study demonstrates the application of a tomography-like (TL) method to monitor grape maturation dynamics over two growing seasons (2021–2022) in the Douro Wine Region. Using a Vis-NIR point-of-measurement sensor, which employs visible and near-infrared light to penetrate grape tissues non-destructively and provide spectral data for predicting internal composition, this approach captures non-destructive measurements of key physicochemical properties, including soluble solids content (SSC), weight-to-volume ratio, and chlorophyll and anthocyanin levels across internal grape tissues (skin, pulp, and seeds) over six post-veraison stages. The collected data were used to generate detailed metabolic maps of maturation, integrating topographical factors such as altitude and NDVI-based (normalised difference vegetation index) vigour assessments, which revealed significant (p < 0.05) variations in SSC, chlorophyll, and anthocyanin levels across vineyard zones. The metabolic maps generated from the TL method leverage high-throughput data to reveal the impact of environmental variability on grape maturation across distinct vineyard areas. Predictive models using random forest (RF) and self-learning artificial intelligence (SL-AI) algorithms showed RF's robustness, achieving stable predictions with R² = 0.86 and MAPE = 33.83 %. To illustrate the TL method's practical value, three hypothetical decision models were developed for targeted winemaking objectives based on SSC, chlorophyll in the pulp, and anthocyanin in the skin and seeds. These models underscore the TL method's ability to support site-specific management (SSM) by translating metabolic profiles into actionable agricultural practices (e.g. harvest timing), guiding winemakers to implement tailored interventions based on metabolic profiles rather than cultivar characteristics alone. 
This precision viticulture (PV) approach enhances wine quality and production efficiency by aligning vineyard practices with specific wine quality goals. © 2025 The Author(s)
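The two figures reported for the random-forest models, R² = 0.86 and MAPE = 33.83 %, are standard regression metrics. A minimal sketch of how they are computed (the SSC values below are illustrative placeholders, not data from the study):

```python
# Toy sketch, not the study's code: the two error metrics reported for the
# random-forest maturation models, computed in plain Python.

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(t - p) / abs(t)
                       for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical SSC (soluble solids content, °Brix) observations vs. predictions.
ssc_obs = [18.2, 19.5, 21.0, 22.4, 23.1, 24.0]
ssc_pred = [18.0, 19.9, 20.5, 22.8, 23.5, 23.6]

print(r_squared(ssc_obs, ssc_pred), mape(ssc_obs, ssc_pred))
```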
2025
Authors
Horst Orsolits; Katrin Clauss; P. B. de Moura Oliveira;
Publication
Computer Aided Systems Theory – EUROCAST 2024
Abstract
2025
Authors
Nandi, S; Malta, MC; Maji, G; Dutta, A;
Publication
KNOWLEDGE AND INFORMATION SYSTEMS
Abstract
Influential nodes are the important nodes that most efficiently control the propagation process throughout a network. Among structural methods, degree centrality, k-shell decomposition, and their combinations identify influential nodes with relatively low computational complexity, making them suitable for large-scale network analysis. However, these methods do not fully exploit nodes' underlying structure and neighboring information, which poses a significant challenge for researchers developing timely and efficient heuristics that account for appropriate network characteristics. In this study, we propose a new method (IC-SNI) to measure the influential capability of nodes. IC-SNI addresses the shortcomings of purely local and purely global centrality measures and calculates the topological positional structure by considering both the local and global contributions of a node's neighbors. Exploring path-structural information, we introduce two new measures (connectivity strength and effective distance) to capture the structural properties among neighboring nodes. Finally, the influential capability of a node is calculated by aggregating the structural and neighboring information of up to two-hop neighbors. Evaluated on nine benchmark datasets, IC-SNI demonstrates superior performance, with the highest average ranking correlation of 0.813 with the SIR simulator and a 34.1 % improvement over state-of-the-art methods in identifying influential spreaders. The results show that IC-SNI efficiently identifies influential spreaders in diverse real networks by accurately integrating structural and neighboring information.
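IC-SNI itself is not specified in the abstract, but the two baseline structural measures it builds on, degree centrality and k-shell decomposition, can be sketched in plain Python. The toy graph below is illustrative, not a dataset from the paper:

```python
# Baseline structural centralities mentioned in the abstract (not IC-SNI itself).

def degree_centrality(adj):
    """Degree of each node, normalised by n - 1."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def k_shell(adj):
    """k-shell index per node: repeatedly peel nodes of degree <= k."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    shell = {}
    k = 0
    while adj:
        peel = [v for v, nbrs in adj.items() if len(nbrs) <= k]
        if not peel:
            k += 1
            continue
        for v in peel:          # remove this batch and repeat at the same k
            shell[v] = k
            for u in adj[v]:
                if u in adj:
                    adj[u].discard(v)
            del adj[v]
    return shell

# Toy graph: a triangle (a, b, c) with a pendant node d attached to a.
graph = {'a': {'b', 'c', 'd'}, 'b': {'a', 'c'}, 'c': {'a', 'b'}, 'd': {'a'}}
print(degree_centrality(graph))
print(k_shell(graph))  # pendant d lands in shell 1, the triangle in shell 2
```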
2025
Authors
Lorenzo Grazi; Abel Feijoo Alonso; Adam Gasiorek; Afra Maria Pertusa Llopis; Alejandro Grajeda; Alexandros Kanakis; Ana Rodriguez Vidal; Andrea Parri; Felix Vidal; Ioannis Ergas; Ivana Zeljkovic; Javier Pamies Durá; Javier Perez Mein; Konstantinos Katsampiris-Salgado; Luís F. Rocha; Lorena Núñez Rodriguez; Marcelo R. Petry; Michal Neufeld; Nikos Dimitropoulos; Nina Köster; Ratko Mimica; Sara Varão Fernandes; Simona Crea; Sotiris Makris; Stavros Giartzas; Vincent Settler; Jawad Masood;
Publication
Electronics
Abstract
2025
Authors
Rui Nascimento; Tony Ferreira; Cláudia D. Rocha; Vítor Filipe; Manuel F. Silva; Germano Veiga; Luis Rocha;
Publication
Journal of Intelligent & Robotic Systems
Abstract
2025
Authors
Yamamura, F; Scalassara, R; Oliveira, A; Ferreira, JS;
Publication
U.Porto Journal of Engineering
Abstract
Whispers are common and essential for secondary communication. However, individuals with aphonia, including laryngectomees, rely on whispers as their primary means of communication. Owing to the distinct acoustic features of whispered versus regular speech, effectively converting between them remains a recognised challenge in speech recognition research. This study investigates the characteristics of whispered speech and proposes a system for converting whispered vowels into normal ones. The system is developed using multilayer perceptron networks and two types of generative adversarial networks. Three metrics are used to evaluate the performance of the system: mel-cepstral distortion, root mean square error of the fundamental frequency, and the accuracy and F1-score of a vowel classifier. Overall, the perceptron networks demonstrated better results, with no significant differences observed between male and female voices or between the presence and absence of speech silence, except for improved accuracy in estimating the fundamental frequency during the conversion process. © 2025, Universidade do Porto - Faculdade de Engenharia. All rights reserved.
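Two of the metrics named in the abstract can be sketched directly. This is a hedged illustration of the standard formulas (mel-cepstral distortion in dB over aligned MCEP frames, and RMSE over aligned F0 contours), not the study's implementation; the input values are placeholders:

```python
import math

def mcd(frames_a, frames_b):
    """Mean mel-cepstral distortion (dB) across aligned cepstral frames,
    using the common (10 / ln 10) * sqrt(2 * sum of squared differences) form."""
    k = 10.0 / math.log(10.0)
    total = 0.0
    for ca, cb in zip(frames_a, frames_b):
        total += k * math.sqrt(2.0 * sum((x - y) ** 2 for x, y in zip(ca, cb)))
    return total / len(frames_a)

def f0_rmse(f0_a, f0_b):
    """Root mean square error between two aligned F0 contours (Hz)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f0_a, f0_b)) / len(f0_a))
```

In practice both metrics are computed after time-aligning the converted and reference utterances (e.g. with dynamic time warping), which is outside the scope of this sketch.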