
Publications by Filipe Neves Santos

2025

Pollinationbots - A Swarm Robotic System for Tree Pollination

Authors
Castro, JT; Pinheiro, I; Marques, MN; Moura, P; dos Santos, FN;

Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract
In nature, and particularly in agriculture, pollination is fundamental for the sustainability of our society. In this context, pollination is a vital process underlying crop yield quality and is responsible for the biodiversity and the standards of the flora. Bees play a crucial role in natural pollination; however, their populations are declining. Robots can help maintain pollination levels while humans work to recover bee populations. Swarm robotics approaches appear promising for robotic pollination. This paper proposes the cooperation between multiple Unmanned Aerial Vehicles (UAVs) and an Unmanned Ground Vehicle (UGV), leveraging the advantages of collaborative work for pollination, referred to as Pollinationbots. Pollinationbots is based on swarm behaviors and methodologies to implement more effective pollination strategies, ensuring efficient pollination across various scenarios. The paper presents the architecture of the Pollinationbots system, which was evaluated using the Webots simulator, focusing on path planning and follower behavior. Preliminary simulation results indicate that this is a viable solution for robotic pollination. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.

2025

Plant Leaf Disease Detection Using Deep Learning: A Multi-Dataset Approach

Authors
Krishna, MS; Machado, P; Otuka, RI; Yahaya, SW; Neves dos Santos, F; Ihianle, IK;

Publication
J

Abstract
Agricultural productivity is increasingly threatened by plant diseases, which can spread rapidly and lead to significant crop losses if not identified early. Detecting plant diseases accurately in diverse and uncontrolled environments remains challenging, as most current detection methods rely heavily on lab-captured images that may not generalise well to real-world settings. This paper aims to develop models capable of accurately identifying plant diseases across diverse conditions, overcoming the limitations of existing methods. A combined dataset was utilised, incorporating the PlantDoc dataset with web-sourced images of plants from online platforms. State-of-the-art convolutional neural network (CNN) architectures, including EfficientNet-B0, EfficientNet-B3, ResNet50, and DenseNet201, were employed and fine-tuned for plant leaf disease classification. A key contribution of this work is the application of enhanced data augmentation techniques, such as adding Gaussian noise, to improve model generalisation. The results demonstrated varied performance across the datasets. When trained and tested on the PlantDoc dataset, EfficientNet-B3 achieved an accuracy of 73.31%. In cross-dataset evaluation, where the model was trained on PlantDoc and tested on a web-sourced dataset, EfficientNet-B3 reached 76.77% accuracy. The best performance was achieved with the combination of the PlantDoc and web-sourced datasets, resulting in an accuracy of 80.19% and indicating very good generalisation in diverse conditions. Class-wise F1-scores consistently exceeded 90% for diseases such as apple rust leaf and grape leaf across all models, demonstrating the effectiveness of this approach for plant disease detection.
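The Gaussian-noise augmentation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation; the noise standard deviation and the dummy image are illustrative choices.

```python
import numpy as np

def add_gaussian_noise(image, mean=0.0, std=10.0, rng=None):
    """Add Gaussian noise to a uint8 image (H, W, C) and clip back to [0, 255]."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(mean, std, size=image.shape)
    noisy = image.astype(np.float64) + noise
    return np.clip(noisy, 0, 255).astype(np.uint8)

# Example: augment a dummy 64x64 RGB image with mid-grey pixels
img = np.full((64, 64, 3), 128, dtype=np.uint8)
aug = add_gaussian_noise(img, std=15.0, rng=np.random.default_rng(0))
```

Applied at training time, this perturbation exposes the network to sensor-like noise, which is one way such augmentation can improve cross-dataset generalisation.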

2024

Early plant disease diagnosis through handheld UV-Vis transmittance spectrometer with DD-SIMCA one-class classification and MCR-ALS bilinear decomposition

Authors
Reis-Pereira, M; Mazivila, SJ; Tavares, F; dos Santos, FN; Cunha, M;

Publication
SMART AGRICULTURAL TECHNOLOGY

Abstract
A novel non-destructive analytical method for early diagnosis of two bacterial diseases, Pseudomonas syringae and Xanthomonas euvesicatoria, in tomato plants, using ultraviolet-visible (UV-Vis) transmittance spectroscopy and chemometric models, is developed. Plant-pathogen interactions caused tissue damage that generated non-linear data patterns compared to the control set (healthy samples), which challenges traditional discrimination models, even when employing non-linear discriminant approaches. Alternatively, an authentication task to conduct one-class classification relying on a data-driven version of soft independent modeling of class analogy (DD-SIMCA) is a wise choice due to its quadratic approach, proper to deal with non-linear data. DD-SIMCA detached the target class (control healthy plant leaflet tissues) from all other samples (plant leaflet tissues inoculated with the two bacteria, even before the manifestation of macroscopic lesions associated with the diseases) by capturing the main similarities within the samples of the target class through the full distance, which acts as a classification analytical signal, reaching 100 % sensitivity in the training and validation sets. Multivariate curve resolution - alternating least-squares (MCR-ALS) constrained analysis allowed the description of the bacterial inoculation process on diseased tissues through pure spectral signatures. DD-SIMCA results indicate that non-target samples with higher proximity to the acceptance boundary, i.e. with lower full-distance values, were at earlier stages of infection than more distant ones. These findings reveal that a handheld UV-Vis transmittance spectrometer is sufficiently sensitive to be used in acquiring biological data with suitable chemometric models for early disease diagnosis and prompt intervention.
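The full-distance idea behind SIMCA-type one-class classification can be illustrated with a minimal PCA-based sketch: the target (healthy) class is modelled by its principal components, and each new sample is scored by a combined, normalised score distance and orthogonal (residual) distance. This is a generic sketch, not the DD-SIMCA algorithm of the paper; the function names, the two-component choice, and the normalisations are illustrative assumptions.

```python
import numpy as np

def fit_target_pca(X, n_components=2):
    """Fit PCA on target-class samples only; return mean, loadings, and scales."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                  # loadings (d, k)
    scores = Xc @ P
    lam = (scores ** 2).mean(axis=0)         # per-component score variance
    resid = Xc - scores @ P.T
    od0 = (resid ** 2).sum(axis=1).mean()    # mean squared orthogonal distance
    return mean, P, lam, od0

def full_distance(x, mean, P, lam, od0):
    """Combined (normalised) score + orthogonal distance of one sample."""
    xc = x - mean
    t = xc @ P                               # projection onto the PC subspace
    sd = ((t ** 2) / lam).sum()              # score distance
    od = ((xc - t @ P.T) ** 2).sum() / od0   # orthogonal distance
    return sd + od

# Fit on synthetic "healthy" spectra; score an in-class and an outlying sample
rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, (200, 5))
model = fit_target_pca(X, n_components=2)
d_in = full_distance(X[0], *model)
d_out = full_distance(np.full(5, 10.0), *model)
```

Samples with a full distance below an acceptance threshold are accepted as target-class; the abstract's observation that near-boundary samples correspond to earlier infection stages maps onto this scalar distance.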

2024

BVE + EKF: A Viewpoint Estimator for the Estimation of the Object's Position in the 3D Task Space Using Extended Kalman Filters

Authors
Magalhães, SC; Moreira, AP; dos Santos, FN; Dias, J;

Publication
ICINCO (2)

Abstract
RGB-D sensors face multiple challenges operating under open-field environments because of their sensitivity to external perturbations such as radiation or rain. Multiple works are approaching the challenge of perceiving the three-dimensional (3D) position of objects using monocular cameras. However, most of these works focus mainly on deep learning-based solutions, which are complex, data-driven, and difficult to predict. So, we aim to approach the problem of predicting the three-dimensional (3D) objects' position using a Gaussian viewpoint estimator named best viewpoint estimator (BVE), powered by an extended Kalman filter (EKF). The algorithm proved efficient on the tasks and reached a maximum average Euclidean error of about 32 mm. The experiments were deployed and evaluated in MATLAB using artificial Gaussian noise. Future work aims to implement the estimator on a real robotic system.
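The EKF predict/update cycle that powers such an estimator can be sketched generically as below. The paper's BVE measurement model is not reproduced here: `h` and `H_jac` are placeholders supplied by the caller, and the static-object example with an identity measurement is an illustrative assumption.

```python
import numpy as np

def ekf_step(x, P, z, F, Q, h, H_jac, R):
    """One predict + update cycle of an extended Kalman filter.

    x, P  : state mean and covariance
    z     : measurement vector
    F, Q  : (linearised) transition matrix and process-noise covariance
    h     : measurement function; H_jac returns its Jacobian at a state
    R     : measurement-noise covariance
    """
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    H = H_jac(x_pred)
    y = z - h(x_pred)                        # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: estimate a static 3D object position from noisy direct measurements
rng = np.random.default_rng(0)
true_pos = np.array([1.0, 2.0, 3.0])
x, P = np.zeros(3), np.eye(3)
F, Q = np.eye(3), 1e-6 * np.eye(3)           # static object, tiny process noise
h, H_jac = (lambda s: s), (lambda s: np.eye(3))
R = 0.05 * np.eye(3)
for _ in range(100):
    z = true_pos + rng.normal(0.0, 0.2, 3)   # artificial Gaussian noise
    x, P = ekf_step(x, P, z, F, Q, h, H_jac, R)
```

With a near-zero process noise the filter effectively averages the noisy measurements, so the estimate converges toward the true position.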

2025

Pruning End-Effectors State of the Art Review

Authors
Oliveira, F; Tinoco, V; Valente, A; Pinho, T; Cunha, JB; Santos, FN;

Publication
PROGRESS IN ARTIFICIAL INTELLIGENCE, EPIA 2024, PT I

Abstract
Pruning is an agricultural trimming procedure that is crucial in some species of plants to promote healthy growth and increased yield. Generally, this task is done through manual labour, which is costly, physically demanding, and potentially dangerous for the worker. Robotic pruning is an automated alternative to manual labour on this task. This approach focuses on selective pruning and requires an end-effector capable of detecting and cutting the correct point on the branch to achieve efficient pruning. This paper reviews and analyses different end-effectors used in robotic pruning, helping to understand the advantages and limitations of the different techniques used and, subsequently, clarifying the work required to enable autonomous pruning.

2024

Deep learning based approach for actinidia flower detection and gender assessment

Authors
Pinheiro, I; Moreira, G; Magalhaes, S; Valente, A; Cunha, M; dos Santos, FN;

Publication
SCIENTIFIC REPORTS

Abstract
Pollination is critical for crop development, especially those essential for subsistence. This study addresses the pollination challenges faced by Actinidia, a dioecious plant characterized by female and male flowers on separate plants. Despite the high protein content of pollen, the absence of nectar in kiwifruit flowers poses difficulties in attracting pollinators. Consequently, there is a growing interest in using artificial intelligence and robotic solutions to enable pollination even in unfavourable conditions. These robotic solutions must be able to accurately detect flowers and discern their genders for precise pollination operations. Specifically, upon identifying female Actinidia flowers, the robotic system should approach the stigma to release pollen, while male Actinidia flowers should target the anthers to collect pollen. We identified two primary research gaps: (1) the lack of gender-based flower detection methods and (2) the underutilisation of contemporary deep learning models in this domain. To address these gaps, we evaluated the performance of four pretrained models (YOLOv8, YOLOv5, RT-DETR and DETR) in detecting and determining the gender of Actinidia flowers. We outlined a comprehensive methodology and developed a dataset of manually annotated flowers categorized into two classes based on gender. Our evaluation utilised k-fold cross-validation to rigorously test model performance across diverse subsets of the dataset, addressing the limitations of conventional data splitting methods. DETR provided the most balanced overall performance, achieving precision, recall, F1 score and mAP of 89%, 97%, 93% and 94%, respectively, highlighting its robustness in managing complex detection tasks under varying conditions. These findings underscore the potential of deep learning models for effective gender-specific detection of Actinidia flowers, paving the way for advanced robotic pollination systems.
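The k-fold cross-validation protocol described in the abstract can be sketched as follows; the fold count and seed are illustrative, and the split is over sample indices rather than the authors' actual flower dataset.

```python
import numpy as np

def kfold_indices(n_samples, k=5, seed=0):
    """Yield (train_idx, val_idx) pairs for k-fold cross-validation.

    Samples are shuffled once, split into k folds, and each fold serves
    as the validation set exactly once while the rest form the train set.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

# Example: 5 folds over 10 annotated samples
pairs = list(kfold_indices(10, k=5, seed=0))
```

Because every sample appears in the validation set exactly once, per-fold metrics can be averaged to give a less split-dependent estimate than a single train/test partition.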
