2019
Authors
Sharma, P; Bidari, S; Valente, A; Paredes, H;
Publication
CoRR
Abstract
2023
Authors
Pinheiro, I; Moreira, G; da Silva, DQ; Magalhaes, S; Valente, A; Oliveira, PM; Cunha, M; Santos, F;
Publication
AGRONOMY-BASEL
Abstract
The world wine sector is a multi-billion dollar industry with a wide range of economic activities. Therefore, it becomes crucial to monitor the grapevine because it allows a more accurate estimation of the yield and ensures a high-quality end product. The most common way of monitoring the grapevine is through the leaves (preventive way), since the leaves are the first to manifest biophysical lesions. However, this does not exclude the possibility of biophysical lesions manifesting in the grape berries. Thus, this work presents three pre-trained YOLO models (YOLOv5x6, YOLOv7-E6E, and YOLOR-CSP-X) to detect and classify grape bunches as healthy or damaged by the number of berries with biophysical lesions. Two datasets were created and made publicly available with original images and manual annotations to identify the complexity between the detection (bunches) and classification (healthy or damaged) tasks. The datasets use the same 10,010 images with different classes: the Grapevine Bunch Detection Dataset uses the Bunch class, and the Grapevine Bunch Condition Detection Dataset uses the OptimalBunch and DamagedBunch classes. The three models trained for grape bunch detection obtained promising results, with YOLOv7 standing out with an mAP of 77% and an F1-score of 94%. For the task of detecting and identifying the state of grape bunches, the three models obtained similar results, with YOLOv5 achieving the best, with an mAP of 72% and an F1-score of 92%.
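As a rough aid to interpreting the mAP figures quoted in this abstract: detection mAP is computed by matching predicted and ground-truth bounding boxes via their intersection-over-union (IoU) overlap. The sketch below is an illustrative IoU implementation and is not code from the paper:

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x_min, y_min, x_max, y_max)."""
    # Coordinates of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

A prediction is typically counted as a true positive when its IoU with a ground-truth box exceeds a threshold (e.g. 0.5), and mAP averages precision over recall levels and classes under that matching rule.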
2023
Authors
Pinheiro, I; Aguiar, A; Figueiredo, A; Pinho, T; Valente, A; Santos, F;
Publication
APPLIED SCIENCES-BASEL
Abstract
Currently, Unmanned Aerial Vehicles (UAVs) are considered in the development of various applications in agriculture, which has led to the expansion of the agricultural UAV market. However, Nano Aerial Vehicles (NAVs) are still underutilised in agriculture. NAVs are characterised by a maximum wing length of 15 centimetres and a weight of less than 50 g. Due to their physical characteristics, NAVs have the advantage of being able to approach and perform tasks with more precision than conventional UAVs, making them suitable for precision agriculture. This work aims to contribute an open-source solution known as Nano Aerial Bee (NAB) to enable further research and development on the use of NAVs in an agricultural context. The purpose of NAB is to mimic and assist bees in the context of pollination. We designed this open-source solution by taking into account the existing state-of-the-art solutions and the requirements of pollination activities. This paper presents the relevant background and work carried out in this area by analysing papers on the topic of NAVs. The development of this prototype is rather complex given the interactions between the different hardware components and the need to achieve autonomous flight capable of pollination. We describe and discuss these challenges in this work. Besides the open-source NAB solution, we train three different versions of YOLO (YOLOv5, YOLOv7, and YOLOR) on an original dataset (Flower Detection Dataset) containing 206 images of a group of eight flowers and on a public dataset (TensorFlow Flower Dataset), which had to be annotated (TensorFlow Flower Detection Dataset). The results of the models trained on the Flower Detection Dataset are shown to be satisfactory, with YOLOv7 and YOLOR achieving the best performance, with 98% precision, 99% recall, and 98% F1 score. The performance of these models is evaluated using the TensorFlow Flower Detection Dataset to test their robustness.
The three YOLO models are also trained on the TensorFlow Flower Detection Dataset to better understand the results. In this case, YOLOR is shown to obtain the most promising results, with 84% precision, 80% recall, and 82% F1 score. The results obtained using the Flower Detection Dataset are used for NAB guidance for the detection of the relative position in an image, which defines the NAB execute command.
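The F1 scores quoted in this abstract combine precision and recall as their harmonic mean. A minimal sketch (not code from the paper) showing that 98% precision and 99% recall are consistent with the reported 98% F1 score:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall, both given as fractions in [0, 1]."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# 98% precision and 99% recall yield roughly 0.985, i.e. ~98% F1 as reported.
print(round(f1_score(0.98, 0.99), 3))
```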
2022
Authors
Berger, GS; Braun, J; Junior, AO; Lima, J; Pinto, MF; Pereira, AI; Valente, A; Soares, SFP; Rech, LC; Cantieri, AR; Wehrmeister, MA;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, OL2A 2022
Abstract
This research proposes positioning obstacle detection sensors by multirotor unmanned aerial vehicles (UAVs) dedicated to detailed inspections in high voltage towers. Different obstacle detection sensors are analyzed to compose a multisensory architecture in a multirotor UAV. The representation of the beam pattern of the sensors is modeled in the CoppeliaSim simulator to analyze the sensors' coverage and detection performance in simulation. A multirotor UAV is designed to carry the same sensor architecture modeled in the simulation. The aircraft is used to perform flights over a deactivated electrical tower, aiming to evaluate the detection performance of the sensory architecture embedded in the aircraft. The results obtained in the simulation were compared with those obtained in a real scenario of electrical inspections. The proposed method achieved its goals as a mechanism to early evaluate the detection capability of different previously characterized sensor architectures used in multirotor UAV for electrical inspections.
2023
Authors
Berger, GS; Teixeira, M; Cantieri, A; Lima, J; Pereira, AI; Valente, A; de Castro, GGR; Pinto, MF;
Publication
AGRICULTURE-BASEL
Abstract
The recent advances in precision agriculture are due to the emergence of modern robotics systems. For instance, unmanned aerial systems (UASs) open new possibilities that advance the solution of existing problems in this area in many different aspects, owing to these platforms' ability to perform activities at varying levels of complexity. Therefore, this research presents a multiple-cooperative robot solution for UAS and unmanned ground vehicle (UGV) systems for the joint inspection of insect traps in olive groves. This work evaluated UAS and UGV vision-based navigation based on a yellow fly trap fixed in the trees to provide visual position data using You Only Look Once (YOLO) algorithms. The experimental setup evaluated the fuzzy control algorithm applied to the UAS to make it reach the trap efficiently. Experimental tests were conducted in a realistic simulation environment using the Robot Operating System (ROS) and CoppeliaSim platforms to verify the methodology's performance, and all tests considered specific real-world environmental conditions. A search and landing algorithm based on augmented reality tag (AR-Tag) visual processing was evaluated to allow for the return and landing of the UAS on the UGV base. The outcomes obtained in this work demonstrate the robustness and feasibility of the multiple-cooperative robot architecture for UGVs and UASs applied to the olive inspection scenario.
2023
Authors
Berger, GS; Oliveira, A; Braun, J; Lima, J; Pinto, MF; Valente, A; Pereira, AI; Cantieri, AR; Wehrmeister, MA;
Publication
ROBOT2022: FIFTH IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, VOL 2
Abstract
This work presents a methodology for characterizing ultrasonic and LASER sensors aimed at detecting obstacles within the context of electrical inspections by multirotor Unmanned Aerial Vehicles (UAVs). A set of four ultrasonic and LASER sensor models is evaluated against eight target components typically found in high-voltage towers. The results show that ultrasonic sensor arrays spaced 25° apart reduce the chances of problems related to crosstalk and angular uncertainty. Within the LASER sensor suite, solar exposure directly affects the detection behavior of lower-power sensors. Based on the results obtained, a set of sensors capable of detecting multiple obstacles belonging to a high-voltage tower was identified. On this basis, it becomes possible to model sensor architectures for multirotor UAVs to detect multiple obstacles and advance the state of the art in obstacle avoidance systems for UAVs in inspections of high-voltage towers.