2019
Authors
Torres Marques, L; Félix De Castro, A; Torres Marques, B; Carvalho Pereira Silva, J; Gabriel Gadelha Queiroz, P;
Publication
RENOTE
Abstract
2019
Authors
Dias, TG;
Publication
IEEE TECHNOLOGY AND SOCIETY MAGAZINE
Abstract
2019
Authors
Masoudi, M; Khafagy, MG; Conte, A; El Amine, A; Francoise, B; Nadjahi, C; Salem, FE; Labidi, W; Sural, A; Gati, A; Bodere, D; Arikan, E; Aklamanu, F; Louahlia Gualous, H; Lallet, J; Pareek, K; Nuaymi, L; Meunier, L; Silva, P; Almeida, NT; Chahed, T; Sjolund, T; Cavdar, C;
Publication
IEEE ACCESS
Abstract
The heated 5G network deployment race has already begun, driven by rapid progress in standardization efforts and backed by the current market availability of 5G-enabled network equipment, ongoing 5G spectrum auctions, and the early launch of non-standalone 5G network services in a few countries, among other developments. In this paper, we study current and future wireless networks from the viewpoint of energy efficiency (EE) and sustainability to meet the planned network and service evolution toward, along, and beyond 5G, drawing also on the findings of the EU Celtic-Plus SooGREEN Project. We highlight the opportunities seized by the project to enable and enrich the green nature of the network compared to existing technologies. Specifically, we present innovative means proposed in SooGREEN to monitor and evaluate EE in 5G networks and beyond. Further solutions are presented to reduce energy consumption and carbon footprint across the different network segments. These span proposed virtualized/cloud architectures, efficient polar coding for fronthauling, mobile network powering via renewable energy and smart grid integration, passive cooling, smart sleep modes in indoor systems, and others. Finally, we shed light on the open opportunities yet to be investigated and leveraged in future developments.
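As a hedged illustration of the EE evaluation the abstract refers to, the Python sketch below computes the common bits-per-joule metric; the function name and the sample figures are assumptions for illustration only, not values or code from the paper.

# Minimal sketch (not from the paper): energy efficiency expressed as useful
# delivered bits per joule of energy consumed over a measurement window.
def energy_efficiency_bits_per_joule(delivered_bits: float, energy_joules: float) -> float:
    """Return EE = delivered bits / consumed energy (bit/J)."""
    if energy_joules <= 0:
        raise ValueError("energy must be positive")
    return delivered_bits / energy_joules

# Illustrative values only: 10 Gbit delivered while drawing 500 W for 60 s.
bits = 10e9
energy = 500 * 60  # power (W) * time (s) = energy (J)
print(f"EE = {energy_efficiency_bits_per_joule(bits, energy):.0f} bit/J")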
2019
Authors
Aguiar, A; Santos, F; Sousa, AJ; Santos, L;
Publication
APPLIED SCIENCES-BASEL
Abstract
The main task when developing a mobile robot is to achieve accurate and robust navigation in a given environment. To achieve this goal, the robot's ability to localize itself is crucial. In outdoor settings, namely agricultural environments, this task becomes a real challenge because odometry is not always usable and global navigation satellite system (GNSS) signals are blocked or significantly degraded. To answer this challenge, this work presents a solution for outdoor localization based on an omnidirectional visual odometry technique fused with a gyroscope and a low-cost planar light detection and ranging (LIDAR) sensor, optimized to run on a low-cost graphics processing unit (GPU). This solution, named FAST-FUSION, offers three core contributions to the scientific community. The first is an extension of the state-of-the-art monocular visual odometry library (Libviso2) to work with omnidirectional cameras and a single-axis gyroscope, increasing system accuracy. The second is an algorithm that uses low-cost LIDAR data to estimate the motion scale and overcome the scale limitation of monocular visual odometry systems. Finally, we propose a heterogeneous computing optimization that uses the Raspberry Pi GPU to improve visual odometry runtime performance on low-cost platforms. To test and evaluate FAST-FUSION, we created three open-source datasets in an outdoor environment. Results show that FAST-FUSION runs in real time on low-cost hardware and outperforms the original Libviso2 approach in both time performance and motion estimation accuracy.
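As a hedged illustration of the scale-estimation idea described above, the Python sketch below recovers the metric scale of a monocular visual odometry translation from two consecutive planar-LIDAR ranges to a static surface; the function, the motion-toward-a-surface simplification, and all numbers are hypothetical and are not taken from the FAST-FUSION implementation.

# Minimal sketch (assumption, not the FAST-FUSION code): scale recovery for
# monocular VO using the change in a planar-LIDAR range to a static surface.
import numpy as np

def estimate_scale(lidar_range_t0: float, lidar_range_t1: float, vo_translation: np.ndarray) -> float:
    """Ratio between the metric displacement measured by the LIDAR and the
    up-to-scale displacement reported by monocular visual odometry."""
    metric_displacement = abs(lidar_range_t0 - lidar_range_t1)   # meters
    vo_displacement = np.linalg.norm(vo_translation)             # arbitrary VO units
    return metric_displacement / vo_displacement if vo_displacement > 0 else 1.0

# Illustrative example: VO reports a unit-norm translation, the LIDAR sees the
# surface ahead get 0.42 m closer between two consecutive frames.
t_vo = np.array([0.8, 0.0, 0.6])          # up-to-scale translation from monocular VO
s = estimate_scale(5.00, 4.58, t_vo)      # hypothetical ranges at frames t0 and t1
t_metric = s * t_vo                       # scaled translation in meters
print(t_metric)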
2019
Authors
Li, G; Yang, J; Gama, J; Natwichai, J; Tong, Y;
Publication
DASFAA (2)
Abstract
2019
Authors
Lezama, F; de Cote, EM; Farinelli, A; Soares, JP; Pinto, T; Vale, ZA;
Publication
Progress in Artificial Intelligence - 19th EPIA Conference on Artificial Intelligence, EPIA 2019, Vila Real, Portugal, September 3-6, 2019, Proceedings, Part I
Abstract