2023
Authors
Paiva, S; Amaral, A; Pereira, T; Barreto, L;
Publication
SMART ENERGY FOR SMART TRANSPORT, CSUM2022
Abstract
Inclusive mobility is an essential component of the smart and sustainable mobility ecosystem. Moreover, smart parking has gained greater importance given its vital contribution to reducing the carbon footprint. However, existing solutions are not yet inclusive, as they do not provide the information required for the comfort and safety of people with reduced mobility, for whom the time it takes to park is often less important than the suitability of the parking space for their displacement objectives. The main contribution of this paper is a conceptual and technological architecture for an inclusive, real-time parking-assistance solution in a small urban environment. The architecture combines a crowd-sourcing approach, a Geographic Information System, a set of external APIs, GPS, and a mobile application for interaction with the citizen. The solution builds on previous work developed in the city of Viana do Castelo, Portugal, and is intended to be evaluated using the Sustainable Urban Mobility Indicators (SUMI) proposed by the European Commission.
2023
Authors
Lima, R; Barreto, L; Amaral, A; Paiva, S;
Publication
IEEE SENSORS JOURNAL
Abstract
Blindness and visual impairment are commonly associated with social and functional limitations: almost 45 million people in the world are blind, and 135 million have some form of visual impairment. This condition has a significant impact on quality of life and poses many challenges to the individual, among them navigation and positioning. Although there are already apps capable of helping visually impaired people (VIP) with mobility, most of them focus on detecting obstacles and, therefore, on avoiding dangerous situations. However, the mobility of VIP involves many more tasks, such as knowing their exact position and staying informed along an entire route. For this purpose, a standalone and customizable solution is proposed that applies traditional visual recognition of landmarks to the surroundings of the visually impaired person's current location using a smartphone, informing the user about nearby places and giving them a sense of the site. For feature detection, the solution uses the oriented FAST and rotated BRIEF (ORB) algorithm, and for feature matching, the brute-force method with the k-nearest neighbor (KNN) algorithm. Results show that the proposed solution can analyze pictures in fractions of a second with satisfactory accuracy.
2023
Authors
Majewska, M; Mazur-Wierzbicka, E; Duarte, N; Niezurawska, J;
Publication
Przeglad Organizacji
Abstract
2023
Authors
Liang, T; Duarte, N; Yue, GX;
Publication
International Journal of Emerging Technologies in Learning (iJET)
Abstract
2023
Authors
Carneiro, D; Palumbo, G;
Publication
NEW TRENDS IN DISRUPTIVE TECHNOLOGIES, TECH ETHICS AND ARTIFICIAL INTELLIGENCE, DITTET 2023
Abstract
In recent years, the EU has been pushing forward ground-breaking legislation covering new digital environments and services, with a strong focus on ethics and AI, including the Artificial Intelligence Act, the Digital Services Act, and the General Data Protection Regulation. This legislation is, however, often written in very general and high-level terms, leaving much room for interpretation and a gap concerning how it could or should realistically be implemented. In this paper we focus on the principle of transparency in the Digital Services Act: we discuss the transparency requirements set out in the regulation, identify the gaps, and propose concrete measures that can facilitate and guide its implementation.
2023
Authors
Novais, P; Inglada, VJ; Hornos, MJ; Satoh, I; Carneiro, D; Carneiro, J; Alonso, RS;
Publication
ISAmI
Abstract