Publications

2024

Enhancing Forest Fire Detection and Monitoring Through Satellite Image Recognition: A Comparative Analysis of Classification Algorithms Using Sentinel-2 Data

Authors
Brito, T; Pereira, AI; Costa, P; Lima, J;

Publication
Communications in Computer and Information Science - Optimization, Learning Algorithms and Applications

Abstract

2024

Image Transfer over MQTT in IoT: Message Segmentation and Encryption for Remote Indicator Panels

Authors
Valente, D; Brito, T; Correia, M; Carvalho, JA; Lima, J;

Publication
Communications in Computer and Information Science - Optimization, Learning Algorithms and Applications

Abstract

2024

Sample Size Analysis for a Production Line Study of Time

Authors
da Silva, MI; Vaz, CB;

Publication
Lecture Notes in Mechanical Engineering

Abstract
Setting labor standards is an important topic in operational and strategic planning, and it requires the establishment of time studies. This paper applies a statistical method to define the sample size needed to obtain a reliable cycle time for a real industrial process. The case study considers a welding process performed by a single operator who loads and unloads components on 4 different welding machines. Performing the time study requires continuous data collection on the production line, measuring the time the operator takes to perform the task. To facilitate the measurements, the task is divided into small elements with visible start and end points, called Measurement Points, to which the measurement process is applied. The statistical method then determines the sample size of observations needed to calculate a reliable cycle time. For the welding process presented, the sample size defined through the statistical method is 20. These time observations of the task are collected continuously to obtain a reliable cycle time for this welding process. The time study can be implemented in a similar way in other industrial processes. © 2024, The Author(s), under exclusive license to Springer Nature Switzerland AG.
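A common way to size such a time study is the standard sample-size formula n = (z·s / (k·x̄))², where x̄ and s are the mean and standard deviation of a pilot set of measurements, z is the confidence coefficient, and k is the desired relative precision. The sketch below illustrates this textbook formula with hypothetical pilot data; it is not necessarily the exact procedure or data used in the paper:

```python
import math
import statistics

def required_sample_size(times, z=1.96, precision=0.05):
    """Observations needed so the mean cycle time is within
    ±precision (relative) at the confidence level implied by z
    (z = 1.96 corresponds to roughly 95% confidence).
    Textbook time-study formula: n = (z*s / (precision*mean))**2."""
    mean = statistics.mean(times)
    s = statistics.stdev(times)  # sample standard deviation of the pilot
    return math.ceil((z * s / (precision * mean)) ** 2)

# Hypothetical pilot measurements of one task element, in seconds
pilot = [35, 44, 38, 47, 41, 33, 45, 39, 48, 36]
print(required_sample_size(pilot))
```

The more the pilot measurements vary relative to their mean, the more observations the formula demands before the cycle time can be considered reliable.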

2024

Automatic Quality Assessment of Wikipedia Articles: A Systematic Literature Review

Authors
Moas, PM; Lopes, CT;

Publication
ACM COMPUTING SURVEYS

Abstract
Wikipedia is the world's largest online encyclopedia, but maintaining article quality through collaboration is challenging. Wikipedia designed a quality scale, but with such a manual assessment process, many articles remain unassessed. We review existing methods for automatically measuring the quality of Wikipedia articles, identifying and comparing machine learning algorithms, article features, quality metrics, and the datasets used, examining 149 distinct studies and exploring commonalities and gaps among them. The literature is extensive, and the approaches follow past technological trends. However, machine learning is still not widely used by Wikipedia, and we hope that our analysis helps future researchers change that reality.

2024

Condition Invariance for Autonomous Driving by Adversarial Learning

Authors
e Silva, DT; Cruz, PM;

Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract
Object detection is a crucial task in autonomous driving, where domain shift between the training and the test set is one of the main reasons behind the poor performance of a detector when deployed. Some erroneous priors may be learned from the training set, therefore a model must be invariant to conditions that might promote such priors. To tackle this problem, we propose an adversarial learning framework consisting of an encoder, an object-detector, and a condition-classifier. The encoder is trained to deceive the condition-classifier and aid the object-detector as much as possible throughout the learning stage, in order to obtain highly discriminative features. Experiments showed that this framework is not very competitive regarding the trade-off between precision and recall, but it does improve the ability of the model to detect smaller objects and some object classes. © 2024, Springer Nature Switzerland AG.
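The adversarial trade-off described in the abstract can be caricatured with a toy scalar model (a hypothetical sketch, not the paper's architecture): one encoder feature feeds both heads, each head descends its own loss, and the condition classifier's gradient is reversed before it reaches the encoder, pushing the feature toward condition-invariance.

```python
# Toy gradient-reversal update. Hypothetical scalar model:
# encoder feature z = w*x, detector output d = a*z,
# condition-classifier output c = b*z, squared-error losses.
def adversarial_step(w, a, b, x, y_det, y_cond, lr=0.1, lam=1.0):
    z = w * x
    d, c = a * z, b * z
    g_det = 2 * (d - y_det)   # dL_det/dd
    g_cls = 2 * (c - y_cond)  # dL_cls/dc
    # Each head descends its own loss.
    a_new = a - lr * g_det * z
    b_new = b - lr * g_cls * z
    # Encoder: detector gradient minus the REVERSED classifier gradient,
    # so the encoder helps the detector while fooling the classifier.
    w_new = w - lr * (g_det * a - lam * g_cls * b) * x
    return w_new, a_new, b_new

w, a, b = adversarial_step(1.0, 1.0, 1.0, x=1.0, y_det=0.0, y_cond=0.0)
```

With these symmetric starting values the two head gradients cancel exactly at the encoder, so the head weights shrink while the encoder weight stays put, which is the gradient-reversal mechanism in miniature.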

2024

A cooperative coevolutionary hyper-heuristic approach to solve lot-sizing and job shop scheduling problems using genetic programming

Authors
Zeiträg, Y; Figueira, JR; Figueira, G;

Publication
INTERNATIONAL JOURNAL OF PRODUCTION RESEARCH

Abstract
Lot-sizing and scheduling in a job shop environment is a fundamental problem that appears in many industrial settings. The problem is very complex, and solutions are often needed fast. Although many solution methods have been proposed, with increasingly better results, their computational times are not suitable for decision-makers who want solutions instantly. Therefore, we propose a novel greedy heuristic to efficiently generate production plans and schedules of good quality. The main innovation of our approach is the incorporation of a simulation-based technique, which directly generates schedules while simultaneously determining lot sizes. By utilising priority rules, this unique feature enables us to address the complexity of job shop scheduling environments and ensures the feasibility of the resulting schedules. Using a selection of well-known rules from the literature, experiments on a variety of shop configurations and complexities showed that the proposed heuristic is able to obtain solutions with an average gap to Cplex of 4.12%. To further improve the proposed heuristic, a cooperative coevolutionary genetic programming-based hyper-heuristic has been developed, reducing the average gap to Cplex to 1.92%. These solutions are generated in a small fraction of a second, regardless of the size of the instance.
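The simulation-based decoding the abstract describes, building a schedule directly by applying a priority rule at each dispatching decision, can be illustrated with a minimal job-shop sketch. The code below is a hypothetical example using the classic SPT (shortest processing time) rule; it is not the authors' heuristic, which additionally decides lot sizes and evolves the rules by genetic programming:

```python
def spt_schedule(jobs):
    """jobs: one operation list per job, each operation a (machine, duration)
    pair that must run in order. Builds a schedule by repeatedly dispatching,
    among the next unscheduled operation of every job, the one with the
    shortest processing time (the SPT priority rule)."""
    next_op = [0] * len(jobs)            # index of each job's next operation
    job_ready = [0.0] * len(jobs)        # when each job's previous op finishes
    machine_free = {}                    # when each machine becomes available
    schedule = []                        # (job, op, machine, start, end)
    while any(next_op[j] < len(ops) for j, ops in enumerate(jobs)):
        # Candidates: the next operation of every unfinished job.
        cands = [(jobs[j][next_op[j]][1], j) for j in range(len(jobs))
                 if next_op[j] < len(jobs[j])]
        dur, j = min(cands)              # SPT: shortest duration dispatches
        machine, _ = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_free.get(machine, 0.0))
        schedule.append((j, next_op[j], machine, start, start + dur))
        job_ready[j] = machine_free[machine] = start + dur
        next_op[j] += 1
    return schedule

jobs = [[("M1", 3), ("M2", 2)],   # job 0: M1 then M2
        [("M2", 4), ("M1", 1)]]   # job 1: M2 then M1
sched = spt_schedule(jobs)
makespan = max(end for *_, end in sched)
```

Because the rule is evaluated inside the simulation, every generated schedule is feasible by construction, which is the property that makes rule-based decoding attractive for hyper-heuristics that search over the rules rather than over the schedules themselves.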
