Publications

2016

Learning and Ensembling Lexicographic Preference Trees with Multiple Kernels

Authors
Fernandes, K; Cardoso, JS; Palacios, H;

Publication
2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)

Abstract
We study the problem of learning lexicographic preferences on multi-attribute domains, and propose Rankdom Forests as a compact way to express preferences in learning-to-rank scenarios. We start by generalizing Conditional Lexicographic Preference Trees, introducing multiple kernels in order to handle non-categorical attributes. Then, we define a learning strategy for inferring lexicographic rankers from partial pairwise comparisons between options. Finally, a Lexicographic Ensemble is introduced to combine multiple weak partial rankers, Rankdom Forests being one such ensemble. We tested the performance of the proposed method on several datasets and obtained competitive results when compared with other lexicographic rankers.
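The core idea of a lexicographic ranker can be sketched in a few lines: options are compared attribute by attribute in a fixed importance order, and the decision falls on the first attribute where they differ. This is an illustrative sketch only, not the paper's kernelized tree or ensemble; the attribute names and preference orders below are hypothetical.

```python
def lex_compare(option_a, option_b, attr_order, preferred):
    """Return 1 if option_a is preferred, -1 if option_b is, 0 if tied.

    attr_order: attribute names from most to least important.
    preferred:  maps each attribute to its value ranking (best first).
    """
    for attr in attr_order:
        rank = preferred[attr]
        ra, rb = rank.index(option_a[attr]), rank.index(option_b[attr])
        if ra != rb:
            # First attribute where the options differ decides the comparison.
            return 1 if ra < rb else -1
    return 0

# Hypothetical example: choosing a flight by (price class, airline), price first.
order = ["price", "airline"]
prefs = {"price": ["low", "high"], "airline": ["A", "B"]}
a = {"price": "low", "airline": "B"}
b = {"price": "high", "airline": "A"}
print(lex_compare(a, b, order, prefs))  # a wins on the most important attribute
```

The paper's contribution replaces the categorical value rankings above with kernels, so that continuous attributes can be compared lexicographically as well.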

2016

Content-Based Image Retrieval by Metric Learning From Radiology Reports: Application to Interstitial Lung Diseases

Authors
Ramos, J; Kockelkorn, TTJP; Ramos, I; Ramos, R; Grutters, J; Viergever, MA; van Ginneken, B; Campilho, A;

Publication
IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS

Abstract
Content-based image retrieval (CBIR) is a search technology that could aid medical diagnosis by retrieving and presenting earlier reported cases related to the one being diagnosed. To retrieve relevant cases, CBIR systems depend on supervised learning to map low-level image contents to high-level diagnostic concepts. However, annotation by medical doctors for training and evaluation purposes is a difficult and time-consuming task, which restricts the supervised learning phase to specific CBIR problems of well-defined clinical applications. This paper proposes a new technique that automatically learns the similarity between exams from textual distances extracted from radiology reports, thereby successfully reducing the number of annotations needed. Our method first infers the relation between patients by using information retrieval techniques to determine the textual distances between patient radiology reports. These distances are subsequently used to supervise a metric learning algorithm that transforms the image space according to the textual distances. CBIR systems with different image descriptions and different levels of medical annotations were evaluated, with and without supervision from textual distances, using a database of computed tomography scans of patients with interstitial lung diseases. The proposed method consistently improves CBIR mean average precision, with improvements that can reach 38%, and more marked gains for small annotation sets. Given the overall availability of radiology reports in picture archiving and communication systems, the proposed approach can be broadly applied to CBIR systems in different medical problems, and may facilitate the introduction of CBIR in clinical practice.
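The supervision signal here is a textual distance between radiology reports. As a minimal sketch of what such a distance might look like (not the paper's actual information-retrieval pipeline, and with made-up report snippets), one can compute a cosine distance over bag-of-words vectors:

```python
from collections import Counter
import math

def text_distance(report_a, report_b):
    """Cosine distance between bag-of-words vectors of two report texts."""
    va = Counter(report_a.lower().split())
    vb = Counter(report_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return 1.0 - dot / (na * nb)

# Hypothetical report snippets:
r1 = "ground glass opacities in both lungs"
r2 = "ground glass opacities in left lung"
r3 = "normal chest no abnormalities"
# Reports describing similar findings end up closer:
assert text_distance(r1, r2) < text_distance(r1, r3)
```

In the paper, distances of this kind then supervise a metric-learning step that warps the image feature space so that image distances agree with the report distances, removing the need for manual pairwise annotations.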

2016

Dynamic community detection in evolving networks using locality modularity optimization

Authors
Cordeiro, M; Sarmento, RP; Gama, J;

Publication
SOCIAL NETWORK ANALYSIS AND MINING

Abstract
The amount and variety of data generated by today's online social and telecommunication network services are changing the way researchers analyze social networks. Coping with fast-evolving networks with millions of nodes and edges is, among other factors, their main challenge. Community detection algorithms also have to be updated or improved under these conditions. Previous state-of-the-art algorithms based on modularity optimization (e.g. the Louvain algorithm) provide fast, efficient and robust community detection on large static networks. Nonetheless, due to the high computational complexity of these algorithms, using batch techniques on dynamic networks requires performing community detection over the whole network at each evolution step. This proves to be computationally expensive and unstable in terms of community tracking. Our contribution is a novel technique that keeps the community structure up-to-date as nodes and edges are added or removed. The proposed algorithm performs a local modularity optimization that maximizes the modularity gain function only for those communities where nodes and edges were edited, keeping the rest of the network unchanged. The effectiveness of our algorithm is demonstrated by comparison with other state-of-the-art community detection algorithms with respect to Newman's Modularity, Modularity with Split Penalty, Modularity Density, number of detected communities and running time.
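The modularity gain function that this local optimization maximizes is, in the standard Louvain formulation, the change in modularity from moving a node i into a community C. A minimal sketch of that formula (generic Louvain quantities, not the paper's incremental bookkeeping; the numbers in the example are made up):

```python
def modularity_gain(sigma_in, sigma_tot, k_i, k_i_in, m):
    """Louvain-style modularity gain from moving node i into community C.

    sigma_in : sum of weights of links inside C
    sigma_tot: sum of weights of links incident to nodes in C
    k_i      : weighted degree of node i
    k_i_in   : sum of weights of links from i to nodes in C
    m        : total edge weight of the graph
    """
    after = (sigma_in + 2 * k_i_in) / (2 * m) - ((sigma_tot + k_i) / (2 * m)) ** 2
    before = sigma_in / (2 * m) - (sigma_tot / (2 * m)) ** 2 - (k_i / (2 * m)) ** 2
    return after - before

# Hypothetical graph quantities: a positive gain means the move pays off.
print(modularity_gain(sigma_in=4, sigma_tot=8, k_i=3, k_i_in=2, m=10))
```

The locality of the paper's algorithm comes from evaluating this gain only for nodes in communities touched by an edit, instead of sweeping the entire network at every evolution step.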

2016

AdapTA: Adaptive Timeslot Allocation scheme for IEEE 802.15.4e LLDN mode

Authors
Bitencort, B; Moraes, R; Portugal, P; Vasques, F;

Publication
2016 IEEE 14TH INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS (INDIN)

Abstract
The LLDN (Low Latency Deterministic Network) mode is an IEEE 802.15.4e amendment specifically designed for industrial applications requiring low latency and a low loss rate. It is based on a static TDMA scheme composed of fixed-size slots. One of its limitations concerns the support of messages with different sizes and different periodicities. In this paper, a slot allocation scheme is proposed that enables the support of heterogeneous message streams. The rationale is to compute a suitable timeslot size for each communication device, enabling adaptive control of the superframe without changing the LLDN standard. This paper shows that it is possible to accommodate heterogeneous message streams while maintaining low cycle times when transmitting messages with variable payloads.
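The benefit of adapting the timeslot size can be illustrated with a back-of-the-envelope calculation. This is a sketch under assumed numbers, not the paper's scheme or the standard's exact timing parameters: the overhead constant and payload sizes below are hypothetical, and only the 250 kbit/s data rate comes from IEEE 802.15.4.

```python
BITRATE_KBPS = 250     # nominal IEEE 802.15.4 data rate
OVERHEAD_BYTES = 15    # assumed per-frame overhead (hypothetical value)

def slot_time_ms(payload_bytes):
    """Airtime of one frame, in milliseconds, at the assumed bitrate."""
    return (payload_bytes + OVERHEAD_BYTES) * 8 / BITRATE_KBPS

def cycle_time_ms(payloads):
    """Superframe cycle time: one slot per device, each sized to its payload."""
    return sum(slot_time_ms(p) for p in payloads)

# Three devices with heterogeneous payloads (bytes), vs. fixed slots
# all sized for the worst-case 100-byte payload:
adapted = cycle_time_ms([10, 40, 100])
fixed = cycle_time_ms([100, 100, 100])
print(adapted, fixed)  # the adapted superframe has the shorter cycle
```

Sizing each slot to its device's actual payload, rather than to the largest message in the network, is what keeps the cycle time low when streams are heterogeneous.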

2016

Measures for Combining Prediction Intervals Uncertainty and Reliability in Forecasting

Authors
Almeida, V; Gama, J;

Publication
PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON COMPUTER RECOGNITION SYSTEMS, CORES 2015

Abstract
In this paper we propose a new methodology for evaluating prediction intervals (PIs). Typically, PIs are evaluated with reference to confidence values. However, other metrics should be considered, since high confidence values are associated with overly wide intervals that convey little information and are of no use for decision-making. We propose to compare the error distribution (predictions outside the interval) with the maximum mean absolute error (MAE) allowed by the confidence limits. Throughout this paper, PIs based on neural networks for short-term load forecasting are compared using two different strategies: (1) the dual perturb and combine (DPC) algorithm and (2) conformal prediction. We demonstrate that, depending on the real scenario (e.g., time of day), different algorithms perform better. The main contribution is the identification of high uncertainty levels in forecasts, which can guide decision-makers away from risky actions under uncertain conditions. Small errors mean that decisions can be made more confidently, with less chance of confronting an unexpected future condition.
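The motivation for looking beyond coverage alone can be made concrete: an interval evaluation should report both how often the target falls inside the interval and how wide the interval is, since near-perfect coverage is trivially achievable with uselessly wide bounds. A minimal sketch of such a joint evaluation (generic metrics with made-up values, not the paper's exact error-distribution/MAE comparison):

```python
def evaluate_intervals(y_true, lower, upper):
    """Return (coverage, mean width) for a set of prediction intervals."""
    n = len(y_true)
    # Coverage: fraction of targets that fall inside their interval.
    coverage = sum(l <= y <= u for y, l, u in zip(y_true, lower, upper)) / n
    # Mean width: wide intervals inflate coverage but carry little information.
    mean_width = sum(u - l for l, u in zip(lower, upper)) / n
    return coverage, mean_width

# Hypothetical short-term load forecasts with their interval bounds:
y = [10.0, 12.0, 11.0, 15.0]
lo = [9.0, 11.5, 10.0, 11.0]
hi = [11.0, 13.0, 12.0, 14.0]
cov, width = evaluate_intervals(y, lo, hi)
print(cov, width)  # coverage is 0.75: the last target falls outside its interval
```

Comparing two PI methods on both numbers at once exposes the trade-off the abstract describes: a method can only be preferred if it holds coverage without paying for it in interval width.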

2016

Odd-even Pole-pole array and 3D resistivity surveys in urban and historical areas

Authors
Almeida, F; Barraca, N; Moura, R; Matias, MJS;

Publication
22nd European Meeting of Environmental and Engineering Geophysics, Near Surface Geoscience 2016

Abstract
Modern and historical buildings may show some degree of subsidence resulting from foundation deterioration and local geological conditions. Hence, building stability can be affected and restoration plans must be envisaged. Resistivity methods have been used to investigate local conditions, providing 3D images of the soil under man-made structures and thereby contributing to the delimitation of hazardous areas and pathologies. However, these techniques require the deployment of a grid of electrodes, which can be difficult to accomplish because of physical limitations and because the buildings themselves must not be damaged. To overcome these problems, special arrays have been used (L, Corner, Square arrays, etc.). Herein, it is proposed to use the "Odd-Even Pole-Pole Array" to study the ground under a contemporary building and under a 14th-century Abbey of high historical value, both showing evidence of subsidence. Field data quality is also addressed, and it is proposed to identify low-quality data to be expunged so that modelling is improved. It is also shown how to estimate resistivity values from data quality tests, to carry out further zonation, locate hazardous areas and enhance modelling.
