Publications

2013

DETERMINANTS OF STUDENTS' WILLINGNESS TO PAY FOR VIOLENT CRIME REDUCTION

Authors
Teixeira, AAC; Soeiro, M;

Publication
SINGAPORE ECONOMIC REVIEW

Abstract
We apply the contingent valuation method to estimate how much a specific societal group, one relatively prone to falling victim to crime, is willing to pay to reduce the likelihood of becoming a victim of violent crime. Based on responses from 1122 students, we found that younger and female students are more inclined to pay to avoid violent crime. Students' field of study, cautious behavior, and strong opinions about policies and payment vehicles with the potential to reduce the risk of crime are key determinants of the willingness to pay.
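The kind of group comparison the abstract describes can be sketched as follows. This is a hedged illustration only: the data is synthetic and the field names (`gender`, `age`, `wtp`) are hypothetical, not the authors' actual survey variables or estimation model.

```python
# Illustrative comparison of stated willingness-to-pay (WTP) across
# respondent groups; synthetic data, hypothetical field names.

def mean_wtp(responses, key, value):
    """Average stated WTP over respondents whose `key` equals `value`."""
    group = [r["wtp"] for r in responses if r[key] == value]
    return sum(group) / len(group)

responses = [
    {"gender": "female", "age": 19, "wtp": 12.0},
    {"gender": "female", "age": 21, "wtp": 10.0},
    {"gender": "male",   "age": 20, "wtp": 7.0},
    {"gender": "male",   "age": 27, "wtp": 5.0},
]

print(mean_wtp(responses, "gender", "female"))  # prints 11.0
print(mean_wtp(responses, "gender", "male"))    # prints 6.0
```

A full contingent valuation study would, of course, regress WTP on all determinants jointly rather than compare group means.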

2013

An Institution for Alloy and Its Translation to Second-Order Logic

Authors
Neves, R; Madeira, A; Martins, MA; Barbosa, LS;

Publication
IRI (best papers)

Abstract
Lightweight formal methods, of which Alloy is a prime example, offer the rigour of mathematics without compromising simplicity of use and suitable tool support. In some cases, however, the verification of safety- or mission-critical software entails the need for more sophisticated technologies, typically based on theorem provers. This explains a number of attempts documented in the literature to connect Alloy to specific theorem provers. This chapter, however, takes a different perspective: instead of focusing on one more combination of Alloy with yet another prover, it lays out the foundations to fully integrate this system in the Hets platform, which supports a huge network of logics, logic translators and provers. This makes it possible for Alloy specifications to “borrow” the power of several non-dedicated proof systems. The chapter extends the authors’ previous work on this subject by developing in full detail the semantic foundations for this integration, including a formalisation of Alloy as an institution, and by introducing a new, more general translation of the latter to second-order logic.
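To illustrate the flavour of such a translation (this is a minimal sketch, not the chapter's actual construction), consider the hypothetical Alloy model `sig Node { next: set Node }` with the fact `all n: Node | some n.next`. Reading the signature as a unary predicate and the field as a binary relation, the fact can be rendered in second-order logic, where the field itself may be quantified as a relation variable:

```latex
\exists \mathit{next} \subseteq \mathit{Node} \times \mathit{Node}.\;
  \forall n.\, \bigl(\mathit{Node}(n) \rightarrow
    \exists m.\, (\mathit{Node}(m) \wedge \mathit{next}(n, m))\bigr)
```

Quantification over the relation `next` is what pushes the encoding beyond first-order logic.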

2013

Towards an accurate evaluation of deduplicated storage systems

Authors
Paulo, J; Reis, P; Pereira, J; Sousa, A;

Publication
COMPUTER SYSTEMS SCIENCE AND ENGINEERING

Abstract
Deduplication has proven to be a valuable technique for eliminating duplicate data in backup and archival systems and is now being applied to new storage environments with distinct requirements and performance trade-offs. Namely, deduplication systems are now targeting large-scale cloud computing storage infrastructures holding unprecedented data volumes with a significant share of duplicate content. It is, however, hard to assess the usefulness of deduplication in particular settings and which techniques provide the best results. In fact, existing disk I/O benchmarks follow simplistic approaches for generating data content, leading to unrealistic amounts of duplicates that do not evaluate deduplication systems accurately. Moreover, deduplication systems are now targeting heterogeneous storage environments, with specific duplication ratios, that benchmarks must also simulate. We address these issues with DEDISbench, a novel micro-benchmark for evaluating the disk I/O performance of block-based deduplication systems. As the main contribution, DEDISbench generates content by following realistic duplicate content distributions extracted from real datasets. Then, as a second contribution, we analyze and extract the duplicates found on three real storage systems, showing that DEDISbench can easily simulate several workloads. The usefulness of DEDISbench is shown by comparing it with the Bonnie++ and IOzone open-source disk I/O micro-benchmarks in assessing two open-source deduplication systems, Opendedup and Lessfs, using Ext4 as a baseline. Our results lead to novel insights on the performance of these file systems.
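The core idea of content-aware block generation can be sketched as follows. This is a hedged illustration, not DEDISbench's actual generator: DEDISbench draws from distributions extracted from real datasets, whereas here a single target duplicate ratio stands in for that distribution, and all parameter names are hypothetical.

```python
import random

def generate_blocks(n, duplicate_ratio, block_size=8, seed=42):
    """Generate n data blocks where roughly `duplicate_ratio` of them
    repeat earlier content; a stand-in for benchmarks that follow a
    target duplication distribution rather than writing random bytes."""
    rng = random.Random(seed)
    pool, blocks = [], []
    for _ in range(n):
        if pool and rng.random() < duplicate_ratio:
            blocks.append(rng.choice(pool))  # re-emit existing content
        else:
            fresh = bytes(rng.getrandbits(8) for _ in range(block_size))
            pool.append(fresh)
            blocks.append(fresh)
    return blocks

blocks = generate_blocks(1000, duplicate_ratio=0.4)
dup_fraction = 1 - len(set(blocks)) / len(blocks)
print(f"{dup_fraction:.0%} duplicate blocks")  # close to the 40% target
```

A benchmark built on naive random content would report a duplicate fraction near 0%, which is exactly the unrealistic behaviour the abstract criticises.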

2013

Lane Background Removal for the Classification of Thin-Layer Chromatography Images

Authors
Moreira, BM; Sousa, AV; Mendonça, AM; Campilho, A;

Publication
IMAGE ANALYSIS AND RECOGNITION

Abstract
This paper describes a methodology to remove the background of the lanes in Thin-Layer Chromatography (TLC) images, aiming at improving band detection and classification. The storage of the biological samples to be analyzed by TLC is usually done in plastic containers. Filter paper is an alternative that allows reduced costs and higher portability, but it increases the complexity of the image analysis stage due to lane background alteration. To overcome this problem, a negative control lane is included in every chromatographic plate. After the image preprocessing and lane detection stages, a background profile is generated by processing the negative control lane using the Discrete Wavelet Transform (DWT). This profile is then subtracted from the profiles of all other sample lanes in order to compensate for the data degradation introduced by filter paper usage. For assessing the proposed background removal process, 105 TLC lanes, with and without background, were used as input for three one-class classifiers. In all cases, the best results were achieved for the lanes after background removal.
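The subtraction step can be sketched as follows. This is a minimal illustration, assuming lane intensity profiles are equal-length lists of floats; the paper smooths the control lane with a proper DWT, for which a crude Haar-style approximation (detail coefficients discarded) stands in here.

```python
def haar_smooth(profile, levels=1):
    """Keep only Haar approximation coefficients for `levels` levels,
    then expand back to the original length (length must be divisible
    by 2**levels). A rough stand-in for DWT-based smoothing."""
    data = list(profile)
    for _ in range(levels):
        data = [(a + b) / 2 for a, b in zip(data[::2], data[1::2])]
    for _ in range(levels):
        data = [v for v in data for _ in (0, 1)]  # upsample by repetition
    return data

def remove_background(sample, control, levels=1):
    """Subtract the smoothed negative-control profile from a sample
    lane, clamping negative intensities to zero."""
    background = haar_smooth(control, levels)
    return [max(s - b, 0.0) for s, b in zip(sample, background)]

control = [2.0, 2.0, 2.0, 2.0, 2.0, 2.0, 2.0, 2.0]  # flat background lane
sample  = [2.0, 2.0, 6.0, 6.0, 2.0, 2.0, 2.0, 2.0]  # one band on background
print(remove_background(sample, control))
# prints [0.0, 0.0, 4.0, 4.0, 0.0, 0.0, 0.0, 0.0] — the band survives
```

Smoothing the control lane matters: it keeps the slowly varying background trend while suppressing noise that would otherwise be injected into every sample lane.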

2013

Eating behaviour patterns and BMI in Portuguese higher education students

Authors
Poínhos, Rui; Oliveira, Bruno; Correia, Flora;

Publication

Abstract
[Abstract]

2013

Random rules from data streams

Authors
Almeida, E; Kosina, P; Gama, J;

Publication
SAC

Abstract
Existing works suggest that random inputs and random features produce good results in classification. In this paper we study the problem of generating random rule sets from data streams. One of the most interpretable and flexible models for data stream mining prediction tasks is the Very Fast Decision Rules learner (VFDR). In this work we extend the VFDR algorithm with random rules from data streams. The proposed algorithm generates several rule sets, each associated with a set of N_att attributes. The proposed algorithm maintains all properties required when learning from stationary data streams: online and any-time classification, processing each example only once.
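The ensemble structure described above can be sketched as follows. This is a hedged illustration of the random-subset idea only, not the VFDR rule-induction algorithm: the class name and parameters are hypothetical, and per-subset rule learning is replaced by a simple class counter keyed on the projected attribute values.

```python
import random
from collections import Counter, defaultdict

class RandomRuleSets:
    """Ensemble sketch: each member learns on its own random subset of
    N_att attributes; prediction is a majority vote over members."""

    def __init__(self, n_sets, n_att, n_total, seed=0):
        rng = random.Random(seed)
        self.subsets = [tuple(sorted(rng.sample(range(n_total), n_att)))
                        for _ in range(n_sets)]
        self.counts = [defaultdict(Counter) for _ in self.subsets]

    def learn_one(self, x, y):
        # Single-pass, online update: each example is processed once.
        for subset, counts in zip(self.subsets, self.counts):
            key = tuple(x[i] for i in subset)  # project onto random attributes
            counts[key][y] += 1

    def predict_one(self, x):
        # Any-time prediction: vote with whatever has been learned so far.
        votes = Counter()
        for subset, counts in zip(self.subsets, self.counts):
            key = tuple(x[i] for i in subset)
            if counts[key]:
                votes[counts[key].most_common(1)[0][0]] += 1
        return votes.most_common(1)[0][0] if votes else None

model = RandomRuleSets(n_sets=5, n_att=2, n_total=4)
for x, y in [((1, 0, 1, 0), "a"), ((1, 0, 0, 0), "a"), ((0, 1, 1, 1), "b")]:
    model.learn_one(x, y)
print(model.predict_one((1, 0, 1, 0)))  # prints "a"
```

Projecting each member onto a different random attribute subset is what injects the diversity that the abstract's cited works associate with good classification results.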
