Publications

2017

Predicting the Situational Relevance of Health Web Documents

Authors
Oroszlanyova, M; Lopes, CT; Nunes, S; Ribeiro, C;

Publication
2017 12TH IBERIAN CONFERENCE ON INFORMATION SYSTEMS AND TECHNOLOGIES (CISTI)

Abstract
Relevance is usually estimated by search engines using document content, disregarding the user behind the search and the characteristics of the task. In this work, we look at relevance as framed in a situational context, calling it situational relevance, and analyze whether it is possible to predict it using document, user, and task characteristics. Using an existing dataset composed of health web documents, relevance judgments for information needs, and user and task characteristics, we build a multivariate prediction model for situational relevance. Our model has an accuracy of 77.17%. Our findings provide insights into features that could improve the estimation of relevance by search engines, helping to reconcile the systemic and situational views of relevance. In the near future, we will work on the automatic assessment of document, user, and task characteristics.
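
As a rough illustration of the kind of multivariate prediction the abstract describes, the sketch below trains a classifier on document, user, and task features. The features, data, and model choice are hypothetical stand-ins, not the paper's actual dataset or method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.random(n),            # document feature, e.g. readability score
    rng.integers(0, 2, n),    # user feature, e.g. health expertise (0/1)
    rng.integers(1, 6, n),    # task feature, e.g. complexity (1-5)
])
# Synthetic relevance judgments, loosely tied to the features.
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] / 5 + 0.2 * rng.random(n) > 0.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, model.predict(X_te)):.2%}")
```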

2017

A Fast and Verified Software Stack for Secure Function Evaluation

Authors
Almeida, JB; Barbosa, M; Barthe, G; Dupressoir, F; Grégoire, B; Laporte, V; Pereira, V;

Publication
CCS'17: PROCEEDINGS OF THE 2017 ACM SIGSAC CONFERENCE ON COMPUTER AND COMMUNICATIONS SECURITY

Abstract
We present a high-assurance software stack for secure function evaluation (SFE). Our stack consists of three components: (i) a verified compiler (CircGen) that translates C programs into Boolean circuits; (ii) a verified implementation of Yao's SFE protocol based on garbled circuits and oblivious transfer; and (iii) transparent application integration and communications via FRESCO, an open-source framework for secure multiparty computation (MPC). CircGen is a general-purpose tool that builds on CompCert, a verified optimizing compiler for C; it can be used in arbitrary Boolean circuit-based cryptography deployments. The security of our SFE protocol implementation is formally verified using EasyCrypt, a tool-assisted framework for building high-confidence cryptographic proofs, and it leverages a new formalization of garbled circuits based on the framework of Bellare, Hoang, and Rogaway (CCS 2012). We conduct a practical evaluation of our approach and conclude that it is competitive with state-of-the-art (unverified) approaches. Our work provides concrete evidence of the feasibility of building efficient, verified implementations of higher-level cryptographic systems. All our development is publicly available.
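
The garbled-circuit idea at the heart of Yao's protocol can be illustrated with a toy, single-gate example. The sketch below is didactic only and deliberately insecure in a place noted in the comments; it does not reflect the paper's verified stack (CircGen, EasyCrypt proofs, FRESCO integration).

```python
import hashlib
import os

def H(k1, k2):
    """Hash two labels into a 16-byte pad."""
    return hashlib.sha256(k1 + k2).digest()[:16]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

# Two random 16-byte labels per wire: index 0 encodes bit 0, index 1 bit 1.
A = [os.urandom(16) for _ in range(2)]   # input wire a
B = [os.urandom(16) for _ in range(2)]   # input wire b
C = [os.urandom(16) for _ in range(2)]   # output wire c

# Garbled table: the output label for each input combination, encrypted
# under the corresponding pair of input labels.
table = [xor(H(A[a], B[b]), C[a & b]) for a in (0, 1) for b in (0, 1)]

def evaluate(la, lb):
    # The evaluator holds one label per input wire and tries each row.
    # Insecure shortcut: recognising success via membership in C leaks
    # both output labels; real schemes use point-and-permute or a tag.
    for row in table:
        cand = xor(H(la, lb), row)
        if cand in C:
            return cand
    raise ValueError("no row decrypted")

# Evaluate AND(1, 0): the evaluator only ever sees opaque labels.
assert evaluate(A[1], B[0]) == C[1 & 0]
print("garbled AND gate evaluated correctly")
```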

2017

Superframe Duration Allocation Schemes to Improve the Throughput of Cluster-Tree Wireless Sensor Networks

Authors
Leao, E; Montez, C; Moraes, R; Portugal, P; Vasques, F;

Publication
SENSORS

Abstract
The use of Wireless Sensor Network (WSN) technologies is an attractive option to support wide-scale monitoring applications, such as those found in precision agriculture, environmental monitoring, and industrial automation. The IEEE 802.15.4/ZigBee cluster-tree topology is a suitable topology to build wide-scale WSNs. Despite some of its known advantages, including timing synchronisation and duty-cycle operation, cluster-tree networks may suffer from severe network congestion problems due to the convergecast pattern of their communication traffic. Therefore, the careful adjustment of the transmission opportunities (superframe durations) allocated to the cluster-heads is an important research issue. This paper proposes a set of proportional Superframe Duration Allocation (SDA) schemes, based on well-defined protocol and timing models and on either the message load imposed by child nodes (Load-SDA scheme) or the number of descendant nodes (Nodes-SDA scheme) of each cluster-head. The underlying reasoning is to adequately allocate transmission opportunities (superframe durations) and parametrize buffer sizes, in order to improve network throughput and avoid typical problems such as network congestion, high end-to-end communication delays, and messages discarded due to buffer overflows. Simulation assessments show how the proposed allocation schemes can clearly improve the operation of wide-scale cluster-tree networks.
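
The proportional idea behind the Load-SDA and Nodes-SDA schemes can be sketched as a simple weighted split of the available superframe time among cluster-heads. The numbers and function below are illustrative; the paper's actual protocol and timing models are more detailed.

```python
def proportional_sda(weights, total_sd_ms):
    """Split the available superframe time proportionally to per-cluster weights."""
    total = sum(weights.values())
    return {ch: total_sd_ms * w / total for ch, w in weights.items()}

# Load-SDA: weight each cluster-head by the message load of its child nodes.
load = {"CH1": 40, "CH2": 10, "CH3": 25}     # messages per cycle (illustrative)
print(proportional_sda(load, total_sd_ms=1000.0))

# Nodes-SDA: weight each cluster-head by its number of descendant nodes.
nodes = {"CH1": 12, "CH2": 3, "CH3": 8}
print(proportional_sda(nodes, total_sd_ms=1000.0))
```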

2017

Arrowhead framework core systems and services

Authors
Delsing, J; Eliasson, J; Albano, M; Varga, P; Ferreira, L; Derhamy, H; Hegedus, C; Pereira, PP; Carlsson, O;

Publication
IoT Automation: Arrowhead Framework

Abstract
Introduction: Chapter 2 discussed local clouds, followed by a local cloud automation architecture in Chapter 3. The automation architecture supports the implementation of local automation clouds. Such implementation is supported by the Arrowhead Framework and its core systems and services. © 2017 by Taylor & Francis Group, LLC.

2017

Early damage detection using multivariate data-driven approaches - Application to experimental data from a cable-stayed bridge

Authors
Sousa Tomé, E; Pimentel, M; Figueiras, J;

Publication
SHMII 2017 - 8th International Conference on Structural Health Monitoring of Intelligent Infrastructure, Proceedings

Abstract
The implementation of automatic, real-time data processing algorithms that reduce the large amount of monitoring data to a human-usable scale is often pointed out as a key step for increasing the value of Structural Health Monitoring (SHM). One of the reasons usually given for the low usability of structural monitoring data is that damage events tend to be masked by environmental and/or operational effects. Indeed, the robustness and accuracy of damage (or novelty) detection methods depend on how successfully the changes in the structural response due to damage can be discerned from normal environmental and operational effects. The process of removing the environmental and/or operational effects from the structural response is usually termed data normalisation. In this context, the present work describes the methodologies for data normalisation and novelty detection implemented in the Corgo Bridge, a cable-stayed bridge located in northern Portugal, recently opened to traffic, in which a long-term monitoring system has been installed. Data normalisation is accomplished by applying, alone and combined, two well-established multivariate statistical tools: multiple linear regression analysis and principal component analysis. The Hotelling T2 control chart is used to track the existence of abnormal values. The performance of the chosen data normalisation methods is evaluated and compared. The first year of data is used to establish the multivariate models, and the remaining data is used to validate the fitted models and assess their ability for novelty/damage detection. Since the bridge is new and sound, damage scenarios are numerically simulated.
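
A minimal sketch of the pipeline the abstract outlines, assuming synthetic sensor data: remove an environmental effect with principal component analysis and track the residuals with a Hotelling T2 control chart. The component count, threshold, and data are illustrative, not the bridge's.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
t = np.arange(2000)                                    # two "years" of daily data
temp = 10 + 15 * np.sin(2 * np.pi * t / 365)           # environmental driver
sensors = np.column_stack([temp + rng.normal(0, 0.5, t.size)
                           for _ in range(6)])         # 6 hypothetical sensors

# Fit PCA on the first year; discard the leading component(s), which
# capture the environmental variation, and keep the residual subspace
# as damage-sensitive features.
train = sensors[:1000]
mu, sd = train.mean(0), train.std(0)
Z = (train - mu) / sd
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
k = 1                                  # components attributed to environment
P = Vt[k:].T                           # projection onto residual subspace

R_train = Z @ P
cov_inv = np.linalg.inv(np.cov(R_train, rowvar=False))

def t2(X):
    """Hotelling T^2 score of each observation's residual."""
    R = ((X - mu) / sd) @ P
    return np.einsum('ij,jk,ik->i', R, cov_inv, R)

# Control limit from the chi-square approximation of T^2.
limit = stats.chi2.ppf(0.99, df=P.shape[1])
scores = t2(sensors[1000:])            # second year used for validation
print(f"alarms in validation data: {(scores > limit).mean():.1%}")
```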

2017

Multidimensional test coverage analysis: PARADIGM-COV tool

Authors
Paiva, ACR; Vilela, L;

Publication
CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS

Abstract
Currently, software tends to assume increasingly critical roles in our society, so assuring its quality becomes ever more crucial. There are several tools and processes of software testing that help increase the quality of virtually any type of software. One example is the so-called model-based testing (MBT) tools, which generate test cases from models. Pattern Based Graphical User Interface Testing (PBGT) is an example of a new MBT methodology that aims at systematizing and automating the Graphical User Interface (GUI) testing process. It is supported by a tool (PBGT Tool) that provides an integrated modeling and testing environment for crafting test models based on User Interface Test Patterns (UITP), using a GUI modeling Domain Specific Language (DSL) called PARADIGM. Most MBT tools have a configuration phase, in which test input data is provided manually by the tester, and this data influences the quality of the generated test suite. By adding coverage analysis to MBT tools, it is possible to give feedback that helps the tester define the configuration data needed to achieve the most valuable test suite possible and, ultimately, contribute to increasing the quality of the software. This paper presents a multidimensional test coverage analysis approach and tool (PARADIGM-COV), developed in the context of the PBGT project, that produces coverage information both over the PARADIGM model elements and during test case execution (to identify the parts of the model that were actually exercised). It also presents a case study illustrating the benefits of multidimensional analysis and assessing the overall test coverage approach.
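
Model-element coverage of the kind PARADIGM-COV reports can be illustrated with a toy computation: given the elements of a test model and the elements each executed test exercised, report the covered fraction and the missed parts. The names and structure below are illustrative, not the tool's real data model.

```python
# Elements of a hypothetical PARADIGM test model (UITP instances).
model_elements = {"Login", "Find", "Sort", "MasterDetail", "Call"}

# Elements actually exercised by each executed test case.
executed_tests = {
    "test_1": {"Login", "Find"},
    "test_2": {"Find", "Sort"},
}

exercised = set().union(*executed_tests.values())
coverage = len(exercised & model_elements) / len(model_elements)
print(f"model-element coverage: {coverage:.0%}")
print(f"not exercised: {sorted(model_elements - exercised)}")
```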
