Publications

2017

A Fast and Verified Software Stack for Secure Function Evaluation

Authors
Almeida, JB; Barbosa, M; Barthe, G; Dupressoir, F; Gregoire, B; Laporte, V; Pereira, V;

Publication
CCS'17: PROCEEDINGS OF THE 2017 ACM SIGSAC CONFERENCE ON COMPUTER AND COMMUNICATIONS SECURITY

Abstract
We present a high-assurance software stack for secure function evaluation (SFE). Our stack consists of three components: i. a verified compiler (CircGen) that translates C programs into Boolean circuits; ii. a verified implementation of Yao's SFE protocol based on garbled circuits and oblivious transfer; and iii. transparent application integration and communications via FRESCO, an open-source framework for secure multiparty computation (MPC). CircGen is a general-purpose tool that builds on CompCert, a verified optimizing compiler for C. It can be used in arbitrary Boolean circuit-based cryptography deployments. The security of our SFE protocol implementation is formally verified using EasyCrypt, a tool-assisted framework for building high-confidence cryptographic proofs, and it leverages a new formalization of garbled circuits based on the framework of Bellare, Hoang, and Rogaway (CCS 2012). We conduct a practical evaluation of our approach and conclude that it is competitive with state-of-the-art (unverified) approaches. Our work provides concrete evidence of the feasibility of building efficient, verified implementations of higher-level cryptographic systems. All our development is publicly available.
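To give a flavour of the garbled-circuit technique at the heart of the protocol, the sketch below garbles a single AND gate in the textbook style: one random label per truth value of each wire, with the output labels encrypted under each pair of input labels. It is a minimal illustration only, not the paper's verified EasyCrypt development; the hash-based encryption and the label length are assumptions.

    import os, hashlib

    KEYLEN = 16  # wire-label length in bytes (illustrative choice)

    def garble_and_gate():
        # One random label per truth value of each wire.
        labels = {w: (os.urandom(KEYLEN), os.urandom(KEYLEN))
                  for w in ("a", "b", "out")}
        table = []
        for bit_a in (0, 1):
            for bit_b in (0, 1):
                # Derive a one-time pad from the pair of input labels.
                pad = hashlib.sha256(labels["a"][bit_a] +
                                     labels["b"][bit_b]).digest()[:KEYLEN]
                out_label = labels["out"][bit_a & bit_b]
                table.append(bytes(x ^ y for x, y in zip(pad, out_label)))
        # A real garbler would also randomly permute the four rows
        # (or use point-and-permute) before sending the table.
        return labels, table

An evaluator holding exactly one label per input wire can recompute the corresponding pad and recover exactly one output label, learning nothing about the other wire values.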

2017

Combining dataflow applications and real-time task sets on multi-core platforms

Authors
Ali, HI; Akesson, B; Pinho, LM;

Publication
Proceedings of the 20th International Workshop on Software and Compilers for Embedded Systems, SCOPES 2017

Abstract
Future real-time embedded systems will increasingly run mixed application models with timing constraints on the same multi-core platform: dataflow applications with timing constraints and traditional real-time applications modelled as independent arbitrary-deadline tasks. These systems require guarantees that all running applications satisfy their timing constraints. To be cost-efficient in terms of design, they also require mapping strategies that maximize the use of system resources and thus reduce the overall cost. This work proposes an approach to integrate mixed application models (dataflow and traditional real-time applications) with timing requirements on the same multi-core platform. It comprises three main algorithms: 1) Slack-Based Merging, 2) Timing Parameter Extraction, and 3) Communication-Aware Mapping. Together, these algorithms enable the mapping and scheduling of mixed application models in embedded real-time systems. The complete approach and the three algorithms have been validated through proofs and experimental evaluation.
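As a hint of what a timing-parameter-extraction step can look like, the sketch below derives periodic-task parameters for dataflow actors from a required iteration period. This is an illustrative reduction under assumed inputs (a repetition vector and an iteration period), not the paper's algorithm, which also has to account for offsets, deadlines and communication.

    def extract_timing_parameters(repetitions, iteration_period):
        # If one graph iteration must complete every `iteration_period`
        # time units, an actor firing q times per iteration can be
        # modelled as a periodic task with period iteration_period / q.
        return {actor: iteration_period / q
                for actor, q in repetitions.items()}

    # e.g. actors A and B fire 2 and 1 times per 100 ms iteration:
    print(extract_timing_parameters({"A": 2, "B": 1}, 100.0))
    # -> {'A': 50.0, 'B': 100.0}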

2017

Superframe Duration Allocation Schemes to Improve the Throughput of Cluster-Tree Wireless Sensor Networks

Authors
Leao, E; Montez, C; Moraes, R; Portugal, P; Vasques, F;

Publication
SENSORS

Abstract
The use of Wireless Sensor Network (WSN) technologies is an attractive option to support wide-scale monitoring applications, such as those found in precision agriculture, environmental monitoring and industrial automation. The IEEE 802.15.4/ZigBee cluster-tree topology is a suitable topology to build wide-scale WSNs. Despite some of its known advantages, including timing synchronisation and duty-cycle operation, cluster-tree networks may suffer from severe network congestion due to the convergecast pattern of their communication traffic. The careful adjustment of the transmission opportunities (superframe durations) allocated to the cluster-heads is therefore an important research issue. This paper proposes a set of proportional Superframe Duration Allocation (SDA) schemes, based on well-defined protocol and timing models and on either the message load imposed by the child nodes of each cluster-head (Load-SDA scheme) or its number of descendant nodes (Nodes-SDA scheme). The underlying reasoning is to adequately allocate transmission opportunities and parametrize buffer sizes in order to improve the network throughput and avoid typical problems such as network congestion, high end-to-end communication delays and messages discarded due to buffer overflows. Simulation assessments show how the proposed allocation schemes can clearly improve the operation of wide-scale cluster-tree networks.
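The proportional idea behind such schemes can be sketched in a few lines: split an available beacon-interval budget among cluster-heads in proportion to the load their children impose. The function below is an assumed simplification for illustration; the actual schemes quantize allocations to IEEE 802.15.4 superframe orders and work from the protocol and timing models.

    def load_proportional_sda(loads, total_budget):
        # Each cluster-head gets a share of the budget proportional
        # to the message load imposed by its child nodes.
        total_load = sum(loads.values())
        return {ch: total_budget * load / total_load
                for ch, load in loads.items()}

    # e.g. three cluster-heads sharing a 100-unit beacon interval:
    print(load_proportional_sda({"CH1": 30, "CH2": 50, "CH3": 20}, 100))
    # -> {'CH1': 30.0, 'CH2': 50.0, 'CH3': 20.0}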

2017

Multidimensional test coverage analysis: PARADIGM-COV tool

Authors
Paiva, ACR; Vilela, L;

Publication
CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS

Abstract
Currently, software assumes increasingly critical roles in our society, so assuring its quality becomes ever more crucial. There are several software testing tools and processes that help increase the quality of virtually any type of software. One example is the class of so-called model-based testing (MBT) tools, which generate test cases from models. Pattern Based Graphical User Interface Testing (PBGT) is a new MBT methodology that aims at systematizing and automating the Graphical User Interface (GUI) testing process. It is supported by a tool (PBGT Tool) which provides an integrated modeling and testing environment for crafting test models based on User Interface Test Patterns (UITP), using a GUI modeling Domain Specific Language (DSL) called PARADIGM. Most MBT tools have a configuration phase, in which test input data is provided manually by the tester, and this data influences the quality of the generated test suite. By adding coverage analysis to MBT tools, it is possible to give feedback that helps the tester define the configuration data needed to achieve the most valuable test suite possible and, ultimately, to increase the quality of the software. This paper presents a multidimensional test coverage analysis approach and tool (PARADIGM-COV), developed in the context of the PBGT project, that produces coverage information both over the PARADIGM model elements and during test case execution (to identify the parts of the model that were actually exercised). It also presents a case study illustrating the benefits of multidimensional analysis and assessing the overall test coverage approach.
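One of the dimensions such a tool reports, model-element coverage, reduces to a simple set computation: which model elements did the executed tests actually touch. The sketch below is an assumed simplification for illustration, not the PARADIGM-COV implementation, which tracks several coverage dimensions.

    def model_coverage(model_elements, exercised):
        # Fraction of model elements touched by executed test cases,
        # plus the uncovered elements to report back to the tester.
        covered = model_elements & exercised
        return len(covered) / len(model_elements), model_elements - covered

    # e.g. three of four UITP instances exercised (names hypothetical):
    ratio, missed = model_coverage(
        {"Login", "Find", "Sort", "MasterDetail"},
        {"Login", "Find", "Sort"})
    print(ratio, missed)  # 0.75 {'MasterDetail'}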

2017

A Data Mining Approach for Multivariate Outlier Detection in Postprocessing of Multitemporal InSAR Results

Authors
Bakon, M; Oliveira, I; Perissin, D; Sousa, JJ; Papco, J;

Publication
IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING

Abstract
Displacement maps from multitemporal InSAR (MTI) are usually noisy and fragmented. Thresholding on ensemble coherence is a common practice for identifying radar scatterers that are less affected by decorrelation noise. Thresholding on coherence might, however, cause loss of information over areas undergoing more complex deformation scenarios. If the discrepancies in areas of moderate coherence share similar behavior, it is important to take their spatial correlation into account for correct inference. The information over low-coherence areas might then be used much as coherence is used in thematic mapping applications such as change detection. We propose an approach based on data mining and statistical procedures for mitigating the impact of outliers in MTI results. Our approach minimizes outliers in the final results while preserving spatial and statistical dependence among observations. Tests on the monitoring of slope failures and undermined areas performed in this work have shown that this is beneficial: 1) for better evaluation of low-coherence scatterers that are commonly discarded by the standard thresholding procedure; 2) for tackling outlying observations with extremes in any variable; 3) for improving the spatial density of standard persistent scatterers; 4) for the evaluation of areas undergoing more complex deformation scenarios; and 5) for visualization purposes.
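A standard building block for detecting "extremes in any variable" is the squared Mahalanobis distance tested against a chi-square cutoff; the sketch below applies it to per-scatterer attribute vectors. This is a generic multivariate outlier test shown for illustration, not the paper's pipeline, which combines such statistics with data-mining steps that preserve spatial dependence.

    import numpy as np
    from scipy.stats import chi2

    def mahalanobis_outliers(X, alpha=0.025):
        # X: (n_points, n_features) array of per-scatterer attributes,
        # e.g. velocity, ensemble coherence, residual RMS (assumed).
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        d = X - mu
        d2 = np.einsum("ij,jk,ik->i", d, cov_inv, d)  # squared distances
        # Under multivariate normality, d2 follows a chi-square
        # distribution with n_features degrees of freedom; flag the tail.
        return d2 > chi2.ppf(1 - alpha, df=X.shape[1])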

2017

Common Practices for Integrating Industrial Agents and Low Level Automation Functions

Authors
Leitao, P; Karnouskos, S; Ribeiro, L; Moutis, P; Barbosa, J; Strasser, TI;

Publication
IECON 2017 - 43RD ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY

Abstract
Industrial agent technologies have been integrated with the key elements coupling industrial systems and software logic, which is an important issue in the design of cyber-physical systems. Although several efforts have been made over the last decades to integrate software agents with physical hardware devices, and some commonalities can be observed among the existing practices, no uniform approach has emerged. This work presents an empirical survey of existing practices in three application areas, namely factory automation, power & energy systems, and building automation. It identifies common issues and discusses how these areas integrate low-level automation functions by utilizing industrial agents. The surveyed practices reveal high diversity, customized traditional integration focusing mostly on I/O functions, a lack of security considerations, and an overall approach in which agents are coupled to devices rather than embedded in them.
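The coupled pattern the survey finds dominant can be pictured as an agent that accesses the device only through an I/O interface. The classes below are purely illustrative stand-ins (assumed names, a simulated sensor); real couplings go through technologies such as OPC UA, Modbus or vendor APIs.

    import random

    class DeviceInterface:
        # Stand-in for a low-level automation function behind a
        # PLC I/O point; read/write here are simulated.
        def read(self):
            return random.random()         # simulated sensor value
        def write(self, value):
            print(f"actuator <- {value}")  # simulated actuator command

    class IndustrialAgent:
        # 'Coupled' integration: the agent runs beside the device
        # and talks to it only through the I/O interface.
        def __init__(self, device):
            self.device = device
        def step(self):
            if self.device.read() > 0.8:   # illustrative decision rule
                self.device.write(0)       # simple corrective action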
