Publications

2014

TEC4SEA - A Modular Platform for Research, Test and Validation of Technologies Supporting a Sustainable Blue Economy

Authors
Monica, P; Martins, A; Olivier, A; Matos, A; Almeida, JM; Cruz, N; Alves, JC; Salgado, H; Pessoa, L; Jorge, P; Campos, R; Ricardo, M; Pinho, C; Silva, A; Jesus, S; Silva, E;

Publication
2014 OCEANS - ST. JOHN'S

Abstract
This paper presents the TEC4SEA research infrastructure created in Portugal to support research, development, and validation of marine technologies. It is a multidisciplinary open platform, capable of supporting research, development, and test of marine robotics, telecommunications, and sensing technologies for monitoring and operating in the ocean environment. Due to the installed research facilities and its privileged geographic location, it allows fast access to the deep sea and can support multidisciplinary research, enabling full validation and evaluation of technological solutions designed for the ocean environment. It is a vertically integrated infrastructure, in the sense that it possesses a set of skills and resources which range from pure conceptual research to field deployment missions, with strong industrial and logistic capacities in the middle tier of prototype production. TEC4SEA is open to the entire scientific and enterprise community, with a free access policy for researchers affiliated with the research units that ensure its maintenance and sustainability. The paper describes the infrastructure in detail, and discusses associated research programs, providing a strategic vision for deep sea research initiatives, within the context of both the Portuguese National Ocean Strategy and European Strategy frameworks.

2014

Automatic detection of the carotid lumen axis in B-mode ultrasound images

Authors
Rocha, R; Silva, J; Campilho, A;

Publication
COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE

Abstract
A new approach is introduced for the automatic detection of the lumen axis of the common carotid artery in B-mode ultrasound images. The image is smoothed using a Gaussian filter and then a dynamic programming scheme extracts the dominant paths of local minima of the intensity and the dominant paths of local maxima of the gradient magnitude with the gradient pointing downwards. Since these paths are possible estimates of the lumen axis and the far wall of a blood vessel, respectively, they are grouped together into pairs. Then, a pattern of two features is computed from each pair of paths and used as input to a linear discriminant classifier in order to select the pair of paths that corresponds to the common carotid artery. The estimated lumen axis is the path of local minima of the intensity that belongs to the selected pair of paths. The proposed method is suited to real-time processing; no user interaction is required, and the number of parameters is minimal and easy to determine. The validation was performed using two datasets, with a total of 199 images, and has shown a success rate of 99.5% (100% if only the carotid regions for which a ground truth is available are considered). The datasets have a large diversity of images, including cases of arteries with plaque and images with heavy noise, text or other graphical markings inside the artery region.
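
To make the pipeline described in the abstract concrete, the Python sketch below mirrors its main stages under simplifying assumptions: Gaussian smoothing, then dynamic-programming extraction of one dominant horizontal path of intensity minima (lumen-axis candidate) and one dominant path of downward-pointing gradient maxima (far-wall candidate). It is an illustrative sketch only, not the authors' implementation; the cost definitions, the function names and the single-pair simplification (the paper ranks several candidate pairs with a two-feature linear discriminant classifier) are assumptions.

```python
# Illustrative sketch only -- not the authors' implementation.
# Assumes a grayscale B-mode image given as a 2D numpy array.
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def horizontal_dp_path(cost):
    """Dynamic-programming extraction of the minimum-cost left-to-right path:
    one row index per column, with moves limited to +/-1 row between columns."""
    rows, cols = cost.shape
    acc = cost.astype(float)
    back = np.zeros((rows, cols), dtype=int)
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(r - 1, 0), min(r + 1, rows - 1)
            prev = lo + int(np.argmin(acc[lo:hi + 1, c - 1]))
            back[r, c] = prev
            acc[r, c] += acc[prev, c - 1]
    path = np.empty(cols, dtype=int)
    path[-1] = int(np.argmin(acc[:, -1]))
    for c in range(cols - 1, 0, -1):
        path[c - 1] = back[path[c], c]
    return path

def detect_lumen_axis(image, sigma=2.0):
    smoothed = gaussian_filter(image.astype(float), sigma)
    grad_y = sobel(smoothed, axis=0)  # > 0 where intensity increases downwards
    # Candidate lumen axis: dominant horizontal path of low (dark) intensities.
    axis_path = horizontal_dp_path(smoothed)
    # Candidate far wall: dominant path of strong dark-to-bright transitions
    # (gradient-magnitude maxima with the gradient pointing downwards).
    wall_cost = np.where(grad_y > 0, -np.abs(grad_y), 0.0)
    wall_path = horizontal_dp_path(wall_cost)
    # The paper pairs several such candidates and selects the carotid pair with
    # a linear discriminant classifier; this sketch returns a single pair.
    return axis_path, wall_path
```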

2014

DATAFLASKS: epidemic store for massive scale systems

Authors
Maia, F; Matos, M; Vilaca, R; Pereira, J; Oliveira, R; Riviere, E;

Publication
2014 IEEE 33RD INTERNATIONAL SYMPOSIUM ON RELIABLE DISTRIBUTED SYSTEMS (SRDS)

Abstract
Very large scale distributed systems provide some of the most interesting research challenges while at the same time being increasingly demanded by today's applications. The escalation in the number of connected devices and in the data being produced and exchanged demands new data management systems. Although new data stores are continuously being proposed, they are not suitable for very large scale environments. The high levels of churn and constant dynamics found in very large scale systems demand robust, proactive and unstructured approaches to data management. In this paper we propose a novel data store solely based on epidemic (or gossip-based) protocols. It leverages the capacity of these protocols to provide data persistence guarantees even in highly dynamic, massive scale systems. We provide an open-source prototype of the data store and a corresponding evaluation.
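
As a rough illustration of the epidemic principle the abstract refers to, the sketch below implements a minimal push-style anti-entropy loop in which each node periodically sends its key/version/value entries to a few random peers and newer versions win. This is not the DATAFLASKS protocol or its prototype; class, field and parameter names are hypothetical, and membership management, data placement and churn handling are omitted.

```python
# Minimal anti-entropy (gossip) replication sketch -- illustrative only,
# not the DATAFLASKS protocol. All names are hypothetical.
import random

class GossipNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.store = {}          # key -> (version, value)

    def put(self, key, value):
        version, _ = self.store.get(key, (0, None))
        self.store[key] = (version + 1, value)

    def gossip_round(self, peers, fanout=2):
        """Push local entries to a few random peers; newer versions win."""
        for peer in random.sample(peers, min(fanout, len(peers))):
            peer.merge(self.store)

    def merge(self, remote_store):
        for key, (version, value) in remote_store.items():
            if version > self.store.get(key, (0, None))[0]:
                self.store[key] = (version, value)

# Usage: repeated rounds spread each write epidemically through the system.
nodes = [GossipNode(i) for i in range(10)]
nodes[0].put("sensor:42", "17.3")
for _ in range(5):
    for n in nodes:
        n.gossip_round([p for p in nodes if p is not n])
print(sum("sensor:42" in n.store for n in nodes), "of", len(nodes), "nodes hold the key")
```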

2014

Specifying Dynamic Adaptations for Embedded Applications Using a DSL

Authors
Santos, AC; Cardoso, JMP; Diniz, PC; Ferreira, DR; Petrov, Z;

Publication
Embedded Systems Letters

Abstract
Embedded systems are severely resource constrained and thus can benefit from adaptations to enhance their functionality in highly dynamic operating conditions. Adaptations, however, often require additional programming effort or complex architectural solutions, resulting in long design cycles, troublesome maintenance, and impractical use for legacy applications. In this letter, we introduce an adaptation logic for the dynamic reconfiguration of embedded applications and its implementation via a domain-specific language. We illustrate the approach in a real-world case study of a navigation application for avionics. © 2014 IEEE.
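
The abstract does not detail the DSL itself, so the sketch below only illustrates the general idea of rule-driven dynamic adaptation (a monitored condition triggering a reconfiguration action) in plain Python. It is not the paper's DSL or its avionics case study; all names, rules and thresholds are invented for illustration.

```python
# Generic rule-driven adaptation layer -- purely illustrative, not the paper's DSL.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class AdaptationRule:
    condition: Callable[[Dict[str, float]], bool]   # predicate over monitored state
    action: Callable[[], None]                      # reconfiguration to apply

class AdaptationManager:
    def __init__(self):
        self.rules: List[AdaptationRule] = []

    def add_rule(self, rule: AdaptationRule):
        self.rules.append(rule)

    def evaluate(self, state: Dict[str, float]):
        """Fire every rule whose condition holds for the current state."""
        for rule in self.rules:
            if rule.condition(state):
                rule.action()

# Example: degrade a hypothetical navigation filter when CPU load is high.
config = {"filter_mode": "full"}
manager = AdaptationManager()
manager.add_rule(AdaptationRule(
    condition=lambda s: s["cpu_load"] > 0.9,
    action=lambda: config.update(filter_mode="reduced"),
))
manager.evaluate({"cpu_load": 0.95})
print(config)   # {'filter_mode': 'reduced'}
```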

2014

Challenges in Computing Semantic Relatedness for Large Semantic Graphs

Authors
Costa, T; Leal, JP;

Publication
PROCEEDINGS OF THE 18TH INTERNATIONAL DATABASE ENGINEERING AND APPLICATIONS SYMPOSIUM (IDEAS14)

Abstract
The research presented in this paper is part of an ongoing work to define semantic relatedness measures for any given semantic graph. These measures are based on a prior definition of a family of proximity algorithms that compute the semantic relatedness between pairs of concepts, and are parametrized by a semantic graph and a set of weighted properties. The distinctive feature of the proximity algorithms is that they consider all paths connecting two concepts in the semantic graph. These parameters must be tuned in order to maximize the quality of the semantic measure against a benchmark data set. The process of tuning the weight assignment was developed in previous work and relies on a genetic algorithm. The weight tuning process, using all the properties in the semantic graph, was validated using WordNet 2.0 and the WordSim-353 data set. The quality of the obtained semantic measure is better than those reported in the literature. However, this approach did not produce equally good results in larger semantic graphs such as WordNet 3.0, DBPedia and Freebase, in part due to the size of these graphs. The current approach is to select a sub-graph of the original semantic graph, small enough to enable processing and large enough to include all the relevant paths. This paper provides an overview of the ongoing work and presents a strategy to overcome the challenges raised by large semantic graphs.
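
The following sketch illustrates, under assumptions, the kind of proximity measure the abstract describes: it aggregates the contributions of all simple paths (up to a bounded length) between two concepts in a small weighted property graph. The toy graph, the per-property weights and the length-decay scheme are assumptions, not the authors' algorithm; in their work the weights are tuned with a genetic algorithm against a benchmark such as WordSim-353, whereas here they are fixed by hand.

```python
# Illustrative path-based proximity sketch -- not the authors' algorithm.
def proximity(graph, weights, source, target, max_len=4):
    """Sum the edge-weight products of all simple paths (up to max_len edges)
    between two concepts; longer paths contribute less via a 1/length factor.
    `graph` maps node -> list of (neighbour, property) edges."""
    total = 0.0

    def walk(node, visited, product, depth):
        nonlocal total
        if node == target and depth > 0:
            total += product / depth
            return
        if depth == max_len:
            return
        for neighbour, prop in graph.get(node, []):
            if neighbour not in visited:
                walk(neighbour, visited | {neighbour},
                     product * weights.get(prop, 0.0), depth + 1)

    walk(source, {source}, 1.0, 0)
    return total

# Tiny hypothetical graph with WordNet-like relations.
graph = {
    "car":     [("vehicle", "hypernym"), ("wheel", "meronym")],
    "vehicle": [("truck", "hyponym")],
    "wheel":   [("truck", "part_of")],
}
weights = {"hypernym": 0.8, "hyponym": 0.8, "meronym": 0.5, "part_of": 0.5}
print(proximity(graph, weights, "car", "truck"))   # combines both connecting paths
```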

2014

Risk assessment of interruption times affecting domestic and non-domestic electricity customers

Authors
Ilie I.; Hernando-Gil I.; Djokic S.;

Publication
International Journal of Electrical Power and Energy Systems

Abstract
Legislation defined to protect domestic and non-domestic customers from long interruption durations imposes additional requirements on the reliability-related performance of the system, which distribution network operators (DNOs) must consider when planning the operation and maintenance of power supply systems. DNOs are required to restore the supply to interrupted customers that fall into the "unprotected" customer class within a given period of time, otherwise penalties are applied. In order to meet these requirements, comprehensive strategies must be defined based on upfront analyses. Accordingly, this paper proposes a deterministic algorithm for estimating DNOs' risk of experiencing interruptions with durations above the imposed targets. Besides the Regulator-defined legislation, security of supply requirements are incorporated into the development of the proposed methodology. Failure analysis of network components is used to identify interrupted customers, which are grouped into power demand classes so that the duration of interruptions can be addressed following the security of supply requirements. Moreover, the penalty times defined by the Energy Regulator are used in the analysis as thresholds to quantify the penalty risk that DNOs are exposed to. The proposed methodology is applied to a typical UK distribution system, whose average reliability performance is also considered in the analysis. © 2013 Elsevier Ltd. All rights reserved.
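
As a simplified illustration of the threshold-based penalty assessment the abstract outlines, the sketch below tallies, per customer class, the interruptions whose duration exceeds an assumed regulator restoration target. It is not the paper's deterministic algorithm; the customer classes, target values and outage records are hypothetical.

```python
# Illustrative threshold-based penalty-risk tally -- not the paper's algorithm.
from collections import defaultdict

RESTORATION_TARGETS_H = {"domestic": 12.0, "non_domestic": 6.0}  # assumed regulator limits (hours)

def penalty_exposure(interruptions):
    """Count, per customer class, the interruptions whose duration exceeds the
    restoration target, and the total customers liable to trigger a penalty.
    `interruptions` is an iterable of (customer_class, duration_h, customers)."""
    exposure = defaultdict(lambda: {"events": 0, "customers": 0})
    for cls, duration_h, customers in interruptions:
        if duration_h > RESTORATION_TARGETS_H[cls]:
            exposure[cls]["events"] += 1
            exposure[cls]["customers"] += customers
    return dict(exposure)

# Example outage records: (class, duration in hours, customers interrupted).
records = [
    ("domestic", 3.5, 120),
    ("domestic", 14.0, 80),      # exceeds the 12 h target -> penalty risk
    ("non_domestic", 7.2, 15),   # exceeds the 6 h target  -> penalty risk
]
print(penalty_exposure(records))
```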
