Publications

2016

A multi-objective model for the day-ahead energy resource scheduling of a smart grid with high penetration of sensitive loads

Authors
Soares, J; Fotouhi Ghazvini, AF; Vale, Z; de Moura Oliveira, PBD;

Publication
APPLIED ENERGY

Abstract
In this paper, a multi-objective framework is proposed for the daily operation of a Smart Grid (SG) with high penetration of sensitive loads. The Virtual Power Player (VPP) manages the day-ahead energy resource scheduling in the smart grid, considering the intensive use of Distributed Generation (DG) and Vehicle-To-Grid (V2G), while maintaining highly reliable power for the sensitive loads. This work considers high penetration of sensitive loads, i.e. loads such as some industrial processes that require high power quality, high reliability and few interruptions. The weighted-sum approach is used together with distributed and parallel computing techniques to efficiently solve the multi-objective problem. A two-stage optimization method is proposed using Particle Swarm Optimization (PSO) and a deterministic technique based on Mixed-Integer Linear Programming (MILP). A realistic mathematical formulation considering the electric network constraints for the day-ahead scheduling model is described. A Pareto front algorithm is applied to determine the set of non-dominated solutions. In addition to cost minimization, the maximization of the minimum available reserve is incorporated in the mathematical formulation to take into account the reliability requirements of sensitive and vulnerable loads. A case study with a 180-bus distribution network and a fleet of 1000 gridable Electric Vehicles (EVs) is used to illustrate the performance of the proposed method. The execution time of this large-scale problem is reduced by using a parallel and distributed computing platform.
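As an illustration only, the minimal Python sketch below reproduces the weighted-sum scan and Pareto filtering described above on a toy set of candidate schedules; the names and data are hypothetical, and the paper's actual two-stage solver couples PSO with a network-constrained MILP.

```python
from typing import List, Tuple

Objectives = Tuple[float, float]  # (operation cost, negated minimum reserve)

def dominates(a: Objectives, b: Objectives) -> bool:
    """a dominates b when it is no worse in both objectives and differs."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(solutions: List[Objectives]) -> List[Objectives]:
    """Keep only the non-dominated solutions."""
    return [s for s in solutions if not any(dominates(o, s) for o in solutions)]

def weighted_sum_scan(solutions: List[Objectives], n_weights: int = 11) -> List[Objectives]:
    """Sweep the weight on cost vs. reserve; each weight picks one compromise."""
    picks = set()
    for i in range(n_weights):
        w = i / (n_weights - 1)
        picks.add(min(solutions, key=lambda s: w * s[0] + (1 - w) * s[1]))
    return sorted(picks)

if __name__ == "__main__":
    # Toy candidate schedules; reserve is negated so both objectives minimize.
    candidates = [(100.0, -5.0), (120.0, -9.0), (95.0, -2.0), (130.0, -9.5)]
    print(pareto_front(candidates))
    print(weighted_sum_scan(candidates))
```

Negating the reserve turns both objectives into minimizations, so a single dominance test covers the cost/reserve trade-off.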

2016

A review of automatic malaria parasites detection and segmentation in microscopic images

Authors
Rosado, L; Correia da Costa, JM; Elias, D; Cardoso, JS;

Publication
Anti-Infective Agents

Abstract
Background: Malaria is a leading cause of death and disease in many developing countries, where young children and pregnant women are the most affected groups. In 2012, there were an estimated 207 million cases of malaria, which caused approximately 627 000 malaria deaths. Around 80% of malaria cases occur in Africa, where the lack of access to malaria diagnosis is largely due to a shortage of expertise, with the shortage of equipment being a secondary factor. This lack of expertise for malaria diagnosis frequently results in an increase in false positives, since prescription of medication is based only on symptoms. Thus, there is an urgent need for new tools that can facilitate the rapid and easy diagnosis of malaria, especially in areas with limited access to quality healthcare services. Methods: Various image processing and analysis approaches already proposed in the literature for the detection and segmentation of malaria parasites in blood smear microscopic images were collected and reviewed. This timely review aims to support the increasing interest in the development of low-cost tools that can facilitate the rapid and easy diagnosis of malaria, especially in areas with limited access to quality healthcare services. Results: Malaria parasite detection and segmentation techniques in microscopic images are, in general, still in need of improvement and further testing. Most of the methodologies reviewed in this work were tested with a limited number of images, and more studies with significantly larger datasets are needed to evaluate the proposed approaches. Despite promising results reported during the past years, the great majority of the computer-aided methods found in the literature for malaria diagnosis are based on images acquired under well-controlled conditions and with proper microscopic equipment. However, one should take into account that 80% of malaria cases occur in Africa, where this type of equipment is scarce or even nonexistent in common healthcare facilities. Conclusion: This work collects and reviews various image processing and analysis approaches already proposed in the literature for the detection and segmentation of malaria parasites in blood smear microscopic images. This timely review aims to support the increasing interest in the development of image processing-based systems to be used in rural areas of developing countries, which might be the next trend in malaria computer-aided diagnosis. © 2016 Bentham Science Publishers.
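As a hedged pointer to the class of techniques this review surveys, the sketch below implements one common low-cost baseline: Otsu thresholding on the saturation channel of a stained smear image, followed by a morphological clean-up. The file name and kernel size are hypothetical, and practical pipelines add stain normalization and per-candidate classification on top of this step.

```python
# Generic segmentation baseline of the kind surveyed in malaria CAD reviews.
import cv2
import numpy as np

def segment_candidates(path: str) -> np.ndarray:
    """Return a binary mask of stained (candidate parasite) regions."""
    bgr = cv2.imread(path)
    if bgr is None:
        raise FileNotFoundError(path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    saturation = hsv[:, :, 1]  # Giemsa stain shows up as high saturation
    _, mask = cv2.threshold(saturation, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Remove small specks with a morphological opening (kernel size assumed).
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

if __name__ == "__main__":
    mask = segment_candidates("smear.png")  # hypothetical input image
    cv2.imwrite("candidates.png", mask)
```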

2016

Modeling volatility in Heart Rate Variability

Authors
Leite, A; Silva, ME; Rocha, AP;

Publication
2016 38TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC)

Abstract
Modeling Heart Rate Variability (HRV) data has become important for clinical applications and as a research tool. These data exhibit long memory and time-varying conditional variance (volatility). In HRV, volatility is traditionally estimated by recursive least squares combined with short-memory AutoRegressive (AR) models. This work considers a parametric approach based on long-memory Fractionally Integrated AutoRegressive Moving Average (ARFIMA) models with heteroscedastic errors. To model the heteroscedasticity, nonlinear Generalized Autoregressive Conditionally Heteroscedastic (GARCH) and Exponential Generalized Autoregressive Conditionally Heteroscedastic (EGARCH) models are considered. The latter are necessary to model empirical characteristics of conditional volatility such as clustering and asymmetry in the response, usually called leverage in the time series literature. The ARFIMA-EGARCH models are used to capture and remove long memory and to characterize conditional volatility in 24-hour HRV recordings from the Noltisalis database.
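As a rough sketch of the pipeline the abstract outlines, the fragment below fractionally differences a toy RR-interval series to strip long memory and then fits an AR-EGARCH(1,1) model with the `arch` Python package (assumed installed); the fractional order d and the synthetic data are placeholders, not estimates from the Noltisalis recordings.

```python
import numpy as np
from arch import arch_model

def frac_diff(x: np.ndarray, d: float, n_weights: int = 100) -> np.ndarray:
    """Apply the (1 - B)^d filter via its truncated binomial expansion."""
    w = [1.0]
    for k in range(1, n_weights):
        w.append(-w[-1] * (d - k + 1) / k)
    return np.convolve(x, np.asarray(w))[: len(x)]

rng = np.random.default_rng(0)
rr = np.cumsum(rng.normal(0, 1, 2000)) * 0.01 + 800  # toy RR intervals (ms)

y = frac_diff(rr - rr.mean(), d=0.4)  # remove long memory (d assumed)
# EGARCH with an asymmetry (o) term captures clustering and leverage.
model = arch_model(y, mean="AR", lags=1, vol="EGARCH", p=1, o=1, q=1)
result = model.fit(disp="off")
print(result.summary())
```

The `o=1` asymmetry term is what lets the EGARCH part capture the leverage effect mentioned above.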

2016

Combining coastal geoscience mapping and photogrammetric surveying in maritime environments (Northwestern Iberian Peninsula): focus on methodology

Authors
Pires, A; Chamine, HI; Piqueiro, F; Perez Alberti, A; Rocha, F;

Publication
ENVIRONMENTAL EARTH SCIENCES

Abstract
Digital photogrammetry and GIS-based mapping are increasingly recognised as powerful tools for littoral issues. This paper considers the interoperability framework for high-resolution imagery acquisition and the development of coastal geoscience maps. The layered system architecture of the cartographic methodology is also explained. Moreover, it highlights a new approach to assessing heterogeneous geologic, geomorphological and maritime environments. The main goal of the present study was to test a new concept for photogrammetric images in order to assist modelling techniques, spatial analysis and coastal conceptual models. It proposes a methodological approach to coastal zone monitoring and to the evaluation of maritime forcing conditions. This approach will allow: (1) the acquisition of a large archive of high-resolution imagery; (2) the development of a coastal database including the entire data field and in situ assessments; (3) the study of coastal dynamics and shoreline evolution; (4) the assessment of rock platforms and hydraulic structures; and (5) the production of coastal geoscience maps. An integrated coastal geoscience and engineering methodology was outlined in the NW of the Iberian Peninsula (South Galicia and North/Central Portugal regions). This paper reports on the increased knowledge of the studied regions, providing essential data concerning coastal geo-morphodynamics. The overall assessment revealed additional evidence of erosion issues, which contributes to a better understanding of the hydraulic conditions. The main results are presented in regional coastal geoscience maps and local-scale outputs that could help government, local authorities and stakeholders to develop coastal management plans and to recommend strategies.

2016

An Agent-based Model of the Earth System & Climate Change

Authors
Baghoussi, Y; Campos, PJRM; Rossetti, RJF;

Publication
IEEE SECOND INTERNATIONAL SMART CITIES CONFERENCE (ISC2 2016)

Abstract
Simulation is a computer-based experimentation tool suitable for determining the efficacy of a previously untried decision. In this paper, we present a model of climate change. The goal behind this project is to provide a test-bed for evaluating theories related to the Earth system, so as to test and evaluate metrics such as greenhouse gas levels and climate change in general. The proposed approach is based on a multi-agent model which takes as input a representation of nature and produces as output the changes that will occur on Earth at a given instant of time. Most views about climate change do not take into account the real severity of the subject matter; the present perspective, however, is given in a way that makes non-experts aware of the risks that are threatening life on Earth. Only recently has the general population developed considerable sensitivity to these issues. One important contribution of this work is the use of agent-based modeling and simulation as an instructional tool that allows people to easily understand all aspects involved in preserving the environment in a more aware and responsible way.
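A minimal sketch of this style of agent-based model is given below, assuming emitter and sink agents that update a shared CO2 pool and a toy logarithmic temperature response; every coefficient is an illustrative placeholder, not a calibrated Earth-system value.

```python
import math
import random

class Emitter:
    """Adds CO2 to the shared pool each tick (e.g., an industrial region)."""
    def __init__(self, rate):
        self.rate = rate  # GtC-equivalent per tick (illustrative units)

    def step(self, world):
        world.co2 += self.rate

class Sink:
    """Absorbs CO2 up to a capacity (e.g., a forest or ocean patch)."""
    def __init__(self, capacity):
        self.capacity = capacity

    def step(self, world):
        world.co2 -= min(self.capacity, world.co2 * 0.01)  # uptake scales with CO2

class World:
    def __init__(self, agents, co2=850.0):  # illustrative starting pool
        self.agents, self.co2 = agents, co2

    def tick(self):
        random.shuffle(self.agents)  # asynchronous agent activation
        for agent in self.agents:
            agent.step(self)

    def temperature_anomaly(self, baseline=850.0, sensitivity=3.0):
        # Toy logarithmic response: +3 degrees per doubling of the pool.
        return sensitivity * math.log2(self.co2 / baseline)

agents = [Emitter(3.0) for _ in range(5)] + [Sink(4.0) for _ in range(3)]
world = World(agents)
for year in range(50):
    world.tick()
print(f"CO2 pool: {world.co2:.0f}, warming: {world.temperature_anomaly():+.2f} °C")
```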

2016

Foundations of Hardware-Based Attested Computation and Application to SGX

Authors
Barbosa, M; Portela, B; Scerri, G; Warinschi, B;

Publication
1ST IEEE EUROPEAN SYMPOSIUM ON SECURITY AND PRIVACY

Abstract
Exciting new capabilities of modern trusted hardware technologies allow for the execution of arbitrary code within environments completely isolated from the rest of the system and provide cryptographic mechanisms for securely reporting on these executions to remote parties. Rigorously proving security of protocols that rely on this type of hardware faces two obstacles. The first is to develop models appropriate for the induced trust assumptions (e.g., what is the correct notion of a party when the peer one wishes to communicate with is a specific instance of an outsourced program). The second is to develop scalable analysis methods, as the inherently stateful nature of the platforms precludes the application of existing modular analysis techniques that require high degrees of independence between the components. We give the first steps in this direction by studying three cryptographic tools which have been commonly associated with this new generation of trusted hardware solutions. Specifically, we provide formal security definitions, generic constructions and security analysis for attested computation, key exchange for attestation and secure outsourced computation. Our approach is incremental: each of the concepts relies on the previous ones according to an approach that is quasi-modular. For example, we show how to build a secure outsourced computation scheme from an arbitrary attestation protocol combined with a key exchange and an encryption scheme.
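To make the quasi-modular composition concrete, the toy sketch below binds a key-exchange message to a program identity via a stand-in attestation report, verifies it, and then tunnels an encrypted input. HMAC replaces the platform's attestation signature and the Diffie-Hellman group is deliberately tiny, so this illustrates only the layering, not the paper's construction.

```python
import hashlib
import hmac
import os
import secrets

ATTESTATION_KEY = os.urandom(32)  # stand-in for the platform's signing key
P, G = 2**127 - 1, 3              # toy Diffie-Hellman group (insecure)

def attest(program_hash: bytes, ke_public: bytes) -> bytes:
    """Report binds the program's identity to its key-exchange message."""
    return hmac.new(ATTESTATION_KEY, program_hash + ke_public, hashlib.sha256).digest()

def verify(program_hash: bytes, ke_public: bytes, report: bytes) -> bool:
    return hmac.compare_digest(attest(program_hash, ke_public), report)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Enclave side: run the key exchange and have the platform attest to it.
program_hash = hashlib.sha256(b"outsourced_program_v1").digest()
enclave_secret = secrets.randbelow(P - 2) + 2
enclave_public = pow(G, enclave_secret, P)
report = attest(program_hash, enclave_public.to_bytes(16, "big"))

# Client side: accept the channel only if the attestation report verifies.
assert verify(program_hash, enclave_public.to_bytes(16, "big"), report)
client_secret = secrets.randbelow(P - 2) + 2
client_public = pow(G, client_secret, P)
key = hashlib.sha256(pow(enclave_public, client_secret, P).to_bytes(16, "big")).digest()
ciphertext = keystream_xor(key, b"secret input 42")

# Enclave side: derive the same key and recover the outsourced input.
enclave_key = hashlib.sha256(pow(client_public, enclave_secret, P).to_bytes(16, "big")).digest()
print(keystream_xor(enclave_key, ciphertext))  # -> b'secret input 42'
```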
