2016
Authors
Cruz, MRM; Fitiwi, DZ; Santos, SF; Catalao, JPS;
Publication
2016 13TH INTERNATIONAL CONFERENCE ON THE EUROPEAN ENERGY MARKET (EEM)
Abstract
Nowadays, there is a global consensus that integrating renewable energy sources (RES) is highly needed to meet an increasing demand for electricity and reduce the overall carbon footprint of power production. Framed in this context, the coordination of RES integration with distributed energy storage systems (DESS), along with the network's switching capability and/or network reinforcement, is expected to significantly improve system flexibility, thereby increasing the chances of accommodating large-scale RES power. This paper presents an innovative method to quantify the impacts of network switching and/or reinforcement, as well as of installing DESSs, on the level of renewable power integrated in the system. To carry out this analysis, a dynamic and multi-objective stochastic mixed integer linear programming (S-MILP) model is developed, which jointly takes into account the optimal integration of RES-based distributed generators (DGs) and DESSs in coordination with distribution network reinforcement and/or switching. A standard distribution network system is used as a case study. Numerical results show the capability of DESS integration to dramatically increase the level of renewable DGs integrated in the system. Although case-dependent, the impact of network switching on RES power integration is not significant.
2016
Authors
Faustino Rocha, AI; Gama, A; Oliveira, PA; Alvarado, A; Fidalgo Goncalves, L; Ferreira, R; Ginja, M;
Publication
IN VIVO
Abstract
Background/Aim: In this study, we evaluated the dimensions and volume of rat mammary tumors and the association of these variables with tumor invasiveness. Materials and Methods: Tumors were measured by caliper and ultrasonography. Volume was determined by water displacement and by application of four formulas using tumor length (L), width (W) and depth (D) or tumor weight. Results: The results confirmed the data obtained in our previous work, where we verified that mammary tumors grow as oblate spheroids. Conclusion: Determination of mammary tumor volume by applying the formula V = (4/3) × π × (L/2) × (L/2) × (D/2) is the best way to evaluate tumor volume in vivo. Besides volume evaluation by water displacement, determination on the basis of tumor weight is the most accurate way to evaluate tumor volume after animal sacrifice or tumor excision. According to our results, it is not possible to predict whether a tumor is invasive or non-invasive from its dimensions, volume or weight. Future work in chemically-induced mammary cancer should use ultrasonography and water displacement or tumor weight to determine tumor volume in vivo and after animal sacrifice or tumor excision, respectively.
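A minimal sketch of the oblate-spheroid volume formula quoted in the abstract; the function name and the sample measurements are illustrative, not taken from the paper.

```python
import math

def oblate_spheroid_volume(length_mm: float, depth_mm: float) -> float:
    """Tumor volume from caliper measurements, per the abstract's
    formula V = (4/3) * pi * (L/2) * (L/2) * (D/2)."""
    return (4.0 / 3.0) * math.pi * (length_mm / 2) * (length_mm / 2) * (depth_mm / 2)

# Hypothetical tumor: 20 mm long, 10 mm deep
volume = oblate_spheroid_volume(20.0, 10.0)
print(f"{volume:.1f} mm^3")  # ~2094.4 mm^3
```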
2016
Authors
Castro, H; Monteiro, J; Pereira, A; Silva, D; Coelho, G; Carvalho, P;
Publication
MULTIMEDIA TOOLS AND APPLICATIONS
Abstract
Over the last decade, noticeable progress has occurred in automated computer interpretation of visual information. Computers running artificial intelligence algorithms are increasingly capable of extracting perceptual and semantic information from images, and of registering it as metadata. There is also a growing body of manually produced image annotation data. All of this data is of great importance for scientific purposes as well as for commercial applications. Optimizing the usefulness of this information, whether manually or automatically produced, implies its precise and adequate expression at its different logical levels, making it easily accessible, manipulable and shareable. It also implies the development of associated manipulation tools. However, the expression and manipulation of computer vision results has received less attention than the actual extraction of such results, and has therefore advanced less. Existing metadata tools are poorly structured, in logical terms, as they intermix the declaration of visual detections with that of the observed entities, events and surrounding context. This poor structuring renders such tools rigid, limited and cumbersome to use. Moreover, they are unprepared to deal with more advanced situations, such as the coherent expression of the information extracted from, or annotated onto, multi-view video resources. The work presented here comprises the specification of an advanced XML-based syntax for the expression and processing of computer-vision-relevant metadata. This proposal takes inspiration from the natural cognition process for the adequate expression of the information, with a particular focus on scenarios with varying numbers of sensory devices, notably multi-view video.
2016
Authors
Morales, GDF; Bifet, A; Khan, L; Gama, J; Fan, W;
Publication
Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, August 13-17, 2016
Abstract
The challenge of deriving insights from the Internet of Things (IoT) has been recognized as one of the most exciting and key opportunities for both academia and industry. Advanced analysis of big data streams from sensors and devices is bound to become a key area of data mining research as the number of applications requiring such processing increases. Dealing with the evolution over time of such data streams, i.e., with concepts that drift or change completely, is one of the core issues in IoT stream mining. This tutorial is a gentle introduction to mining IoT big data streams. The first part introduces data stream learners for classification, regression, clustering, and frequent pattern mining. The second part deals with scalability issues inherent in IoT applications, and discusses how to mine data streams on distributed engines such as Spark, Flink, Storm, and Samza.
2016
Authors
Almeida, JB; Barbosa, M; Barthe, G; Dupressoir, F;
Publication
FAST SOFTWARE ENCRYPTION (FSE 2016)
Abstract
We provide further evidence that implementing software countermeasures against timing attacks is a non-trivial task and requires domain-specific software development processes: we report an implementation bug in the s2n library, recently released by AWS Labs. This bug (now fixed) allowed bypassing the balancing countermeasures against timing attacks deployed in the implementation of the MAC-then-Encode-then-CBC-Encrypt (MEE-CBC) component, creating a timing side-channel similar to that exploited by Lucky 13. Although such an attack could only be launched when the MEE-CBC component is used in isolation (Albrecht and Paterson recently confirmed in independent work that s2n's second line of defence, once reinforced, provides adequate mitigation against current adversary capabilities), its existence serves as further evidence that conventional software validation processes are not effective in the study and validation of security properties. To solve this problem, we define a methodology for proving security of implementations in the presence of timing attackers: first, prove black-box security of an algorithmic description of a cryptographic construction; then, establish functional correctness of an implementation with respect to the algorithmic description; and finally, prove that the implementation is leakage secure. We present a proof-of-concept application of our methodology to MEE-CBC, bringing together three different formal verification tools to produce an assembly implementation of this construction that is verifiably secure against adversaries with access to some timing leakage. Our methodology subsumes previous work connecting provable security and side-channel analysis at the implementation level, and supports the verification of a much larger case study. Our case study itself provides the first provable security validation of complex timing countermeasures deployed, for example, in OpenSSL.
2016
Authors
Ribeiro, J; Carmona, J;
Publication
TRANSACTIONS ON PETRI NETS AND OTHER MODELS OF CONCURRENCY XI
Abstract
Given a log L, a control-flow discovery algorithm f, and a quality metric m, this paper addresses the following problem: which parameters of f most influence its application in terms of m when applied to L? The paper proposes a method to tackle this problem based on sensitivity analysis, a theory that has been successfully applied in other areas. Clearly, a satisfactory solution to this problem will be crucial to bridge the gap between process discovery algorithms and final users. Additionally, recommendation techniques and meta-techniques, such as determining the representational bias of an algorithm, may benefit from solutions to the problem considered in this paper. The method has been evaluated over a set of logs and two different miners, the inductive miner and the flexible heuristic miner, and the experimental results demonstrate the applicability of the general framework described in this paper.