2015
Authors
Santos, DF; Guerreiro, A; Baptista, JM;
Publication
24TH INTERNATIONAL CONFERENCE ON OPTICAL FIBRE SENSORS
Abstract
This paper presents a performance analysis of a new sensing geometry for refractive index measurement, based on surface plasmon resonance (SPR) in a D-type photonic crystal fiber (PCF) with a thin gold layer, using the finite element method (FEM). The configuration is analyzed in terms of loss, and the results are compared with a conventional SPR D-type fiber sensor and with a PCF D-type optical fiber sensor for refractive index measurement. The simulation results show an improvement in sensitivity and resolution (3.70×10³ nm/RIU and 2.72×10⁻⁵ RIU, respectively, assuming a detectable spectral variation of 0.1 nm).
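The reported resolution follows from the standard relation between spectral sensitivity and the smallest detectable wavelength shift. A minimal sketch of that check, using only the figures quoted in the abstract (this is not code from the paper):

```python
# Standard SPR figure-of-merit relation:
#   resolution (RIU) = smallest detectable wavelength shift / spectral sensitivity
# Values below are the ones quoted in the abstract.

def ri_resolution(sensitivity_nm_per_riu: float, min_shift_nm: float) -> float:
    """Smallest detectable refractive-index change, in RIU."""
    return min_shift_nm / sensitivity_nm_per_riu

res = ri_resolution(3.70e3, 0.1)
print(f"{res:.2e} RIU")  # ≈ 2.70e-05 RIU, consistent with the reported 2.72e-05
```

The small discrepancy with the quoted 2.72×10⁻⁵ RIU is presumably rounding of the sensitivity figure.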
2015
Authors
Mendes Moreira, J; Moreira Matias, L; Gama, J; de Sousa, JF;
Publication
INFORMATION SCIENCES
Abstract
Nowadays, every public transportation company uses Automatic Vehicle Location (AVL) systems to track the services provided by each vehicle. Such information can be used to improve operational planning. This paper describes an AVL-based evaluation framework to test whether the current Schedule Plan fits the network's operational conditions in terms of the days covered by each schedule. Firstly, clustering is employed to group days with similar travel-time profiles (this is done separately for each route). Secondly, consensus clustering is used to obtain a single set of clusters across all routes. Finally, a set of rules about the groups' content is drawn based on appropriate decision variables. Each group corresponds to a different schedule, and the rules identify the days covered by each schedule. This methodology is simultaneously an evaluator of the schedules offered by the company (regarding their day coverage) and an advisor on possible changes to that offer. It was tested using data collected over one year in a company operating in Porto, Portugal. The results are sound. The main contribution of this paper is a way to combine Machine Learning techniques to add a novel dimension, day coverage, to Schedule Plan evaluation methods. This approach has no parallel in the current literature.
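The two clustering stages described above can be sketched with a co-association (evidence-accumulation) approach, one common way of doing consensus clustering: per-route day groupings vote on whether each pair of days belongs together. This is a hypothetical illustration; the paper's exact consensus algorithm may differ.

```python
# Consensus clustering sketch: combine per-route clusterings of days into one
# grouping via a co-association matrix (evidence accumulation).
from itertools import combinations

def co_association(clusterings):
    """Fraction of clusterings in which each pair of days shares a cluster."""
    n = len(clusterings[0])
    return {(a, b): sum(c[a] == c[b] for c in clusterings) / len(clusterings)
            for a, b in combinations(range(n), 2)}

def consensus_groups(clusterings, threshold=0.5):
    """Merge days whose co-association exceeds the threshold (single linkage)."""
    m = co_association(clusterings)
    parent = list(range(len(clusterings[0])))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for (a, b), w in m.items():
        if w > threshold:
            parent[find(a)] = find(b)
    groups = {}
    for d in range(len(parent)):
        groups.setdefault(find(d), []).append(d)
    return sorted(groups.values())

# Toy example: cluster labels for 5 days on 3 routes (e.g. weekday vs weekend
# travel-time profiles). Days 0-2 mostly group together, as do days 3-4.
routes = [[0, 0, 0, 1, 1], [0, 0, 0, 1, 1], [0, 0, 1, 1, 1]]
print(consensus_groups(routes))  # → [[0, 1, 2], [3, 4]]
```

Each resulting consensus group would then correspond to one candidate schedule, with the covered days read off from the group's members.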
2015
Authors
Ortega, A; Pedrosa, J; Heyde, B; Tong, L; D'Hooge, J;
Publication
2015 IEEE International Ultrasonics Symposium, IUS 2015
Abstract
Fast volumetric cardiac imaging requires reducing the number of transmit events within a single volume. One way of achieving this is to limit the field-of-view (FOV) of the recording to the anatomically relevant domain only (e.g. the myocardium when investigating cardiac mechanics). Although fully automatic solutions for myocardial segmentation exist, translating that information into a fast ultrasound scan sequence is not trivial. The aim of this study was therefore to develop a methodology to automatically define the FOV from a volumetric dataset in the context of anatomical scanning. Hereto, a method is proposed in which the anatomically relevant space is identified automatically as follows. First, the left ventricular myocardium is localized in the volumetric ultrasound recording using a fully automatic real-time segmentation framework (i.e. BEAS). Then, the extracted meshes are used to define a binary mask identifying myocardial voxels only. Next, using these binary images, the percentage of pixels along a given image line that belong to the myocardium is calculated. Finally, a spatially continuous FOV that covers a percentage 'T' of the myocardium is found by means of ring-shaped template matching, yielding the opening angle and 'thickness' for a conical scan. This approach was tested on 27 volumetric ultrasound datasets with T = 85%. The mean initial opening angle for a conical scan was 19.67±8.53°, while the mean 'thickness' of the cone was 19.01±3.35°. A reduction of 48.99% in the number of transmit events was thereby achieved, resulting in a frame rate gain factor of 1.96. In conclusion, anatomical scanning in combination with new scan sequence techniques can increase frame rate significantly while preserving information on the structures relevant for functional imaging. © 2015 IEEE.
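The reported frame rate gain follows directly from the transmit-event reduction: if acquisition time scales with the number of transmit events, cutting them by a fraction r speeds up imaging by 1/(1−r). A quick check against the abstract's numbers:

```python
# Frame rate gain implied by a fractional reduction r in transmit events,
# assuming acquisition time is proportional to the number of transmits.
reduction = 0.4899  # 48.99%, as reported in the abstract
gain = 1.0 / (1.0 - reduction)
print(f"frame rate gain ≈ {gain:.2f}x")  # ≈ 1.96x, matching the abstract
```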
2015
Authors
Schwartz, MP; Hou, ZG; Propson, NE; Zhang, J; Engstrom, CJ; Costa, VS; Jiang, P; Nguyen, BK; Bolin, JM; Daly, W; Wang, Y; Stewart, R; Page, CD; Murphy, WL; Thomson, JA;
Publication
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
Abstract
Human pluripotent stem cell-based in vitro models that reflect human physiology have the potential to reduce the number of drug failures in clinical trials and offer a cost-effective approach for assessing chemical safety. Here, human embryonic stem (ES) cell-derived neural progenitor cells, endothelial cells, mesenchymal stem cells, and microglia/macrophage precursors were combined on chemically defined polyethylene glycol hydrogels and cultured in serum-free medium to model cellular interactions within the developing brain. The precursors self-assembled into 3D neural constructs with diverse neuronal and glial populations, interconnected vascular networks, and ramified microglia. Replicate constructs were reproducible by RNA sequencing (RNA-Seq) and expressed neurogenesis, vasculature development, and microglia genes. Linear support vector machines were used to construct a predictive model from RNA-Seq data for 240 neural constructs treated with 34 toxic and 26 nontoxic chemicals. The predictive model was evaluated using two standard hold-out testing methods: a nearly unbiased leave-one-out cross-validation for the 60 training compounds and an unbiased blinded trial using a single hold-out set of 10 additional chemicals. The linear support vector machine produced an estimated accuracy for future data of 0.91 in the cross-validation experiment and correctly classified 9 of 10 chemicals in the blinded trial.
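The leave-one-out protocol described above can be sketched as follows. For illustration only, a nearest-centroid linear classifier stands in for the paper's linear SVM, and tiny synthetic two-feature profiles stand in for the RNA-Seq data; everything here is a hypothetical stand-in, not the authors' pipeline.

```python
# Leave-one-out cross-validation sketch: each sample is held out in turn,
# the classifier is refit on the rest, and accuracy is tallied.

def nearest_centroid_fit(X, y):
    """Per-class feature centroids (a simple linear classifier)."""
    cents = {}
    for label in set(y):
        pts = [x for x, l in zip(X, y) if l == label]
        cents[label] = [sum(col) / len(pts) for col in zip(*pts)]
    return cents

def predict(cents, x):
    d2 = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda label: d2(cents[label], x))

def loocv_accuracy(X, y):
    hits = 0
    for i in range(len(X)):
        Xtr, ytr = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        hits += predict(nearest_centroid_fit(Xtr, ytr), X[i]) == y[i]
    return hits / len(X)

# Toy "toxic" (1) vs "nontoxic" (0) expression profiles:
X = [[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.9], [0.15, 0.15], [0.95, 0.95]]
y = [0, 0, 1, 1, 0, 1]
print(loocv_accuracy(X, y))  # → 1.0 on this well-separated toy set
```

The blinded-trial evaluation in the abstract is the same idea with a single fixed hold-out set instead of rotating single-sample folds.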
2015
Authors
Lopes, MA; Almeida, AS; Almada Lobo, B;
Publication
HUMAN RESOURCES FOR HEALTH
Abstract
Background: Planning the health-care workforce required to meet the health needs of the population, while providing service levels that maximize outcomes and minimize financial costs, is a complex task. The problem can be described as assessing the right number of people with the right skills in the right place at the right time, to provide the right services to the right people. The literature available on the subject is vast but sparse, with no consensus established on a definite methodology and technique, making it difficult for the analyst or policy maker to adopt recent developments, or for the academic researcher to improve such a critical field. Methods: We revisited more than 60 years of documented research to better understand the chronological and historical evolution of the area and the methodologies that have stood the test of time. The literature review was conducted in electronic publication databases and focuses on conceptual methodologies rather than techniques. Results: Four different and widely used approaches were found within the scope of supply, and three within demand. We elaborated a map systematizing advantages, limitations, and assumptions, and we provide a list of the data requirements necessary to implement each of the methodologies. We also identified past and current trends in the field and elaborated a proposal on how to integrate the different methodologies. Conclusion: Methodologies abound, but there is still no definite approach to health human resources (HHR) planning. Recent literature suggests that an integrated approach is the way to solve such a complex problem, as it combines elements of both supply and demand, and more effort should be put into improving that proposal.
2015
Authors
Sanchez, A; Oliveira, N; Barbosa, LS; Henriques, P;
Publication
SCIENCE OF COMPUTER PROGRAMMING
Abstract
Continuous evolution towards very large, heterogeneous, highly dynamic computing systems entails the need for sound and flexible approaches to deal with system modification and re-engineering. The approach proposed in this paper combines an analysis stage, to identify concrete patterns of interaction in legacy code, with an iterative re-engineering process at a higher level of abstraction. These stages are supported by the CoordPat and Archery tools, respectively. Bi-directional model transformations connecting code-level and design-level architectural models are defined. The approach is demonstrated on a fragment of a case study.