Publications

2018

Segmentation of kidney and renal collecting system on 3D computed tomography images

Authors
Oliveira, B; Torres, HR; Queiros, SF; Morais, P; Fonseca, JC; D'hooge, J; Rodrigues, NF; Vilaça, JL;

Publication
6th IEEE International Conference on Serious Games and Applications for Health, SeGAH 2018, Vienna, Austria, May 16-18, 2018

Abstract
Surgical training for minimally invasive kidney interventions (MIKI) is of major importance in the urology field. Within this topic, simulating MIKI in a patient-specific virtual environment can support pre-operative planning with the real patient's anatomy, possibly resulting in a reduction of intra-operative medical complications. However, currently validated VR simulators perform training on a set of standard models and do not allow patient-specific training. For patient-specific training, the standard simulator would need to be adapted using personalized models, which can be extracted from pre-operative images using segmentation strategies. To date, several methods have already been proposed to accurately segment the kidney in computed tomography (CT) images. However, most of these works focused on kidney segmentation only, neglecting the extraction of its internal compartments. In this work, we propose to adapt a coupled formulation of the B-Spline Explicit Active Surfaces (BEAS) framework to simultaneously segment the kidney and the renal collecting system (CS) from CT images. Moreover, from the difference of the kidney and CS segmentations, the renal parenchyma can also be extracted. The segmentation process is guided by a new energy functional that combines both gradient- and region-based energies. The method was evaluated on 10 kidneys from 5 CT datasets with different image properties. Overall, the results demonstrate the accuracy of the proposed strategy, with a Dice overlap of 92.5%, 86.9% and 63.5%, and a point-to-surface error around 1.6 mm, 1.9 mm and 4 mm for the kidney, renal parenchyma and CS, respectively. © 2018 IEEE.
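The Dice overlap reported above is a standard segmentation metric: twice the intersection of two binary masks divided by the sum of their sizes. A minimal sketch (an illustration of the metric, not the authors' implementation):

```python
import numpy as np

def dice_overlap(seg, ref):
    """Dice coefficient between two binary masks (1 = structure, 0 = background)."""
    seg = np.asarray(seg, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    intersection = np.logical_and(seg, ref).sum()
    return 2.0 * intersection / (seg.sum() + ref.sum())

# Toy 1-D "masks": 2 voxels overlap out of 3 + 3 labeled voxels.
a = np.array([0, 1, 1, 1, 0])
b = np.array([0, 0, 1, 1, 1])
print(round(dice_overlap(a, b), 3))  # 2*2/(3+3) = 0.667
```

A Dice of 1.0 means perfect overlap; the 63.5% reported for the collecting system reflects how much harder that small, thin structure is to delineate than the whole kidney.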

2018

Evaluation of Oversampling Data Balancing Techniques in the Context of Ordinal Classification

Authors
Domingues, I; Amorim, JP; Abreu, PH; Duarte, H; Santos, JAM;

Publication
2018 International Joint Conference on Neural Networks, IJCNN 2018, Rio de Janeiro, Brazil, July 8-13, 2018

Abstract
Data imbalance is characterized by a discrepancy in the number of examples per class of a dataset. This phenomenon is known to deteriorate the performance of classifiers, since they are less able to learn the characteristics of the under-represented classes. For most imbalanced datasets, the application of sampling techniques improves the classifier's performance. For small datasets, oversampling has been shown to be the most appropriate strategy, since it augments the original set of samples. Although several oversampling strategies have been proposed and tested over the years, the work has mostly focused on binary or multi-class tasks. Motivated by medical applications, where there is often an order associated with the classes (increasing likelihood of malignancy, for instance), the present work tests some existing oversampling techniques in ordinal contexts. Moreover, four new oversampling techniques are proposed. Experiments were performed on both private and public datasets. The private datasets concern the assessment of response to treatment in oncologic diseases. The 15 public datasets were chosen since they are widely used in the literature. Results show that data balancing techniques improve classification results on ordinal imbalanced datasets, even when these techniques are not specifically designed for ordinal problems. With our pipeline, results better than or equal to published ones were obtained for 10 out of the 15 public datasets, with improvements of up to a 0.43 decrease in MMAE.
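The simplest baseline that the techniques above build on is random oversampling: replicating minority-class samples until every class matches the majority-class count. A minimal sketch (a generic baseline, not one of the four new techniques proposed in the paper):

```python
import random
from collections import Counter

def random_oversample(samples, labels, seed=0):
    """Replicate minority-class samples (with replacement) until
    every class reaches the majority-class count."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_x, out_y = list(samples), list(labels)
    for cls, n in counts.items():
        pool = [x for x, y in zip(samples, labels) if y == cls]
        for _ in range(target - n):
            out_x.append(rng.choice(pool))
            out_y.append(cls)
    return out_x, out_y

X = ["a1", "a2", "a3", "a4", "b1"]  # class counts: 4 vs 1
y = [0, 0, 0, 0, 1]
Xb, yb = random_oversample(X, y)
print(Counter(yb))  # both classes now have 4 samples
```

Ordinal-aware variants differ in *which* samples they synthesize or replicate, e.g. favoring samples near adjacent-class boundaries so the class ordering is respected.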

2018

Creating entrepreneurial universities in an emerging economy: Evidence from Brazil

Authors
Dalmarco, G; Hulsink, W; Blois, GV;

Publication
Technological Forecasting and Social Change

Abstract

2018

A rapid prototyping tool to produce 360º video-based immersive experiences enhanced with virtual/multimedia elements

Authors
Adão, T; Pádua, L; Fonseca, M; Agrellos, L; Sousa, JJ; Magalhães, L; Peres, E;

Publication
CENTERIS 2018 - International Conference on ENTERprise Information Systems / ProjMAN 2018 - International Conference on Project MANagement / HCist 2018 - International Conference on Health and Social Care Information Systems and Technologies 2018, Lisbon, Portugal

Abstract
While the popularity of virtual reality (VR) grows in a wide range of application contexts - e.g. entertainment, training, cultural heritage and medicine -, its economic impact is expected to reach around 15bn USD by 2020. Within the VR field, 360º video has been sparking the interest of development and research communities. However, editing tools supporting 360º panoramas are usually expensive and/or demand programming skills and/or advanced user knowledge. Besides, application approaches to quickly and intuitively set up such 360º video-based VR environments, complemented with diverse types of parameterizable virtual assets and multimedia elements, are still hard to find. Thereby, this paper proposes a system specification to simply and rapidly configure immersive VR environments composed of surrounding 360º video spheres that can be complemented with parameterizable multimedia contents - namely 3D models, text and spatial sound -, whose behavior can be either time-range or user-interaction dependent. Moreover, a preliminary prototype that follows a substantial part of the previously mentioned specification and implements the enhancement of 360º videos with time-range dependent virtual assets is presented. Preliminary tests evaluating usability and user satisfaction were also carried out with 30 participants, from which encouraging results were achieved. © 2018 The Authors. Published by Elsevier Ltd.

2018

Dynamic Capabilities, Marketing and Innovation Capabilities and their Impact on Competitive Advantage and Firm Performance

Authors
Ferreira, J; Cardim, S; Branco, F;

Publication
2018 13TH IBERIAN CONFERENCE ON INFORMATION SYSTEMS AND TECHNOLOGIES (CISTI)

Abstract
The objective of this paper is to examine the impact of dynamic capabilities (DCs) on the competitiveness and performance of companies, considering the mediating role played by marketing capabilities (MCs) and innovation capabilities (ICs). The investigation of these effects is performed considering the moderating role of ambidexterity on the proposed relationships. This investigation advances a theoretical model tested using structural equation modelling (SEM). A 90-item questionnaire exploring the relationships between DCs and marketing and innovation was developed, and a total of 387 valid questionnaires were collected from a sample of Portuguese enterprises. The results show the existence of a positive direct and indirect influence of DCs on competitive advantage and performance variables, as well as a direct impact on MCs and ICs. This study contributes to filling the gap in the existing research on the direct impact of DC variables on competitive advantage and performance by considering the mediating role of MCs and ICs.

2018

The MAL Interactors Animator: Supporting model validation through animation

Authors
Campos, JC; Sousa, N;

Publication
PROCEEDINGS OF THE ACM SIGCHI SYMPOSIUM ON ENGINEERING INTERACTIVE COMPUTING SYSTEMS (EICS'18)

Abstract
The IVY workbench is a model checking based tool for the analysis of interactive system designs. Experience shows that there is a need to complement the analytic power of model checking with support for model validation and analysis of verification results. Animation of the model provides this support by allowing iterative exploration of its behaviour. This paper introduces a new model animation plugin for the IVY workbench. The plugin (AniMAL) complements the modelling and verification capabilities of IVY by providing users with the possibility to interact directly with the model.
