2007
Authors
Cardoso, JS; Cardoso, JCS; Corte Real, L;
Publication
2007 IEEE Workshop on Motion and Video Computing, WMVC 2007
Abstract
Automatic spatial video segmentation is a problem without a general solution at the current state of the art. Most of the difficulties arise from the process of capturing images, which remain a very limited sample of the scene they represent. The capture of additional information, in the form of depth data, is a step forward in addressing this problem. We start by investigating the use of depth data for better image segmentation; a novel segmentation framework is proposed, with depth being mainly used to guide a segmentation algorithm on the colour information. Then, we extend the method to also incorporate motion information in the segmentation process. The effectiveness and simplicity of the proposed method are documented with results on a selected set of image sequences. The achieved quality raises the expectation of a significant improvement in operations relying on spatial video segmentation as a pre-processing step. ©2007 IEEE.
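As a rough illustration of the general idea described in this abstract (depth guiding a segmentation of the colour image), a minimal sketch follows; it is not the proposed framework, and the file names, depth quantisation and use of a watershed algorithm are assumptions made only for the example.

```python
# Illustrative only: coarse regions from the depth map are used as seeds for a
# watershed segmentation of the colour image (not the authors' framework).
import cv2
import numpy as np

colour = cv2.imread("frame.png")                              # hypothetical colour frame
depth = cv2.imread("frame_depth.png", cv2.IMREAD_GRAYSCALE)   # aligned depth map

# Quantise depth into a few coarse layers and keep only confident interiors as seeds.
layers = depth // 64                                          # 4 depth layers (0..3)
seeds = np.zeros(depth.shape, dtype=np.int32)
for label in np.unique(layers):
    mask = np.uint8(layers == label) * 255
    mask = cv2.erode(mask, np.ones((15, 15), np.uint8))       # drop uncertain borders
    seeds[mask > 0] = int(label) + 1

# Watershed on the colour image, guided by the depth-derived seeds.
segmentation = cv2.watershed(colour, seeds)
```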
2007
Authors
Lagrange, M; Martins, LG; Teixeira, LF; Tzanetakis, G;
Publication
2007 INTERNATIONAL WORKSHOP ON CONTENT-BASED MULTIMEDIA INDEXING, PROCEEDINGS
Abstract
In this paper, we study the use of audio and visual cues to perform speaker segmentation of audiovisual recordings of formal meetings such as interviews, lectures, or courtroom sessions. The sole use of audio cues for such recordings can be ineffective due to low recording quality and high levels of background noise. We propose to use additional cues from the video stream by exploiting the relatively static locations of speakers within the scene. The experiments show that the combination of these multiple cues helps to identify transitions among speakers more robustly.
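A hypothetical sketch of the cue fusion described above is given below; the weights, threshold and input representations (a per-frame audio novelty score and per-region motion energy) are assumptions for illustration only, not the method of the paper.

```python
# Hypothetical fusion of audio and visual cues for speaker-transition detection.
import numpy as np

def fuse_cues(audio_change, region_motion, w_audio=0.6, w_video=0.4, thresh=0.5):
    """audio_change: per-frame audio novelty score in [0, 1], shape (frames,);
    region_motion: per-frame motion energy in each fixed speaker region,
    shape (frames, regions), values in [0, 1]."""
    # A speaker change is more plausible when motion shifts between regions.
    motion_shift = np.abs(np.diff(region_motion, axis=0)).max(axis=1)
    motion_shift = np.concatenate([[0.0], motion_shift])
    score = w_audio * audio_change + w_video * motion_shift
    return score > thresh          # boolean per-frame transition indicator
```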
2007
Authors
Teixeira, LF; Corte Real, L;
Publication
2007 IEEE Workshop on Motion and Video Computing, WMVC 2007
Abstract
The extraction of relevant objects (foreground) from a background is an important first step in many applications. We propose a technique that tackles this problem using a cascade of change detection tests, covering noise-induced changes, illumination variations and structural changes. An objective comparison of pixel-wise modelling methods is first presented. Given its best performance/complexity trade-off, the mixture of Gaussians was chosen for the proposed method to detect structural changes. Experimental results show that the cascade technique consistently outperforms the commonly used mixture of Gaussians, without additional post-processing and without incurring processing overheads. ©2007 IEEE.
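The mixture-of-Gaussians baseline referred to in this abstract corresponds to the standard per-pixel background model; a minimal sketch using OpenCV's implementation is shown below (the cascade method itself is not reproduced here, and the input file name and parameters are placeholders).

```python
# Minimal mixture-of-Gaussians background subtraction (the baseline, not the cascade).
import cv2

cap = cv2.VideoCapture("sequence.avi")                # hypothetical input sequence
mog = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                         detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Pixels that do not match any background Gaussian are marked as foreground (255).
    fg_mask = mog.apply(frame)
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) == 27:                          # stop on Esc
        break

cap.release()
cv2.destroyAllWindows()
```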
2007
Authors
Fitzal, F; Krois, W; Trischler, H; Wutzel, L; Riedl, O; Kuehbelboeck, U; Wintersteiner, B; Cardoso, MJ; Dubsky, P; Gnant, M; Jakesz, R; Wild, T;
Publication
BREAST
Abstract
The cosmetic result after breast surgery is an important marker in clinical studies. Most authors have used subjective scales to judge breast cosmesis; however, inter-observer discrepancies are very high and the use of such subjective scales for prospective trials is highly disputed. In this study we present for the first time a newly invented breast symmetry index (BSI). The BSI is calculated by subtracting the size and the shape between both breasts (frontal view and side view). The BSI is measured with a software system called breast analysing tool (BAT©) from digital photographs. The photographs of 27 patients were analysed with this software by different physicians to evaluate inter-observer reproducibility. The Harris scale for subjective cosmetic analyses was correlated with the BSI. In our study the inter-observer reproducibility was excellent (Pearson correlation r = 0.9; p < 0.05) and the BSI was able to significantly differentiate between good and bad cosmesis (BSI values from 0% to 30% indicate good cosmesis, BSI > 30% indicates bad cosmesis). Thus the BSI may be used for clinical studies.
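The abstract does not give the exact BSI formula used by BAT©; purely as an illustration of comparing size and shape between two segmented regions, a hypothetical sketch follows (the quantities and weighting are assumptions, not the published index).

```python
# Hypothetical size-and-shape asymmetry measure between two breast masks
# (illustrative only; not the BAT(c) BSI formula).
import numpy as np

def symmetry_index(mask_left, mask_right):
    """mask_left, mask_right: binary masks of the two breasts on the same image
    grid, with the right breast already mirrored onto the left coordinate frame."""
    area_l, area_r = mask_left.sum(), mask_right.sum()
    size_diff = abs(area_l - area_r) / max(area_l, area_r)     # size asymmetry
    overlap = np.logical_and(mask_left, mask_right).sum()
    shape_diff = 1.0 - overlap / max(area_l, area_r)           # shape asymmetry
    return 100.0 * 0.5 * (size_diff + shape_diff)              # percentage

# Using the thresholds reported above: <= 30% suggests good cosmesis, > 30% bad.
```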
2007
Authors
Cardoso, MJ; Comba, AS; Moura, AJ; Magalhaes, A; Goncalves, V;
Publication
EJC SUPPLEMENTS
Abstract
2007
Authors
Cardoso, MJ;
Publication
BREAST CARE
Abstract