Publications

2012

Image Analysis and Recognition - 9th International Conference, ICIAR 2012, Aveiro, Portugal, June 25-27, 2012. Proceedings, Part II

Authors
Campilho, AJC; Kamel, MS;

Publication
ICIAR (2)

Abstract

2012

A region-based algorithm for automatic bone segmentation in volumetric CT

Authors
Rodrigues, PL; Moreira, AHJ; Fonseca, JC; Pinho, AC; Rodrigues, NF; Vilaca, JL;

Publication
Image Processing: Methods, Applications and Challenges

Abstract
In Computed Tomography (CT), bone segmentation is considered an important step to extract bone parameters, which are frequently useful for computer-aided diagnosis, surgery and treatment of many diseases such as osteoporosis. Consequently, the development of accurate and reliable segmentation techniques is essential, since it often has a great impact on quantitative image analysis and diagnosis outcome. This chapter presents an automated multistep approach for bone segmentation in volumetric CT datasets. It starts with a three-dimensional (3D) watershed operation on the image gradient magnitude. The outcome of the watershed algorithm is an over-partitioned image of many 3D regions that can be merged, yielding a meaningful image partitioning. In order to reduce the number of regions, a merging procedure was applied that merges neighbouring regions presenting a mean intensity distribution difference of ±15%. Finally, once all bones had been distinguished in high contrast, the final 3D bone segmentation was achieved by selecting all regions containing bone fragments, using the information retrieved from a threshold mask. The bone contours were accurately defined according to the watershed region outlines instead of the thresholding segmentation result. This new method was tested by segmenting the rib cage on 185 CT images, acquired at the São João Hospital of Porto (Portugal), and evaluated using the Dice similarity coefficient as a statistical validation metric, leading to a mean coefficient score of 0.89. This could represent a step forward towards accurate and automatic quantitative analysis in clinical environments, reducing time consumption, user dependence and subjectivity.
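For illustration, a minimal 2D sketch of the pipeline outlined in the abstract (watershed on the gradient magnitude, merging of regions whose mean intensities differ by at most 15%, selection of regions overlapping a bone threshold mask, and Dice scoring) could look as follows in Python with scikit-image. The function names, the bone threshold value, the merging-by-intensity-cluster simplification and the 2D setting are assumptions for readability, not the authors' exact implementation.

```python
# Sketch only: 2D simplification of the described 3D pipeline.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed

def segment_bone(ct_slice, bone_threshold=200):  # threshold is an assumption
    gradient = sobel(ct_slice)                   # gradient magnitude
    labels = watershed(gradient)                 # over-partitioned regions
    means = ndi.mean(ct_slice, labels, index=np.arange(1, labels.max() + 1))

    # Merge regions whose mean intensities differ by <= 15%
    # (approximated here by clustering region means, ignoring adjacency).
    merged = np.zeros_like(labels)
    cluster_means = []
    for region_id, m in enumerate(means, start=1):
        for cid, cm in enumerate(cluster_means, start=1):
            if abs(m - cm) <= 0.15 * abs(cm):
                merged[labels == region_id] = cid
                break
        else:
            cluster_means.append(m)
            merged[labels == region_id] = len(cluster_means)

    # Keep regions that overlap the bone threshold mask; contours follow
    # the watershed region outlines rather than the raw threshold.
    bone_mask = ct_slice > bone_threshold
    keep = np.unique(merged[bone_mask])
    return np.isin(merged, keep[keep > 0])

def dice(segmentation, ground_truth):
    inter = np.logical_and(segmentation, ground_truth).sum()
    return 2.0 * inter / (segmentation.sum() + ground_truth.sum())
```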

2012

Using Serious Games to Train Evacuation Behaviour

Authors
Ribeiro, J; Almeida, JE; Rossetti, RJF; Coelho, A; Coelho, AL;

Publication
SISTEMAS Y TECNOLOGIAS DE INFORMACION, VOLS 1 AND 2

Abstract
Emergency evacuation plans and evacuation drills are mandatory in public buildings in many countries. Their importance is considerable when it comes to guaranteeing safety and protection during a crisis. However, discrepancies sometimes arise between the goals of a plan and its outcomes, because people find it hard to take drills seriously, or because of the financial and time resources required. Serious games are a possible solution to this problem. They have been successfully applied in different areas such as health care and education, since they can simulate an environment or task quite accurately, making them a practical alternative to real-life simulations. This paper presents a serious game developed using Unity3D to recreate a virtual fire evacuation training tool. The prototype application was deployed, which allowed validation through user testing. A sample of 30 individuals tested the evacuation scenario, having to leave the building during a fire in the shortest time possible. Results show that users effectively end up learning some evacuation procedures from the activity, even if only to look for emergency signs indicating the best evacuation paths. It was also evident that users with more video game experience performed significantly better.

2012

PSP PAIR: Automated Personal Software Process Performance Analysis and Improvement Recommendation

Authors
Duarte, CB; Faria, JP; Raza, M;

Publication
2012 EIGHTH INTERNATIONAL CONFERENCE ON THE QUALITY OF INFORMATION AND COMMUNICATIONS TECHNOLOGY (QUATIC 2012)

Abstract
High-maturity software development processes, making intensive use of metrics and quantitative methods, such as the Personal Software Process (PSP) and the Team Software Process (TSP), can generate a significant amount of data that can be periodically analyzed to identify performance problems, determine their root causes and devise improvement actions. Currently, there are several tools that automate data collection and produce performance charts for manual analysis in the context of the PSP/TSP, but practically no tool support exists for automating the data analysis and the recommendation of improvement actions. Manual analysis of this performance data is problematic because of the large amount of data to analyze and the time and expertise required. Hence, this paper proposes a performance model and a tool (named PSP PAIR) to automate the analysis of performance data produced in the context of the PSP, namely, to identify performance problems and their root causes and to recommend improvement actions. The work presented is limited to the analysis of the time estimation performance of PSP developers, but it is extensible to other performance indicators and development processes.
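As a rough illustration of this kind of automated analysis, the sketch below flags a time-estimation problem when the mean relative estimation error across projects exceeds a threshold and emits a cause-oriented recommendation. The 20% threshold, the record fields and the recommendation text are assumptions for illustration, not the PSP PAIR performance model itself.

```python
# Sketch: automated flagging of PSP time-estimation problems (assumed rules).
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    name: str
    estimated_minutes: float
    actual_minutes: float

def analyze_time_estimation(records):
    """Flag a systematic estimation problem and suggest an improvement action."""
    errors = [(r.actual_minutes - r.estimated_minutes) / r.estimated_minutes
              for r in records]
    mean_error = sum(errors) / len(errors)
    findings = []
    if abs(mean_error) > 0.20:  # assumed acceptability threshold
        direction = "under" if mean_error > 0 else "over"
        findings.append(
            f"Systematic {direction}estimation of effort "
            f"(mean relative error {mean_error:+.0%}); "
            "review size estimates and historical productivity data."
        )
    return findings

history = [ProjectRecord("P1", 120, 160), ProjectRecord("P2", 90, 130)]
print(analyze_time_estimation(history))
```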

2012

Adaptive tool for automatic data collection of real electricity markets

Authors
Praca, I; Sousa, TM; Freitas, A; Pinto, T; Vale, Z; Silva, M;

Publication
2012 23RD INTERNATIONAL WORKSHOP ON DATABASE AND EXPERT SYSTEMS APPLICATIONS (DEXA)

Abstract
The study of electricity market operation has been gaining increasing importance in recent years, as a result of the new challenges produced by electricity market restructuring. This restructuring increased the competitiveness of the market, but also its complexity. The growing complexity and unpredictability of the market's evolution consequently increase the difficulty of decision making. Therefore, the intervening entities are forced to rethink their behaviour and market strategies. Currently, a large amount of information concerning electricity markets is available. These data, covering numerous aspects of electricity market operation, are accessible free of charge and are essential for understanding and suitably modelling electricity markets. This paper proposes a tool which is able to handle, store and dynamically update such data. The development of the proposed tool is expected to be of great importance in improving the comprehension of electricity markets and the interactions among the involved entities.

2012

Multiadaptive Sampling for Lightweight Network Measurements

Authors
Silva, JMC; Lima, SR;

Publication
2012 21ST INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATIONS AND NETWORKS (ICCCN)

Abstract
Facing the huge traffic volumes involved in today's networks, it is of utmost importance to deploy efficient network measurement solutions that assist network management and traffic engineering tasks correctly, without interfering with normal network operation. Sampling techniques contribute effectively to this purpose, as the amount of traffic processed is reduced, ideally without endangering the accuracy of network statistical behaviour estimation. Although recent proposals of sampling techniques tend to improve the correctness of the estimation process, their underlying overhead is still considerable when handling high traffic volumes. This paper proposes a new traffic sampling technique for performing lightweight network measurements. The technique, based on linear prediction, is multiadaptive with regard to the packet sampling process, allowing the amount of traffic under analysis to be reduced significantly while maintaining the representativeness of network samples for accurate estimation of network parameters. The performance evaluation of the sampling technique demonstrates the effectiveness and versatility of the proposal when considering real traces representing distinct traffic load scenarios. The statistical analysis provided shows that the present solution outperforms classic sampling techniques, both in accuracy and in the amount of data involved in the measurement process.
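To make the idea of prediction-driven adaptive sampling concrete, the sketch below widens the sampling interval while a linear extrapolation of the measured statistic remains accurate and narrows it when the prediction error grows. The tolerance, interval bounds and adjustment factors are illustrative assumptions, not the authors' exact multiadaptive rules.

```python
# Sketch: adaptive sampling driven by linear prediction (assumed parameters).
def adaptive_sampling(windows, min_gap=1, max_gap=16, tolerance=0.10):
    """windows: per-interval packet counts; returns indices of sampled windows."""
    sampled = [0, 1]                      # two observations needed to predict
    gap = min_gap
    i = 1
    while i + gap < len(windows):
        prev, curr = windows[sampled[-2]], windows[sampled[-1]]
        predicted = curr + (curr - prev)  # linear extrapolation of the statistic
        i += gap
        observed = windows[i]
        sampled.append(i)
        error = abs(observed - predicted) / max(observed, 1)
        if error <= tolerance:
            gap = min(gap * 2, max_gap)   # prediction good: sample less often
        else:
            gap = max(gap // 2, min_gap)  # prediction poor: sample more often
    return sampled

traffic = [100, 110, 118, 130, 250, 240, 235, 150, 120, 118, 119, 121]
print(adaptive_sampling(traffic))
```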
