2023
Authors
Franco-Goncalo, P; Alves-Pimenta, S; Goncalves, L; Colaco, B; Leite, P; Ribeiro, A; Ferreira, M; McEvoy, F; Ginja, M;
Publication
FRONTIERS IN VETERINARY SCIENCE
Abstract
Adequate radiographic positioning on the X-ray table is paramount for canine hip dysplasia (HD) screening. The aims of this study were to evaluate femoral parallelism on the normal ventrodorsal hip extended (VDHE) view and the effect of femoral angulation (FA) on the Norberg Angle (NA) and Hip Congruency Index (HCI). Femoral parallelism was evaluated by comparing the alignment of the long femoral axis with the long body axis in normal VDHE views, and the effect of FA on NA and HCI was assessed on repeated VDHE views with different levels of FA. The femoral long axis in normal VDHE views showed a range of FA from -4.85° to 5.85°, mean ± standard deviation (SD) of -0.06 ± 2.41°, 95% CI [-4.88°, 4.76°]. In the paired views, a mean ± SD femur adduction of 3.69 ± 1.96° led to a statistically significant decrease in NA and HCI, and a femur abduction of 2.89 ± 2.12° led to a statistically significant increase in NA and HCI (p < 0.05). The FA differences were also significantly correlated with both the NA differences (r = 0.83) and the HCI differences (r = 0.44) (p < 0.001). This work describes a methodology for evaluating femoral parallelism in VDHE views, and the results suggest that femur abduction yielded more favorable NA and HCI values while adduction impaired them. The positive linear association of FA with NA and HCI allows regression equations to be used as corrections, reducing the influence of poor femoral parallelism on HD scoring.
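A minimal sketch of how such a regression-based correction could be applied, assuming hypothetical paired FA/NA difference values; the fitted coefficients below are illustrative placeholders, not those reported in the study:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical paired data: femoral angulation difference (degrees) vs.
# Norberg Angle difference (degrees). Real values would come from the study.
fa_diff = np.array([[-4.0], [-2.0], [0.0], [2.5], [4.0]])
na_diff = np.array([-3.1, -1.6, 0.1, 2.0, 3.2])

# Fit the linear association between FA difference and NA difference.
reg = LinearRegression().fit(fa_diff, na_diff)

def corrected_na(measured_na: float, fa: float) -> float:
    """Subtract the NA shift predicted from femoral angulation (sketch only)."""
    predicted_shift = reg.predict(np.array([[fa]]))[0]
    return measured_na - predicted_shift

# Example: correct an NA measured on a view with 3.5 degrees of abduction.
print(corrected_na(measured_na=105.0, fa=3.5))
```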
2023
Authors
Silva-Reis, R; Faustino-Rocha, AI; Silva, J; Valada, A; Azevedo, T; Anjos, L; Gonçalves, L; Pinto, MdL; Ferreira, R; Silva, AMS; Cardoso, SM; Oliveira, PA;
Publication
Animals
Abstract
2024
Authors
Loureiro, C; Filipe, V; Franco-Gonçalo, P; Pereira, AI; Colaço, B; Alves-Pimenta, S; Ginja, M; Gonçalves, L;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, PT II, OL2A 2023
Abstract
Radiography is the primary modality for diagnosing canine hip dysplasia (CHD), and visual assessment of radiographic features is sometimes used for accurate diagnosis. However, these features typically constitute small regions of interest (ROIs) within the overall image, yet they hold vital diagnostic information and are crucial for pathological analysis. Consequently, automated detection of ROIs becomes a critical preprocessing step in classification or segmentation systems. By correctly extracting the ROIs, the efficiency of retrieving and identifying pathological signs can be significantly improved. In this study, we employed the most recent iteration of the YOLO model (version 8) to detect hip joints in a dataset of 133 pelvic radiographs. The best-performing model achieved a mean average precision (mAP50:95) of 0.81, indicating highly accurate detection of hip regions. Importantly, the model proved feasible to train on a relatively small dataset and shows promising potential for various medical applications.
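A minimal sketch of the kind of YOLOv8 fine-tuning and inference workflow the ultralytics package supports; the dataset file (hips.yaml), hyperparameters, and image path are placeholders, not the study's actual settings:

```python
from ultralytics import YOLO

# Start from a pretrained YOLOv8 nano checkpoint and fine-tune it on a
# small set of pelvic radiographs described by a YOLO-format dataset file.
model = YOLO("yolov8n.pt")
model.train(data="hips.yaml", epochs=100, imgsz=640)  # hips.yaml is a placeholder

# Validation reports detection metrics, including mAP over IoU 0.50:0.95.
metrics = model.val()
print(metrics.box.map)

# Detect hip joints in a new radiograph (path is illustrative).
results = model.predict("pelvic_radiograph.png", conf=0.25)
for box in results[0].boxes:
    print(box.xyxy, box.conf)
```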
2024
Authors
Franco-Gonçalo, P; Leite, P; Alves-Pimenta, S; Colaço, B; Gonçalves, L; Filipe, V; Mcevoy, F; Ferreira, M; Ginja, M;
Publication
VETERINARY SCIENCES
Abstract
Canine hip dysplasia (CHD) screening relies on accurate positioning in the ventrodorsal hip extended (VDHE) view, as even mild pelvic rotation can affect CHD scoring and impact breeding decisions. This study aimed to assess the association between pelvic rotation and asymmetry in obturator foramina areas (AOFAs) and to develop a computer vision model for automated AOFA measurement. In the first part, 203 radiographs were analyzed to examine the relationship between pelvic rotation, assessed through asymmetry in iliac wing and obturator foramina widths (AOFWs), and AOFAs. A significant association was found between pelvic rotation and AOFA, with AOFW showing a stronger correlation (R² = 0.92, p < 0.01). AOFW rotation values were categorized into minimal (n = 71), moderate (n = 41), marked (n = 37), and extreme (n = 54) groups, corresponding to mean AOFA ± standard deviation values of 33.28 ± 27.25, 54.73 ± 27.98, 85.85 ± 41.31, and 160.68 ± 64.20 mm², respectively. ANOVA and post hoc testing confirmed significant differences in AOFA across these groups (p < 0.01). In part two, the dataset was expanded to 312 images to develop the automated AOFA model, with 80% allocated for training, 10% for validation, and 10% for testing. On the 32 test images, the model achieved high segmentation accuracy (Dice score = 0.96; Intersection over Union = 0.93), closely aligning with examiner measurements. Paired t-tests indicated no significant differences between the examiner's and the model's outputs (p > 0.05), though Bland-Altman analysis identified occasional discrepancies. The model demonstrated excellent reliability (ICC = 0.99) with a standard error of 17.18 mm². A threshold of 50.46 mm² enabled effective differentiation between acceptable and excessive pelvic rotation. With additional training data, further improvements in precision are expected, enhancing the model's clinical utility.
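A minimal sketch of the overlap metrics used to evaluate a segmentation output and of the reported AOFA cut-off for pelvic rotation; the mask arrays below are random placeholders, not radiograph data:

```python
import numpy as np

def dice_and_iou(pred: np.ndarray, truth: np.ndarray) -> tuple[float, float]:
    """Dice score and Intersection over Union for two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    dice = 2 * inter / (pred.sum() + truth.sum())
    iou = inter / np.logical_or(pred, truth).sum()
    return float(dice), float(iou)

def acceptable_rotation(aofa_mm2: float, threshold: float = 50.46) -> bool:
    """Classify pelvic rotation as acceptable using the reported AOFA threshold."""
    return aofa_mm2 < threshold

# Illustrative usage with random masks (real masks would come from the model).
rng = np.random.default_rng(0)
pred = rng.integers(0, 2, (256, 256))
truth = rng.integers(0, 2, (256, 256))
print(dice_and_iou(pred, truth), acceptable_rotation(33.3))
```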
2024
Authors
Loureiro, C; Gonçalves, L; Leite, P; Franco Gonçalo, P; Pereira, AI; Colaço, B; Alves Pimenta, S; McEvoy, F; Ginja, M; Filipe, V;
Publication
Multimedia Tools and Applications
Abstract
Radiographic canine hip dysplasia (CHD) diagnosis is crucial for breeding selection and disease management, delaying progression and alleviating the associated pain. Radiography is the primary imaging modality for CHD diagnosis, and visual assessment of radiographic features is sometimes used for accurate diagnosis. Specifically, alterations in femoral neck shape are crucial radiographic signs, with existing literature suggesting that dysplastic hips have a greater femoral neck thickness (FNT). In this study, we aimed to develop a three-stage deep learning-based system that can automatically identify and quantify a femoral neck thickness index (FNTi) as a key metric to improve CHD diagnosis. Our system trained a keypoint detection model and a segmentation model to determine landmark and boundary coordinates of the femur and acetabulum, respectively. We then executed a series of mathematical operations to calculate the FNTi. The keypoint detection model achieved a mean absolute error (MAE) of 0.013 during training, while the femur segmentation results achieved a Dice score (DS) of 0.978. Our three-stage deep learning-based system achieved an intraclass correlation coefficient of 0.86 (95% confidence interval) and showed no significant differences from a specialist's measurements in a paired t-test (p > 0.05). To the best of our knowledge, this is the first study to thoroughly measure the FNTi by applying computer vision and deep learning-based approaches, which can provide reliable support in CHD diagnosis.
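A minimal sketch of the kind of agreement check reported for the FNTi system, comparing model output against a specialist with a paired t-test; the measurement values below are invented for illustration only:

```python
import numpy as np
from scipy import stats

# Hypothetical FNTi values: model output vs. a specialist's measurements
# for the same hips (placeholders, not data from the study).
model_fnti = np.array([0.52, 0.61, 0.48, 0.57, 0.66, 0.50])
expert_fnti = np.array([0.54, 0.60, 0.47, 0.59, 0.64, 0.52])

# Paired t-test; the study reports no significant difference (p > 0.05).
t_stat, p_value = stats.ttest_rel(model_fnti, expert_fnti)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

# Mean absolute error between the two raters, as a simple agreement summary.
print("MAE:", np.mean(np.abs(model_fnti - expert_fnti)))
```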
2024
Authors
Pires, D; Filipe, V; Gonçalves, L; Sousa, A;
Publication
WIRELESS MOBILE COMMUNICATION AND HEALTHCARE, MOBIHEALTH 2023
Abstract
Rising obesity has been a worldwide issue for several years; it is the outcome of common nutritional disorders and leaves affected individuals prone to many diseases. Managing diet while simultaneously dealing with the obligations of a working adult can be difficult: people today lead very fast-paced lives and sometimes neglect food choices. To simplify the interpretation of Nutri-score labeling, this paper proposes a method capable of automatically reading food labels in this format. The method is intended to support users in choosing which products to buy based on the letter identified on the label. For this purpose, a dataset was created, and a prototype mobile application was developed using a deep learning network to recognize the Nutri-score information. Although the final solution is still in progress, the reading module, which includes the proposed method, achieved an encouraging and promising accuracy (above 90%). Upcoming developments of the model include informing the user about the nutritional value of the analyzed product by combining its Nutri-score label and composition.
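A minimal sketch of a transfer-learning classifier for the five Nutri-score letters (A–E), assuming a cropped label image as input; the backbone choice, preprocessing, and image path are assumptions, not the prototype's actual network:

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Five output classes, one per Nutri-score letter.
CLASSES = ["A", "B", "C", "D", "E"]

# Fine-tunable backbone; in practice it would be trained on the labeled dataset.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def predict_letter(image_path: str) -> str:
    """Return the predicted Nutri-score letter for a cropped label image."""
    img = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        logits = model(preprocess(img).unsqueeze(0))
    return CLASSES[int(logits.argmax(dim=1))]

print(predict_letter("nutriscore_crop.png"))  # path is illustrative
```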