Publications

2024

Lightweight 3D CNN for the Segmentation of Coronary Calcifications and Calcium Scoring

Authors
Santos, R; Baeza, R; Filipe, VM; Renna, F; Paredes, H; Pedrosa, J;

Publication
2024 IEEE 22ND MEDITERRANEAN ELECTROTECHNICAL CONFERENCE, MELECON 2024

Abstract
Coronary artery calcium is a good indicator of coronary artery disease and can be used for cardiovascular risk stratification. Over the years, different deep learning approaches have been proposed to automatically segment coronary calcifications in computed tomography scans and measure their extent through calcium scores. However, most methodologies have focused on 2D architectures, which neglect most of the information present in those scans. In this work, we use a 3D convolutional neural network capable of leveraging the 3D nature of computed tomography scans and including more context in the segmentation process. In addition, the selected network is lightweight, which means that we can use 3D convolutions while keeping memory requirements low. Our results show that the predictions of the model, trained on the COCA dataset, are close to the ground truth for the majority of the patients in the test set, obtaining a Dice score of 0.90 ± 0.16 and a Cohen's linearly weighted kappa of 0.88 in Agatston score risk categorization. In conclusion, our approach shows promise in the tasks of segmenting coronary artery calcifications and predicting calcium scores, with the objectives of optimizing clinical workflow and performing cardiovascular risk stratification.
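
For readers unfamiliar with the two metrics quoted above, the short Python sketch below shows how a Dice score is computed between binary segmentation masks and how an Agatston score could be mapped to a risk category. This is not the paper's code; the category thresholds (0, 1-10, 11-100, 101-400, >400) are a commonly used convention and may differ from the categorization used by the authors.

import numpy as np

def dice_score(pred, target, eps=1e-7):
    # Dice coefficient between two binary masks (any shape, e.g. 3D CT volumes).
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def agatston_risk_category(score):
    # Map an Agatston calcium score to a coarse risk category (illustrative thresholds).
    if score == 0:
        return "zero"
    if score <= 10:
        return "minimal"
    if score <= 100:
        return "mild"
    if score <= 400:
        return "moderate"
    return "severe"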

2024

Patterns of Data Anonymization

Authors
Monteiro, M; Correia, FF; Queiroz, PGG; Ramos, R; Trigo, D; Gonçalves, G;

Publication
Proceedings of the 29th European Conference on Pattern Languages of Programs, People, and Practices, EuroPLoP 2024, Irsee, Germany, July 3-7, 2024

Abstract
The amount of sensitive data handled by software systems has been growing over the years. To comply with ethical and legal requirements, the General Data Protection Regulation (GDPR) recommends using pseudonymization and anonymization techniques to ensure appropriate protection and privacy of personal data. Many anonymization techniques have been described in the literature, such as generalization or suppression, but deciding which methods to use in different contexts is not a straightforward task. Furthermore, anonymization poses two major challenges: choosing adequate techniques for a given context and achieving an optimal level of privacy while maintaining the utility of the data for the context within which it is meant to be used. To address these challenges, this paper describes four new design patterns: Generalization, Hierarchical Generalization, Suppress Outliers, and Relocate Outliers, building on existing literature to offer solutions for common anonymization challenges, including avoiding linkage attacks and managing the privacy-utility trade-off.
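
As a minimal illustration of two of the ideas behind the patterns named above, Generalization and Suppress Outliers, the Python sketch below coarsens exact ages into 5-year bins and drops records outside an expected range. The toy data, bin width, and function names are invented for the example and are not taken from the paper.

# Toy illustration of generalization (coarsening values) and suppression of outliers.
records = [{"name": "A", "age": 34}, {"name": "B", "age": 37}, {"name": "C", "age": 91}]

def generalize_age(age, bin_width=5):
    # Replace an exact age with a coarser interval, e.g. 34 -> "30-34".
    low = (age // bin_width) * bin_width
    return f"{low}-{low + bin_width - 1}"

def suppress_outliers(rows, key, low, high):
    # Drop records whose value falls outside the expected range, so rare
    # (and therefore more re-identifiable) individuals are not published.
    return [r for r in rows if low <= r[key] <= high]

anonymized = [
    {"age": generalize_age(r["age"])}   # drop the direct identifier, coarsen the age
    for r in suppress_outliers(records, "age", 18, 80)
]
print(anonymized)   # [{'age': '30-34'}, {'age': '35-39'}]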

2024

Data Collection Pipeline for Low-Resource Languages: A Case Study on Constructing a Tetun Text Corpus

Authors
de Jesus G.; Nunes S.;

Publication
2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation, LREC-COLING 2024 - Main Conference Proceedings

Abstract
This paper proposes Labadain Crawler, a data collection pipeline tailored to automate and optimize the construction of textual corpora from the web, specifically targeting low-resource languages. The system is built on top of Nutch, an open-source web crawler and data extraction framework, and incorporates language processing components such as a tokenizer and a language identification model. The pipeline's efficacy is demonstrated through successful testing with Tetun, one of Timor-Leste's official languages, resulting in the construction of a high-quality Tetun text corpus comprising 321.7k sentences extracted from over 22k web pages. The contributions of this paper include the development of a Tetun tokenizer, a Tetun language identification model, and a Tetun text corpus, marking an important milestone in Tetun text information retrieval.
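
The abstract describes a crawl pipeline built on Nutch with a tokenizer and a language identification model. Purely as a sketch of the filtering idea, and not the authors' implementation, the Python snippet below shows a post-crawl step that keeps only pages classified as the target language; predict_language is a placeholder for whatever language-identification model is plugged in.

import re

def tokenize(text):
    # Simple rule-based tokenizer: words and standalone punctuation marks.
    return re.findall(r"\w+|[^\w\s]", text, re.UNICODE)

def keep_target_language(pages, predict_language, target="tet", min_tokens=10):
    # Keep only (url, text) pairs whose predicted language matches the target
    # and that contain enough tokens to be useful for a text corpus.
    kept = []
    for url, text in pages:
        tokens = tokenize(text)
        if len(tokens) >= min_tokens and predict_language(text) == target:
            kept.append((url, text))
    return kept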

2024

Real-Time Parallel Programming for Homogeneous Multicores

Authors
Pinho, LM;

Publication
2024 IEEE 14TH INTERNATIONAL SYMPOSIUM ON INDUSTRIAL EMBEDDED SYSTEMS, SIES

Abstract
Developing real-time systems applications requires programming paradigms that can handle the specification of concurrent activities and timing constraints, and control execution on a particular platform. The increasing need for high performance, and the use of fine-grained parallel execution, make this an even more challenging task. This paper explores the state of the art and challenges in real-time parallel application development, focusing on two research directions: one from the high-performance domain (using OpenMP) and another from the real-time and critical systems field (based on Ada). The paper reviews the features of each approach and highlights remaining open issues.

2024

D-Shaped Photonic Crystal Fiber SPR Sensor for Humidity Monitoring in Oils

Authors
Romeiro, F; Rodrigues, JB; Miranda, C; Cardoso, P; Silva, O; Costa, CWA; Giraldi, MR; Santos, L; Guerreiro, A;

Publication
EPJ Web of Conferences

Abstract
This theoretical study presents a D-shaped photonic crystal fiber (PCF) surface plasmon resonance (SPR) based sensor designed for humidity detection in transformer oil. Humidity refers to the presence of water dissolved or suspended in the oil, which can affect its dielectric properties and, consequently, the efficiency and safety of the transformer's operation; failures in the sealing system and the phenomenon of condensation can be the main sources of this humidity. This sensor leverages the unique properties of the coupling between surface plasmons and the fiber-guided mode at the Au-PCF interface to enhance the sensitivity to humidity changes in the external environment. The research demonstrated the sensor's efficacy in monitoring humidity levels ranging from 0% to 100%, with an average sensitivity of 1106.1 nm/RIU. This high sensitivity indicates a substantial shift in the resonance wavelength corresponding to minor changes in the refractive index caused by varying humidity levels, which is critically important in the context of transformer maintenance and safety. Transformer oil serves as both an insulator and a coolant, and its humidity level is a key parameter influencing the performance and longevity of transformers. Excessive humidity can lead to insulation failure and reduced efficiency; therefore, the ability to accurately detect and monitor humidity levels in transformer oil can significantly enhance preventive maintenance strategies, reduce downtime, and prevent potential failures, ensuring the reliable operation of electrical power systems.
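
The figure of 1106.1 nm/RIU quoted above is a spectral sensitivity: the shift of the resonance wavelength per unit change in refractive index. A quick illustrative calculation in Python follows; the wavelength and index values are made up, and only the order of magnitude matches the abstract.

def sensitivity_nm_per_riu(lambda_res_1, lambda_res_2, n_1, n_2):
    # Spectral sensitivity of an SPR sensor: resonance-wavelength shift (nm)
    # divided by the change in the surrounding refractive index (RIU).
    return (lambda_res_2 - lambda_res_1) / (n_2 - n_1)

# Example: an ~11.06 nm shift for a 0.01 RIU change gives roughly 1106 nm/RIU.
print(sensitivity_nm_per_riu(1550.0, 1561.06, 1.3300, 1.3400))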

2024

METIS RTC as a computationally heavy system

Authors
Coppejans, H; Bertram, T; Briegel, F; Feldt, M; Kulas, M; Scheithauer, S; Correia, C; Obereder, A;

Publication
SOFTWARE AND CYBERINFRASTRUCTURE FOR ASTRONOMY VIII

Abstract
METIS, the Mid-infrared ELT Imager and Spectrograph, will operate an internal Single Conjugate Adaptive Optics (SCAO) system, which will mainly serve the science cases targeting exoplanets and disks around bright stars. The Extremely Large Telescope (ELT) is expected to have its first light in 2028, and the entire instrument recently passed its final design phase. The adaptive optics (AO) of METIS SCAO is designed to correct for atmospheric distortions and is essential for diffraction-limited observations with METIS. The computational and data transfer requirements for these next-generation ELT AO Real-Time Computers (RTCs) are enormous, and require advanced data processing and pipelining techniques. METIS SCAO will use a pyramid wavefront sensor (WFS), which captures incoming wavefronts at 1 kHz with a raw throughput of 148 MB/s. The RTC will ingest these WFS images on a frame-by-frame basis, compute the corrections and send them to the deformable mirror M4 and the tip/tilt mirror M5. The RTC is split into two distinct systems: the Hard Real-Time Computer (HRTC) and the Soft Real-Time Computer (SRTC). The HRTC is responsible for computing the time-sensitive wavefront control loop, while the SRTC is responsible for supervising and optimising the HRTC. A working prototype for the HRTC has been completed and operates with an RTC computation time of roughly 372 μs. This computation is memory-limited and runs on two NVIDIA A100 GPUs. This paper shows a breakdown of the HRTC on a CUDA kernel level, focusing on the tasks that run on the GPUs. We also present the performance of the HRTC and possible improvements for it.
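
A quick sanity check in Python on the figures quoted above, assuming decimal megabytes: a 1 kHz wavefront-sensor rate gives a 1 ms frame period, 148 MB/s at 1 kHz corresponds to roughly 148 kB per frame, and a 372 μs computation leaves about 628 μs of headroom per frame.

# Back-of-the-envelope check on the numbers quoted in the abstract.
frame_rate_hz = 1_000        # pyramid WFS frame rate (1 kHz)
throughput_mb_s = 148        # raw WFS throughput (MB/s, decimal)
compute_time_us = 372        # reported HRTC computation time (microseconds)

frame_period_us = 1e6 / frame_rate_hz                          # 1000 us budget per frame
frame_size_kb = throughput_mb_s * 1e6 / frame_rate_hz / 1e3    # ~148 kB per WFS frame
headroom_us = frame_period_us - compute_time_us                # ~628 us left per frame

print(f"frame period: {frame_period_us:.0f} us, "
      f"frame size: {frame_size_kb:.0f} kB, "
      f"headroom after compute: {headroom_us:.0f} us")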
